Apr 24 14:21:52.060071 ip-10-0-143-92 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 14:21:52.060086 ip-10-0-143-92 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 14:21:52.060096 ip-10-0-143-92 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 14:21:52.060451 ip-10-0-143-92 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 14:22:02.239815 ip-10-0-143-92 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 14:22:02.239831 ip-10-0-143-92 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b458b35d5f7140a78fc733b4d334db8d --
Apr 24 14:24:32.433234 ip-10-0-143-92 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 14:24:32.919019 ip-10-0-143-92 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:32.919019 ip-10-0-143-92 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 14:24:32.919019 ip-10-0-143-92 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:32.919019 ip-10-0-143-92 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 14:24:32.919019 ip-10-0-143-92 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:32.921129 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.921028    2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 14:24:32.924150 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924131    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:32.924150 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924148    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:32.924150 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924152    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924156    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924159    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924162    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924165    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924169    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924172    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924174    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924178    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924181    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924184    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924186    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924190    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924193    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924195    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924198    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924201    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924203    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924206    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:32.924250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924209    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924211    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924214    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924217    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924220    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924223    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924225    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924229    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924234    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924237    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924239    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924242    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924245    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924248    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924252    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924255    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924258    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924261    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924264    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:32.924731 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924267    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924270    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924273    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924276    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924279    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924282    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924284    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924287    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924289    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924292    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924295    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924297    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924300    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924303    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924305    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924308    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924311    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924314    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924317    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924319    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:32.925272 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924322    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924325    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924327    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924330    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924332    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924335    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924338    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924341    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924344    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924347    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924349    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924352    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924357    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924361    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924364    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924367    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924369    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924373    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924383    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:32.925767 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924385    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924388    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924391    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924393    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924396    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924399    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924402    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924830    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924836    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924840    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924842    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924845    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924848    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924851    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924853    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924856    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924859    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924861    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924864    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924867    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:32.926239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924870    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924873    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924876    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924892    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924895    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924900    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924904    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924907    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924910    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924913    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924916    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924919    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924923    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924925    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924928    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924931    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924934    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924936    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924939    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:32.926745 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924942    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924945    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924947    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924950    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924952    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924955    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924957    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924960    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924963    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924966    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924968    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924971    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924974    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924977    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924979    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924983    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924986    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924989    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924991    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924994    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:32.927232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924996    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.924999    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925002    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925005    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925007    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925010    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925012    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925015    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925017    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925020    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925023    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925025    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925028    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925031    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925033    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925036    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925038    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925041    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925045    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:32.927740 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925049    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925052    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925056    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925059    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925062    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925064    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925067    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925070    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925073    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925076    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925079    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925081    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925084    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925087    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925089    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925166    2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925178    2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925185    2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925190    2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925195    2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925198    2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925203    2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 14:24:32.928228 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925210    2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925213    2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925217    2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925221    2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925224    2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925228    2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925231    2574 flags.go:64] FLAG: --cgroup-root=""
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925234    2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925237    2574 flags.go:64] FLAG: --client-ca-file=""
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925240    2574 flags.go:64] FLAG: --cloud-config=""
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925243    2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925246    2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925251    2574 flags.go:64] FLAG: --cluster-domain=""
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925254    2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925257    2574 flags.go:64] FLAG: --config-dir=""
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925260    2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925263    2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925267    2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925271    2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925274    2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925278    2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925281    2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925285    2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925288    2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925291    2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 14:24:32.928760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925295    2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925299    2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925303    2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925306    2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925309    2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925312    2574 flags.go:64] FLAG: --enable-server="true"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925315    2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925319    2574 flags.go:64] FLAG: --event-burst="100"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925323    2574 flags.go:64] FLAG: --event-qps="50"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925326    2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925329    2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925332    2574 flags.go:64] FLAG: --eviction-hard=""
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925336    2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925339    2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925343    2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925346    2574 flags.go:64] FLAG: --eviction-soft=""
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925349    2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925352    2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925355    2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925358    2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925361    2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 14:24:32.929473 ip-10-0-143-92
kubenswrapper[2574]: I0424 14:24:32.925364 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925367 2574 flags.go:64] FLAG: --feature-gates="" Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925371 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925374 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 14:24:32.929473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925377 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925381 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925385 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925388 2574 flags.go:64] FLAG: --help="false" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925391 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-143-92.ec2.internal" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925394 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925398 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925401 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925405 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925408 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 
14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925411 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925414 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925417 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925420 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925423 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925427 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925430 2574 flags.go:64] FLAG: --kube-reserved="" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925433 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925436 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925439 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925443 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925446 2574 flags.go:64] FLAG: --lock-file="" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925448 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925451 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 14:24:32.930085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925454 2574 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925460 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925463 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925466 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925469 2574 flags.go:64] FLAG: --logging-format="text" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925471 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925475 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925478 2574 flags.go:64] FLAG: --manifest-url="" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925481 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925485 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925489 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925494 2574 flags.go:64] FLAG: --max-pods="110" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925497 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925501 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925504 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 14:24:32.930676 ip-10-0-143-92 
kubenswrapper[2574]: I0424 14:24:32.925507 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925513 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925516 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925520 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925530 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925533 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925536 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925539 2574 flags.go:64] FLAG: --pod-cidr="" Apr 24 14:24:32.930676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925542 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925548 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925552 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925555 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925558 2574 flags.go:64] FLAG: --port="10250" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925562 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 14:24:32.931255 
ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925565 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09f1daf98b41149cd" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925568 2574 flags.go:64] FLAG: --qos-reserved="" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925571 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925574 2574 flags.go:64] FLAG: --register-node="true" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925577 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925580 2574 flags.go:64] FLAG: --register-with-taints="" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925584 2574 flags.go:64] FLAG: --registry-burst="10" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925587 2574 flags.go:64] FLAG: --registry-qps="5" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925590 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925593 2574 flags.go:64] FLAG: --reserved-memory="" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925597 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925600 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925604 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925607 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925610 2574 flags.go:64] FLAG: --runonce="false" Apr 24 14:24:32.931255 ip-10-0-143-92 
kubenswrapper[2574]: I0424 14:24:32.925614 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925617 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925620 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925623 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925627 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 14:24:32.931255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925630 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925633 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925636 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925641 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925644 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925647 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925650 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925653 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925656 2574 flags.go:64] FLAG: --system-cgroups="" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925659 2574 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925665 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925668 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925671 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925676 2574 flags.go:64] FLAG: --tls-min-version="" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925679 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925682 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925684 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925687 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925690 2574 flags.go:64] FLAG: --v="2" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925695 2574 flags.go:64] FLAG: --version="false" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925699 2574 flags.go:64] FLAG: --vmodule="" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925704 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.925708 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 14:24:32.932078 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925808 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:24:32.932078 ip-10-0-143-92 
kubenswrapper[2574]: W0424 14:24:32.925813 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925816 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925819 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925822 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925825 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925827 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925830 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925834 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925836 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925839 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925841 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925845 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925847 2574 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925850 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925853 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925855 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925858 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925860 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925863 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 14:24:32.933084 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925867 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925870 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925874 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925890 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925893 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925895 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925898 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925901 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925904 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925907 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925909 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925912 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925914 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925917 2574 feature_gate.go:328] unrecognized 
feature gate: ClusterAPIInstall Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925919 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925922 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925924 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925927 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925930 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925933 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 14:24:32.933936 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925937 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925940 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925942 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925945 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925949 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925952 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925954 2574 feature_gate.go:328] unrecognized 
feature gate: HighlyAvailableArbiter Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925957 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925960 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925963 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925965 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925968 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925970 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925973 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925976 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925978 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925981 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925983 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925986 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 
14:24:32.925989 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 14:24:32.934824 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925991 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925994 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925996 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.925999 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926001 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926004 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926006 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926009 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926012 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926014 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926017 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926020 2574 feature_gate.go:328] unrecognized 
feature gate: OVNObservability Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926025 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926029 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926032 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926035 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926038 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926041 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926044 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:24:32.935625 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926047 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926051 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926053 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926056 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926059 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 
14:24:32.926062 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.926065 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.926074 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.934057 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.934083 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934162 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934170 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934175 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934181 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934186 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934190 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:32.936212 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934195 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934200 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934204 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934209 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934213 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934218 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934222 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934226 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934231 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934235 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934240 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934248 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934254 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934260 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934264 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934268 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934273 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934277 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934282 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:32.936694 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934286 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934291 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934295 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934299 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934303 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934307 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934322 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934327 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934331 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934335 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934340 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934344 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934349 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934353 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934357 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934361 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934365 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934370 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934375 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934379 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:32.937475 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934383 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934387 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934392 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934396 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934400 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934405 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934409 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934413 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934418 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934423 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934427 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934431 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934435 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934439 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934444 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934448 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934452 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934456 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934462 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934475 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:32.938347 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934480 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934484 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934489 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934493 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934498 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934502 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934506 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934511 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934515 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934519 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934523 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934528 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934532 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934536 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934542 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934546 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934551 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934555 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934559 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:32.938871 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934566 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934571 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.934580 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934780 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934789 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934794 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934799 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934803 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934807 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934812 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934816 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934821 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934826 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934838 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934843 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934846 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:32.939570 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934851 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934855 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934859 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934863 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934869 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934892 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934898 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934902 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934906 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934910 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934915 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934920 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934926 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934931 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934936 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934941 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934946 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934950 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934955 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:32.939999 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934960 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934964 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934970 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934975 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934980 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934984 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934989 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934993 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.934998 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935003 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935008 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935021 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935026 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935030 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935034 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935039 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935043 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935047 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935052 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935056 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:32.940510 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935061 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935065 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935069 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935074 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935078 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935091 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935097 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935101 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935105 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935109 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935114 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935118 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935122 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935127 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935131 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935135 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935139 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935144 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935149 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935153 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:32.941046 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935157 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935162 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935166 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935171 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935184 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935190 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935194 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935199 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935203 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935208 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935212 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935216 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935220 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:32.935225 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.935234 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:24:32.941550 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.936111 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 14:24:32.941992 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.938982 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 14:24:32.941992 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.940213 2574 server.go:1019] "Starting client certificate rotation"
Apr 24 14:24:32.941992 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.940315 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:24:32.941992 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.940361 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:24:32.975743 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.975709 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:24:32.977595 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.977571 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:24:32.996010 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:32.995987 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 24 14:24:33.001997 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.001974 2574 log.go:25] "Validated CRI v1 image API"
Apr 24 14:24:33.003435 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.003417 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 14:24:33.005248 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.005223 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:24:33.006598 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.006575 2574 fs.go:135] Filesystem UUIDs: map[0d0ead2e-552d-40b1-9422-28679a222de3:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 ecda10e1-63a4-4e40-805a-82dfb1fadcc9:/dev/nvme0n1p3]
Apr 24 14:24:33.006690 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.006597 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 14:24:33.013480 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.013166 2574 manager.go:217] Machine: {Timestamp:2026-04-24 14:24:33.010766236 +0000 UTC m=+0.449283099 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099042 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2531391effe8f38e2bb06bddbbf107 SystemUUID:ec253139-1eff-e8f3-8e2b-b06bddbbf107 BootID:b458b35d-5f71-40a7-8fc7-33b4d334db8d Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:80:c4:90:c1:e1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:80:c4:90:c1:e1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ba:cc:db:3a:8b:46 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 14:24:33.014131 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.014120 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 14:24:33.014259 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.014247 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 14:24:33.017133 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.016961 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 14:24:33.017282 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.017138 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-92.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 14:24:33.017334 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.017294 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 14:24:33.017334 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.017304 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 14:24:33.017334 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.017318 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 14:24:33.018050 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.018038 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 14:24:33.019404 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.019392 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 24 14:24:33.019523 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.019515 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 14:24:33.021789 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.021777 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 24 14:24:33.021829 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.021795 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 14:24:33.021829 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.021808 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 14:24:33.021829 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.021818 2574 kubelet.go:397] "Adding apiserver pod source" Apr 24 14:24:33.021829 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.021828 2574 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 24 14:24:33.023539 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.023522 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:24:33.023539 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.023542 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:24:33.026735 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.026719 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 14:24:33.028841 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.028827 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 14:24:33.029867 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.029847 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ll6zb" Apr 24 14:24:33.030275 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.030259 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 14:24:33.030336 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.030284 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 14:24:33.030336 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.030293 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 14:24:33.030336 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.030302 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 14:24:33.030336 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.030312 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 14:24:33.030336 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.030320 2574 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 14:24:33.030336 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.030329 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 14:24:33.030336 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.030337 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 14:24:33.030520 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.030349 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 14:24:33.030520 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.030358 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 14:24:33.030520 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.030370 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 14:24:33.030520 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.030383 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 14:24:33.031528 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.031508 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 14:24:33.031528 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.031515 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-92.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 14:24:33.031675 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.031664 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 14:24:33.031709 ip-10-0-143-92 
kubenswrapper[2574]: I0424 14:24:33.031677 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 14:24:33.035313 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.035297 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-92.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 14:24:33.035491 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.035479 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 14:24:33.035543 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.035521 2574 server.go:1295] "Started kubelet" Apr 24 14:24:33.035638 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.035596 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 14:24:33.035710 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.035664 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 14:24:33.035762 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.035735 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 14:24:33.036671 ip-10-0-143-92 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 14:24:33.037429 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.037338 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 14:24:33.038106 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.038087 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ll6zb" Apr 24 14:24:33.038500 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.038482 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 24 14:24:33.042252 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.041278 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-92.ec2.internal.18a95113b3b39a3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-92.ec2.internal,UID:ip-10-0-143-92.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-92.ec2.internal,},FirstTimestamp:2026-04-24 14:24:33.035491899 +0000 UTC m=+0.474008750,LastTimestamp:2026-04-24 14:24:33.035491899 +0000 UTC m=+0.474008750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-92.ec2.internal,}" Apr 24 14:24:33.043928 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.043909 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 14:24:33.044683 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.044668 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 14:24:33.045199 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.045178 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 14:24:33.045860 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.045839 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 14:24:33.045860 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.045843 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 14:24:33.046020 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.045869 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 14:24:33.046020 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.045850 2574 factory.go:55] Registering systemd factory Apr 24 14:24:33.046020 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.045909 2574 factory.go:223] Registration of the systemd container factory successfully Apr 24 14:24:33.046158 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.046040 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 24 14:24:33.046158 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.046052 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 24 14:24:33.046335 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.046258 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-92.ec2.internal\" not found" Apr 24 14:24:33.046542 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.046524 2574 factory.go:153] Registering CRI-O factory Apr 24 14:24:33.046542 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.046545 2574 factory.go:223] Registration of the crio container factory successfully Apr 24 
14:24:33.046850 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.046835 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 14:24:33.046994 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.046982 2574 factory.go:103] Registering Raw factory Apr 24 14:24:33.047065 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.047051 2574 manager.go:1196] Started watching for new ooms in manager Apr 24 14:24:33.047431 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.047416 2574 manager.go:319] Starting recovery of all containers Apr 24 14:24:33.048187 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.048165 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:33.054057 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.054029 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-92.ec2.internal\" not found" node="ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.059057 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.059035 2574 manager.go:324] Recovery completed Apr 24 14:24:33.063413 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.063400 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:33.065824 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.065809 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:33.065915 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.065836 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:33.065915 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.065852 
2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:33.066423 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.066412 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 14:24:33.066423 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.066422 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 14:24:33.066487 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.066440 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 24 14:24:33.068924 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.068912 2574 policy_none.go:49] "None policy: Start" Apr 24 14:24:33.068968 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.068929 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 14:24:33.068968 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.068939 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 24 14:24:33.098735 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.098717 2574 manager.go:341] "Starting Device Plugin manager" Apr 24 14:24:33.098857 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.098830 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 14:24:33.098857 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.098844 2574 server.go:85] "Starting device plugin registration server" Apr 24 14:24:33.099136 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.099122 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 14:24:33.099206 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.099137 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 14:24:33.099283 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.099268 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 
14:24:33.099366 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.099347 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 14:24:33.099366 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.099359 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 14:24:33.099924 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.099864 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 14:24:33.100006 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.099946 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-92.ec2.internal\" not found" Apr 24 14:24:33.163772 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.163708 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 14:24:33.165857 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.165028 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 14:24:33.165857 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.165058 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 14:24:33.165857 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.165089 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 14:24:33.165857 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.165101 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 14:24:33.165857 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.165149 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 14:24:33.169700 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.169645 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:33.199939 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.199905 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:33.201214 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.201192 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:33.201273 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.201229 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:33.201273 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.201240 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:33.201332 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.201287 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.208085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.208067 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.208135 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.208094 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-92.ec2.internal\": node \"ip-10-0-143-92.ec2.internal\" not found" Apr 24 14:24:33.224397 
ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.224369 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-92.ec2.internal\" not found" Apr 24 14:24:33.266261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.266209 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-92.ec2.internal"] Apr 24 14:24:33.266345 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.266315 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:33.267332 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.267315 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:33.267414 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.267346 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:33.267414 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.267360 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:33.268713 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.268699 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:33.268865 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.268851 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.268925 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.268916 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:33.269422 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.269403 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:33.269489 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.269434 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:33.269489 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.269456 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:33.269489 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.269466 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:33.269574 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.269440 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:33.269608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.269586 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:33.270569 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.270556 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.270627 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.270580 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:33.271195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.271181 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:33.271270 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.271207 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:33.271270 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.271222 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:33.299778 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.299747 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-92.ec2.internal\" not found" node="ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.303462 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.303445 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-92.ec2.internal\" not found" node="ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.324966 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.324933 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-92.ec2.internal\" not found" Apr 24 14:24:33.347146 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.347115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/213a7361b06e1d42b8e02764346ec94d-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal\" (UID: \"213a7361b06e1d42b8e02764346ec94d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.347251 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.347152 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/213a7361b06e1d42b8e02764346ec94d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal\" (UID: \"213a7361b06e1d42b8e02764346ec94d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.347251 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.347178 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d4713dfd221608fbf94fe833ef06f7d0-config\") pod \"kube-apiserver-proxy-ip-10-0-143-92.ec2.internal\" (UID: \"d4713dfd221608fbf94fe833ef06f7d0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.426093 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.426019 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-92.ec2.internal\" not found" Apr 24 14:24:33.447351 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.447323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/213a7361b06e1d42b8e02764346ec94d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal\" (UID: \"213a7361b06e1d42b8e02764346ec94d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.447422 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.447355 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/213a7361b06e1d42b8e02764346ec94d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal\" (UID: \"213a7361b06e1d42b8e02764346ec94d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.447422 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.447375 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d4713dfd221608fbf94fe833ef06f7d0-config\") pod \"kube-apiserver-proxy-ip-10-0-143-92.ec2.internal\" (UID: \"d4713dfd221608fbf94fe833ef06f7d0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.447489 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.447426 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d4713dfd221608fbf94fe833ef06f7d0-config\") pod \"kube-apiserver-proxy-ip-10-0-143-92.ec2.internal\" (UID: \"d4713dfd221608fbf94fe833ef06f7d0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.447489 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.447429 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/213a7361b06e1d42b8e02764346ec94d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal\" (UID: \"213a7361b06e1d42b8e02764346ec94d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal" Apr 24 14:24:33.447489 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.447437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/213a7361b06e1d42b8e02764346ec94d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal\" (UID: \"213a7361b06e1d42b8e02764346ec94d\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal"
Apr 24 14:24:33.526720 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.526687 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-92.ec2.internal\" not found"
Apr 24 14:24:33.602206 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.602170 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal"
Apr 24 14:24:33.605805 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.605790 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-92.ec2.internal"
Apr 24 14:24:33.627205 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.627169 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-92.ec2.internal\" not found"
Apr 24 14:24:33.728256 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.728165 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-92.ec2.internal\" not found"
Apr 24 14:24:33.828714 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.828665 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-92.ec2.internal\" not found"
Apr 24 14:24:33.929226 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:33.929187 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-92.ec2.internal\" not found"
Apr 24 14:24:33.940353 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.940324 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 14:24:33.940515 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.940497 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:24:33.940561 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:33.940534 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:24:34.030258 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:34.030184 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-92.ec2.internal\" not found"
Apr 24 14:24:34.040689 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.040628 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 14:19:33 +0000 UTC" deadline="2027-10-21 05:47:49.810124943 +0000 UTC"
Apr 24 14:24:34.040689 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.040683 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13071h23m15.769445655s"
Apr 24 14:24:34.045161 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.045142 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 14:24:34.056072 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.056047 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:24:34.079574 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.079541 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9dmf7"
Apr 24 14:24:34.087319 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.087278 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9dmf7"
Apr 24 14:24:34.130960 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:34.130912 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-92.ec2.internal\" not found"
Apr 24 14:24:34.159962 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:34.159923 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4713dfd221608fbf94fe833ef06f7d0.slice/crio-fb22c08ba4af062ae356a1a6f757f32dc45515fcdc25a0aea1ffb9717a53a3eb WatchSource:0}: Error finding container fb22c08ba4af062ae356a1a6f757f32dc45515fcdc25a0aea1ffb9717a53a3eb: Status 404 returned error can't find the container with id fb22c08ba4af062ae356a1a6f757f32dc45515fcdc25a0aea1ffb9717a53a3eb
Apr 24 14:24:34.160239 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:34.160210 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod213a7361b06e1d42b8e02764346ec94d.slice/crio-7395edbead65beebd137a4d8fb76126cbb6eccb2859a4d02d6e38499edcf351c WatchSource:0}: Error finding container 7395edbead65beebd137a4d8fb76126cbb6eccb2859a4d02d6e38499edcf351c: Status 404 returned error can't find the container with id 7395edbead65beebd137a4d8fb76126cbb6eccb2859a4d02d6e38499edcf351c
Apr 24 14:24:34.164419 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.164403 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:24:34.168642 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.168598 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-92.ec2.internal" event={"ID":"d4713dfd221608fbf94fe833ef06f7d0","Type":"ContainerStarted","Data":"fb22c08ba4af062ae356a1a6f757f32dc45515fcdc25a0aea1ffb9717a53a3eb"}
Apr 24 14:24:34.169591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.169574 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal" event={"ID":"213a7361b06e1d42b8e02764346ec94d","Type":"ContainerStarted","Data":"7395edbead65beebd137a4d8fb76126cbb6eccb2859a4d02d6e38499edcf351c"}
Apr 24 14:24:34.231852 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:34.231810 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-92.ec2.internal\" not found"
Apr 24 14:24:34.266254 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.266225 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:34.281464 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.281395 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:34.346105 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.346075 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal"
Apr 24 14:24:34.378819 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.378791 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 14:24:34.380119 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.380105 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-92.ec2.internal"
Apr 24 14:24:34.400284 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:34.400264 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 14:24:35.023435 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.023403 2574 apiserver.go:52] "Watching apiserver"
Apr 24 14:24:35.030908 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.030865 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 14:24:35.031261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.031242 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-143-92.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29","openshift-cluster-node-tuning-operator/tuned-65ggn","openshift-image-registry/node-ca-6xfh9","openshift-multus/multus-additional-cni-plugins-wqgmq","openshift-multus/network-metrics-daemon-ncpbf","openshift-network-diagnostics/network-check-target-c5dxc","openshift-ovn-kubernetes/ovnkube-node-hb7mf","kube-system/konnectivity-agent-hpqs6","openshift-dns/node-resolver-dkjpq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal","openshift-multus/multus-zzdsp","openshift-network-operator/iptables-alerter-h8gj8"]
Apr 24 14:24:35.033510 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.033483 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:35.033630 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.033562 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:24:35.034845 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.034821 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.036505 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.036477 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 14:24:35.036709 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.036484 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rxp5q\""
Apr 24 14:24:35.036709 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.036664 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 14:24:35.036899 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.036792 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 14:24:35.037425 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.037403 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.037518 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.037498 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6xfh9"
Apr 24 14:24:35.038776 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.038755 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.038944 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.038925 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 14:24:35.039329 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.039301 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 14:24:35.039459 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.039331 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 14:24:35.039459 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.039344 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 14:24:35.040307 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.039856 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6lvp6\""
Apr 24 14:24:35.040307 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.039969 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 14:24:35.040307 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.040185 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:35.040513 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.040399 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:24:35.042825 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.040896 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 14:24:35.042825 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.041482 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6brmq\""
Apr 24 14:24:35.042825 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.041577 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 14:24:35.042825 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.041750 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9fsgn\""
Apr 24 14:24:35.042825 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.041925 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 14:24:35.043482 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.043460 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 14:24:35.043590 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.043566 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 14:24:35.043689 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.043579 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.045262 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.045238 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hpqs6"
Apr 24 14:24:35.045751 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.045733 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 14:24:35.046171 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.045869 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 14:24:35.046171 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.045995 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l8zq2\""
Apr 24 14:24:35.046171 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.046032 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 14:24:35.046171 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.046086 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 14:24:35.046171 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.046106 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 14:24:35.046782 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.046763 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dkjpq"
Apr 24 14:24:35.047133 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.047108 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 14:24:35.047243 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.047188 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 14:24:35.047417 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.047398 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-pkpvs\""
Apr 24 14:24:35.048107 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.048089 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zzdsp"
Apr 24 14:24:35.048412 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.048393 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 14:24:35.048676 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.048659 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 14:24:35.048760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.048704 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-x9zq7\""
Apr 24 14:24:35.049089 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.049073 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 14:24:35.049598 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.049584 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h8gj8"
Apr 24 14:24:35.049938 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.049742 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-hzwhh\""
Apr 24 14:24:35.049938 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.049765 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 14:24:35.051201 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.051184 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dtb4c\""
Apr 24 14:24:35.051671 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.051550 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 14:24:35.051671 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.051590 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 14:24:35.051671 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.051650 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 14:24:35.056287 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056265 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-run\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.056387 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056297 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01aa8393-c538-4d6b-a611-6016be7c4a85-ovn-node-metrics-cert\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.056387 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056319 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cfa67663-2a88-4a6b-8d07-8d08e626c4f4-konnectivity-ca\") pod \"konnectivity-agent-hpqs6\" (UID: \"cfa67663-2a88-4a6b-8d07-8d08e626c4f4\") " pod="kube-system/konnectivity-agent-hpqs6"
Apr 24 14:24:35.056387 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056342 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-sysconfig\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.056532 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cfa67663-2a88-4a6b-8d07-8d08e626c4f4-agent-certs\") pod \"konnectivity-agent-hpqs6\" (UID: \"cfa67663-2a88-4a6b-8d07-8d08e626c4f4\") " pod="kube-system/konnectivity-agent-hpqs6"
Apr 24 14:24:35.056532 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056423 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96a778f3-6185-45b9-803d-403e973b65b9-serviceca\") pod \"node-ca-6xfh9\" (UID: \"96a778f3-6185-45b9-803d-403e973b65b9\") " pod="openshift-image-registry/node-ca-6xfh9"
Apr 24 14:24:35.056532 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/267d3672-b74b-4241-80b9-1467f130ddd8-hosts-file\") pod \"node-resolver-dkjpq\" (UID: \"267d3672-b74b-4241-80b9-1467f130ddd8\") " pod="openshift-dns/node-resolver-dkjpq"
Apr 24 14:24:35.056532 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056473 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-var-lib-kubelet\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.056532 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056501 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-run-netns\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.056532 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-var-lib-openvswitch\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.056815 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056573 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-etc-openvswitch\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.056815 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056598 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01aa8393-c538-4d6b-a611-6016be7c4a85-ovnkube-config\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.056815 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056692 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.056815 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056725 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/267d3672-b74b-4241-80b9-1467f130ddd8-tmp-dir\") pod \"node-resolver-dkjpq\" (UID: \"267d3672-b74b-4241-80b9-1467f130ddd8\") " pod="openshift-dns/node-resolver-dkjpq"
Apr 24 14:24:35.056815 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056748 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-sys-fs\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.056815 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056773 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzwc5\" (UniqueName: \"kubernetes.io/projected/691da68e-c7d6-471f-9dff-22a0097a806d-kube-api-access-mzwc5\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.056815 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-modprobe-d\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.057135 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056818 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-cni-netd\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.057135 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056840 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-etc-selinux\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.057135 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056860 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-host\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.057135 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056903 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:35.057135 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056927 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jzpp\" (UniqueName: \"kubernetes.io/projected/bee39eb9-c473-4f55-a88c-427f97349f6c-kube-api-access-8jzpp\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:35.057135 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056952 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01aa8393-c538-4d6b-a611-6016be7c4a85-env-overrides\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.057135 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.056972 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01aa8393-c538-4d6b-a611-6016be7c4a85-ovnkube-script-lib\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.057135 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-sysctl-conf\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.057135 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057056 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-lib-modules\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.057135 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057079 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/da3f377c-093d-4ab2-8473-7775d7420542-etc-tuned\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26n44\" (UniqueName: \"kubernetes.io/projected/96a778f3-6185-45b9-803d-403e973b65b9-kube-api-access-26n44\") pod \"node-ca-6xfh9\" (UID: \"96a778f3-6185-45b9-803d-403e973b65b9\") " pod="openshift-image-registry/node-ca-6xfh9"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057180 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-os-release\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057217 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-systemd-units\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057247 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96a778f3-6185-45b9-803d-403e973b65b9-host\") pod \"node-ca-6xfh9\" (UID: \"96a778f3-6185-45b9-803d-403e973b65b9\") " pod="openshift-image-registry/node-ca-6xfh9"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057274 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057298 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-slash\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057344 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-cni-bin\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057368 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-system-cni-dir\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057390 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-sysctl-d\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-systemd\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057440 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8qbm\" (UniqueName: \"kubernetes.io/projected/267d3672-b74b-4241-80b9-1467f130ddd8-kube-api-access-t8qbm\") pod \"node-resolver-dkjpq\" (UID: \"267d3672-b74b-4241-80b9-1467f130ddd8\") " pod="openshift-dns/node-resolver-dkjpq"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057490 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-device-dir\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057529 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm2rd\" (UniqueName: \"kubernetes.io/projected/da3f377c-093d-4ab2-8473-7775d7420542-kube-api-access-wm2rd\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.057591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057569 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-kubelet\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057614 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-run-openvswitch\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-run-ovn\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-sys\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5bkv\" (UniqueName: \"kubernetes.io/projected/01aa8393-c538-4d6b-a611-6016be7c4a85-kube-api-access-q5bkv\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057765 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j49gn\" (UniqueName: \"kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn\") pod \"network-check-target-c5dxc\" (UID: \"27ab69b2-98a6-4382-8921-0b4c9485c514\") "
pod="openshift-network-diagnostics/network-check-target-c5dxc" Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057790 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rghn6\" (UniqueName: \"kubernetes.io/projected/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-kube-api-access-rghn6\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq" Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057816 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-registration-dir\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29" Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057839 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-kubernetes\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn" Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057892 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-log-socket\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057929 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-run-ovn-kubernetes\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057962 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-socket-dir\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29" Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.057989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq" Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.058020 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/da3f377c-093d-4ab2-8473-7775d7420542-tmp\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn" Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.058044 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-run-systemd\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.058350 ip-10-0-143-92 
kubenswrapper[2574]: I0424 14:24:35.058065 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-node-log\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.058350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.058088 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.059218 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.058123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-cnibin\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq" Apr 24 14:24:35.088661 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.088599 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:19:34 +0000 UTC" deadline="2027-09-30 16:00:20.326621552 +0000 UTC" Apr 24 14:24:35.088661 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.088656 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12577h35m45.237993729s" Apr 24 14:24:35.147833 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.147794 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 14:24:35.158932 
ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.158900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/da3f377c-093d-4ab2-8473-7775d7420542-tmp\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn" Apr 24 14:24:35.158932 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.158938 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-run-systemd\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.159150 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.158961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-node-log\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.159150 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.158989 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.159150 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159016 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-cnibin\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " 
pod="openshift-multus/multus-additional-cni-plugins-wqgmq" Apr 24 14:24:35.159150 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159044 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-system-cni-dir\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.159150 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-node-log\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.159150 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159067 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-os-release\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.159150 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159090 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-run-k8s-cni-cncf-io\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.159150 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159100 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-run-systemd\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.159150 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159101 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-cnibin\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq" Apr 24 14:24:35.159150 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159117 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-run\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159177 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-run\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159245 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01aa8393-c538-4d6b-a611-6016be7c4a85-ovn-node-metrics-cert\") pod \"ovnkube-node-hb7mf\" (UID: 
\"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cfa67663-2a88-4a6b-8d07-8d08e626c4f4-konnectivity-ca\") pod \"konnectivity-agent-hpqs6\" (UID: \"cfa67663-2a88-4a6b-8d07-8d08e626c4f4\") " pod="kube-system/konnectivity-agent-hpqs6" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159313 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-var-lib-cni-multus\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159314 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159341 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zssvg\" (UniqueName: \"kubernetes.io/projected/afe77287-0ba9-4aaf-865a-6dc077e51a54-kube-api-access-zssvg\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159364 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b326316-4b11-4b19-9e26-40d9a7795d9b-host-slash\") pod \"iptables-alerter-h8gj8\" (UID: \"7b326316-4b11-4b19-9e26-40d9a7795d9b\") " pod="openshift-network-operator/iptables-alerter-h8gj8" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159388 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-sysconfig\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cfa67663-2a88-4a6b-8d07-8d08e626c4f4-agent-certs\") pod \"konnectivity-agent-hpqs6\" (UID: \"cfa67663-2a88-4a6b-8d07-8d08e626c4f4\") " pod="kube-system/konnectivity-agent-hpqs6" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159458 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-sysconfig\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159464 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96a778f3-6185-45b9-803d-403e973b65b9-serviceca\") pod \"node-ca-6xfh9\" (UID: \"96a778f3-6185-45b9-803d-403e973b65b9\") " pod="openshift-image-registry/node-ca-6xfh9" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159503 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/267d3672-b74b-4241-80b9-1467f130ddd8-hosts-file\") pod \"node-resolver-dkjpq\" (UID: \"267d3672-b74b-4241-80b9-1467f130ddd8\") " pod="openshift-dns/node-resolver-dkjpq" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159534 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7b326316-4b11-4b19-9e26-40d9a7795d9b-iptables-alerter-script\") pod \"iptables-alerter-h8gj8\" (UID: \"7b326316-4b11-4b19-9e26-40d9a7795d9b\") " pod="openshift-network-operator/iptables-alerter-h8gj8" Apr 24 14:24:35.159608 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159605 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/267d3672-b74b-4241-80b9-1467f130ddd8-hosts-file\") pod \"node-resolver-dkjpq\" (UID: \"267d3672-b74b-4241-80b9-1467f130ddd8\") " pod="openshift-dns/node-resolver-dkjpq" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159843 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-var-lib-kubelet\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159853 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cfa67663-2a88-4a6b-8d07-8d08e626c4f4-konnectivity-ca\") pod \"konnectivity-agent-hpqs6\" (UID: \"cfa67663-2a88-4a6b-8d07-8d08e626c4f4\") " pod="kube-system/konnectivity-agent-hpqs6" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159899 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-run-netns\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159929 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-multus-socket-dir-parent\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159966 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-var-lib-kubelet\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.159969 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-qsw8s\" (UniqueName: \"kubernetes.io/projected/7b326316-4b11-4b19-9e26-40d9a7795d9b-kube-api-access-qsw8s\") pod \"iptables-alerter-h8gj8\" (UID: \"7b326316-4b11-4b19-9e26-40d9a7795d9b\") " pod="openshift-network-operator/iptables-alerter-h8gj8" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160031 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-run-netns\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160062 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-var-lib-openvswitch\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160094 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-etc-openvswitch\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160119 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01aa8393-c538-4d6b-a611-6016be7c4a85-ovnkube-config\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160124 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-var-lib-openvswitch\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160146 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160170 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-etc-openvswitch\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/267d3672-b74b-4241-80b9-1467f130ddd8-tmp-dir\") pod \"node-resolver-dkjpq\" (UID: \"267d3672-b74b-4241-80b9-1467f130ddd8\") " pod="openshift-dns/node-resolver-dkjpq" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160199 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-multus-cni-dir\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160220 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-var-lib-cni-bin\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.160355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-sys-fs\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29" Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzwc5\" (UniqueName: \"kubernetes.io/projected/691da68e-c7d6-471f-9dff-22a0097a806d-kube-api-access-mzwc5\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29" Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160289 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-modprobe-d\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn" Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-cni-netd\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 
14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160358 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afe77287-0ba9-4aaf-865a-6dc077e51a54-cni-binary-copy\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160386 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-etc-selinux\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160409 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-host\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160459 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/267d3672-b74b-4241-80b9-1467f130ddd8-tmp-dir\") pod \"node-resolver-dkjpq\" (UID: \"267d3672-b74b-4241-80b9-1467f130ddd8\") " pod="openshift-dns/node-resolver-dkjpq"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160487 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jzpp\" (UniqueName: \"kubernetes.io/projected/bee39eb9-c473-4f55-a88c-427f97349f6c-kube-api-access-8jzpp\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160514 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01aa8393-c538-4d6b-a611-6016be7c4a85-env-overrides\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160543 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01aa8393-c538-4d6b-a611-6016be7c4a85-ovnkube-script-lib\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01aa8393-c538-4d6b-a611-6016be7c4a85-ovnkube-config\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160785 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-cni-netd\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160851 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-sys-fs\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.160996 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-host\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161078 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-etc-selinux\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.161195 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161173 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01aa8393-c538-4d6b-a611-6016be7c4a85-ovnkube-script-lib\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161192 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-modprobe-d\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161203 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-sysctl-conf\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161232 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-lib-modules\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161256 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/da3f377c-093d-4ab2-8473-7775d7420542-etc-tuned\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161282 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26n44\" (UniqueName: \"kubernetes.io/projected/96a778f3-6185-45b9-803d-403e973b65b9-kube-api-access-26n44\") pod \"node-ca-6xfh9\" (UID: \"96a778f3-6185-45b9-803d-403e973b65b9\") " pod="openshift-image-registry/node-ca-6xfh9"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.161294 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161307 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-os-release\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161336 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-systemd-units\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.161381 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs podName:bee39eb9-c473-4f55-a88c-427f97349f6c nodeName:}" failed. No retries permitted until 2026-04-24 14:24:35.6613453 +0000 UTC m=+3.099862161 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs") pod "network-metrics-daemon-ncpbf" (UID: "bee39eb9-c473-4f55-a88c-427f97349f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161384 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-systemd-units\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161401 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-sysctl-conf\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161403 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96a778f3-6185-45b9-803d-403e973b65b9-host\") pod \"node-ca-6xfh9\" (UID: \"96a778f3-6185-45b9-803d-403e973b65b9\") " pod="openshift-image-registry/node-ca-6xfh9"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96a778f3-6185-45b9-803d-403e973b65b9-host\") pod \"node-ca-6xfh9\" (UID: \"96a778f3-6185-45b9-803d-403e973b65b9\") " pod="openshift-image-registry/node-ca-6xfh9"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161443 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161474 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-multus-conf-dir\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp"
Apr 24 14:24:35.162003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161503 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-slash\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-cni-bin\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161579 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-system-cni-dir\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161604 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-run-netns\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161629 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-hostroot\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161657 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-etc-kubernetes\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-sysctl-d\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-systemd\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8qbm\" (UniqueName: \"kubernetes.io/projected/267d3672-b74b-4241-80b9-1467f130ddd8-kube-api-access-t8qbm\") pod \"node-resolver-dkjpq\" (UID: \"267d3672-b74b-4241-80b9-1467f130ddd8\") " pod="openshift-dns/node-resolver-dkjpq"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96a778f3-6185-45b9-803d-403e973b65b9-serviceca\") pod \"node-ca-6xfh9\" (UID: \"96a778f3-6185-45b9-803d-403e973b65b9\") " pod="openshift-image-registry/node-ca-6xfh9"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161809 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-cnibin\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161850 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-device-dir\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161863 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-slash\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161896 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wm2rd\" (UniqueName: \"kubernetes.io/projected/da3f377c-093d-4ab2-8473-7775d7420542-kube-api-access-wm2rd\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-kubelet\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.162837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161923 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-cni-bin\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-run-openvswitch\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161961 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-system-cni-dir\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-run-ovn\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.161996 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-run-ovn\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-var-lib-kubelet\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162030 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-sys\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162042 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-device-dir\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162055 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5bkv\" (UniqueName: \"kubernetes.io/projected/01aa8393-c538-4d6b-a611-6016be7c4a85-kube-api-access-q5bkv\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162057 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162084 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j49gn\" (UniqueName: \"kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn\") pod \"network-check-target-c5dxc\" (UID: \"27ab69b2-98a6-4382-8921-0b4c9485c514\") " pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01aa8393-c538-4d6b-a611-6016be7c4a85-env-overrides\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rghn6\" (UniqueName: \"kubernetes.io/projected/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-kube-api-access-rghn6\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162142 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-registration-dir\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162166 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-lib-modules\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162170 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-kubernetes\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-log-socket\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.163644 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-kubernetes\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-run-ovn-kubernetes\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/afe77287-0ba9-4aaf-865a-6dc077e51a54-multus-daemon-config\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162272 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-kubelet\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162296 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-run-multus-certs\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162327 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-socket-dir\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162336 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-systemd\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162418 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-etc-sysctl-d\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162516 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-log-socket\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162557 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-host-run-ovn-kubernetes\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162639 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/da3f377c-093d-4ab2-8473-7775d7420542-sys\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162684 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01aa8393-c538-4d6b-a611-6016be7c4a85-run-openvswitch\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-registration-dir\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162722 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162770 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.164261 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162816 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/691da68e-c7d6-471f-9dff-22a0097a806d-socket-dir\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.164835 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-os-release\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.164835 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.162921 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01aa8393-c538-4d6b-a611-6016be7c4a85-ovn-node-metrics-cert\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.164835 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.163079 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cfa67663-2a88-4a6b-8d07-8d08e626c4f4-agent-certs\") pod \"konnectivity-agent-hpqs6\" (UID: \"cfa67663-2a88-4a6b-8d07-8d08e626c4f4\") " pod="kube-system/konnectivity-agent-hpqs6"
Apr 24 14:24:35.164835 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.163974 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/da3f377c-093d-4ab2-8473-7775d7420542-tmp\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.164995 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.164930 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/da3f377c-093d-4ab2-8473-7775d7420542-etc-tuned\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.168593 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.168569 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:24:35.168717 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.168596 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:24:35.168717 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.168610 2574 projected.go:194] Error preparing data for projected volume kube-api-access-j49gn for pod openshift-network-diagnostics/network-check-target-c5dxc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:35.168717 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.168683 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn podName:27ab69b2-98a6-4382-8921-0b4c9485c514 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:35.668662887 +0000 UTC m=+3.107179753 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-j49gn" (UniqueName: "kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn") pod "network-check-target-c5dxc" (UID: "27ab69b2-98a6-4382-8921-0b4c9485c514") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:35.169802 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.169777 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzwc5\" (UniqueName: \"kubernetes.io/projected/691da68e-c7d6-471f-9dff-22a0097a806d-kube-api-access-mzwc5\") pod \"aws-ebs-csi-driver-node-g8f29\" (UID: \"691da68e-c7d6-471f-9dff-22a0097a806d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29"
Apr 24 14:24:35.169919 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.169856 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jzpp\" (UniqueName: \"kubernetes.io/projected/bee39eb9-c473-4f55-a88c-427f97349f6c-kube-api-access-8jzpp\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:35.171981 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.171952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8qbm\" (UniqueName: \"kubernetes.io/projected/267d3672-b74b-4241-80b9-1467f130ddd8-kube-api-access-t8qbm\") pod \"node-resolver-dkjpq\" (UID: \"267d3672-b74b-4241-80b9-1467f130ddd8\") " pod="openshift-dns/node-resolver-dkjpq"
Apr 24 14:24:35.172170 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.172143 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26n44\" (UniqueName: \"kubernetes.io/projected/96a778f3-6185-45b9-803d-403e973b65b9-kube-api-access-26n44\") pod \"node-ca-6xfh9\" (UID: \"96a778f3-6185-45b9-803d-403e973b65b9\") " pod="openshift-image-registry/node-ca-6xfh9"
Apr 24 14:24:35.172492 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.172474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rghn6\" (UniqueName: \"kubernetes.io/projected/dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef-kube-api-access-rghn6\") pod \"multus-additional-cni-plugins-wqgmq\" (UID: \"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef\") " pod="openshift-multus/multus-additional-cni-plugins-wqgmq"
Apr 24 14:24:35.172634 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.172615 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm2rd\" (UniqueName: \"kubernetes.io/projected/da3f377c-093d-4ab2-8473-7775d7420542-kube-api-access-wm2rd\") pod \"tuned-65ggn\" (UID: \"da3f377c-093d-4ab2-8473-7775d7420542\") " pod="openshift-cluster-node-tuning-operator/tuned-65ggn"
Apr 24 14:24:35.172766 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.172741 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5bkv\" (UniqueName: \"kubernetes.io/projected/01aa8393-c538-4d6b-a611-6016be7c4a85-kube-api-access-q5bkv\") pod \"ovnkube-node-hb7mf\" (UID: \"01aa8393-c538-4d6b-a611-6016be7c4a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:35.191059 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.191032 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:35.262950 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.262916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-run-k8s-cni-cncf-io\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp"
Apr 24 14:24:35.263128 ip-10-0-143-92
kubenswrapper[2574]: I0424 14:24:35.262963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-var-lib-cni-multus\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263128 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.262991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zssvg\" (UniqueName: \"kubernetes.io/projected/afe77287-0ba9-4aaf-865a-6dc077e51a54-kube-api-access-zssvg\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263128 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263018 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b326316-4b11-4b19-9e26-40d9a7795d9b-host-slash\") pod \"iptables-alerter-h8gj8\" (UID: \"7b326316-4b11-4b19-9e26-40d9a7795d9b\") " pod="openshift-network-operator/iptables-alerter-h8gj8" Apr 24 14:24:35.263128 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-run-k8s-cni-cncf-io\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263128 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263045 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7b326316-4b11-4b19-9e26-40d9a7795d9b-iptables-alerter-script\") pod \"iptables-alerter-h8gj8\" (UID: \"7b326316-4b11-4b19-9e26-40d9a7795d9b\") " pod="openshift-network-operator/iptables-alerter-h8gj8" Apr 24 
14:24:35.263128 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263107 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-var-lib-cni-multus\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263128 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263113 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-multus-socket-dir-parent\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263360 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263146 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsw8s\" (UniqueName: \"kubernetes.io/projected/7b326316-4b11-4b19-9e26-40d9a7795d9b-kube-api-access-qsw8s\") pod \"iptables-alerter-h8gj8\" (UID: \"7b326316-4b11-4b19-9e26-40d9a7795d9b\") " pod="openshift-network-operator/iptables-alerter-h8gj8" Apr 24 14:24:35.263360 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b326316-4b11-4b19-9e26-40d9a7795d9b-host-slash\") pod \"iptables-alerter-h8gj8\" (UID: \"7b326316-4b11-4b19-9e26-40d9a7795d9b\") " pod="openshift-network-operator/iptables-alerter-h8gj8" Apr 24 14:24:35.263360 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-multus-cni-dir\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 
14:24:35.263360 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-var-lib-cni-bin\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263360 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263212 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-multus-socket-dir-parent\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263360 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263235 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afe77287-0ba9-4aaf-865a-6dc077e51a54-cni-binary-copy\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263360 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263279 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-multus-conf-dir\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263360 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-run-netns\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263360 ip-10-0-143-92 kubenswrapper[2574]: I0424 
14:24:35.263311 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-hostroot\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263360 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263326 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-etc-kubernetes\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263360 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263344 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-cnibin\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263360 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263361 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-var-lib-kubelet\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263389 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/afe77287-0ba9-4aaf-865a-6dc077e51a54-multus-daemon-config\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263391 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-multus-conf-dir\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263404 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-run-multus-certs\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263425 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-system-cni-dir\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-os-release\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263449 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-multus-cni-dir\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-os-release\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263509 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-cnibin\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-var-lib-kubelet\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263567 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-run-netns\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263583 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7b326316-4b11-4b19-9e26-40d9a7795d9b-iptables-alerter-script\") pod \"iptables-alerter-h8gj8\" (UID: \"7b326316-4b11-4b19-9e26-40d9a7795d9b\") " pod="openshift-network-operator/iptables-alerter-h8gj8" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-var-lib-cni-bin\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263637 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-host-run-multus-certs\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263653 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-hostroot\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-etc-kubernetes\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.263744 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263714 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afe77287-0ba9-4aaf-865a-6dc077e51a54-system-cni-dir\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.264292 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263951 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afe77287-0ba9-4aaf-865a-6dc077e51a54-cni-binary-copy\") pod \"multus-zzdsp\" (UID: 
\"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.264292 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.263991 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/afe77287-0ba9-4aaf-865a-6dc077e51a54-multus-daemon-config\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.272154 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.272116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsw8s\" (UniqueName: \"kubernetes.io/projected/7b326316-4b11-4b19-9e26-40d9a7795d9b-kube-api-access-qsw8s\") pod \"iptables-alerter-h8gj8\" (UID: \"7b326316-4b11-4b19-9e26-40d9a7795d9b\") " pod="openshift-network-operator/iptables-alerter-h8gj8" Apr 24 14:24:35.272574 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.272553 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zssvg\" (UniqueName: \"kubernetes.io/projected/afe77287-0ba9-4aaf-865a-6dc077e51a54-kube-api-access-zssvg\") pod \"multus-zzdsp\" (UID: \"afe77287-0ba9-4aaf-865a-6dc077e51a54\") " pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.355221 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.355142 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29" Apr 24 14:24:35.362000 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.361972 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-65ggn" Apr 24 14:24:35.369691 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.369664 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6xfh9" Apr 24 14:24:35.374307 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.374285 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wqgmq" Apr 24 14:24:35.381003 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.380979 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" Apr 24 14:24:35.387625 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.387604 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hpqs6" Apr 24 14:24:35.393159 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.393133 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dkjpq" Apr 24 14:24:35.400799 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.400776 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zzdsp" Apr 24 14:24:35.407398 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.407377 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-h8gj8" Apr 24 14:24:35.493228 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.493194 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:35.665896 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.665801 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf" Apr 24 14:24:35.666041 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.665985 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:35.666100 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.666060 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs podName:bee39eb9-c473-4f55-a88c-427f97349f6c nodeName:}" failed. No retries permitted until 2026-04-24 14:24:36.666040118 +0000 UTC m=+4.104556957 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs") pod "network-metrics-daemon-ncpbf" (UID: "bee39eb9-c473-4f55-a88c-427f97349f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:35.766347 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:35.766314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j49gn\" (UniqueName: \"kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn\") pod \"network-check-target-c5dxc\" (UID: \"27ab69b2-98a6-4382-8921-0b4c9485c514\") " pod="openshift-network-diagnostics/network-check-target-c5dxc" Apr 24 14:24:35.766504 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.766487 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:35.766545 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.766510 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:35.766545 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.766521 2574 projected.go:194] Error preparing data for projected volume kube-api-access-j49gn for pod openshift-network-diagnostics/network-check-target-c5dxc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:35.766613 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:35.766576 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn podName:27ab69b2-98a6-4382-8921-0b4c9485c514 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:36.766561251 +0000 UTC m=+4.205078089 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-j49gn" (UniqueName: "kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn") pod "network-check-target-c5dxc" (UID: "27ab69b2-98a6-4382-8921-0b4c9485c514") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:35.785896 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:35.785700 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3f377c_093d_4ab2_8473_7775d7420542.slice/crio-919904f1157dc91414694cf3d11f7b0710bb061b6e525c8d9dbe48df074f25d5 WatchSource:0}: Error finding container 919904f1157dc91414694cf3d11f7b0710bb061b6e525c8d9dbe48df074f25d5: Status 404 returned error can't find the container with id 919904f1157dc91414694cf3d11f7b0710bb061b6e525c8d9dbe48df074f25d5 Apr 24 14:24:35.788135 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:35.788105 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa67663_2a88_4a6b_8d07_8d08e626c4f4.slice/crio-ef140dfe7774d0f32ab35390ffef2cda584e5bf08a240184af12b58fe1eb3e74 WatchSource:0}: Error finding container ef140dfe7774d0f32ab35390ffef2cda584e5bf08a240184af12b58fe1eb3e74: Status 404 returned error can't find the container with id ef140dfe7774d0f32ab35390ffef2cda584e5bf08a240184af12b58fe1eb3e74 Apr 24 14:24:35.789992 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:35.789971 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a778f3_6185_45b9_803d_403e973b65b9.slice/crio-59fb5b08890537a61ecaba72d7f9243f2c9062d27aba58439262a6065f461395 WatchSource:0}: Error finding container 
59fb5b08890537a61ecaba72d7f9243f2c9062d27aba58439262a6065f461395: Status 404 returned error can't find the container with id 59fb5b08890537a61ecaba72d7f9243f2c9062d27aba58439262a6065f461395 Apr 24 14:24:35.792526 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:35.791146 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b326316_4b11_4b19_9e26_40d9a7795d9b.slice/crio-3f4c84dca460f04aa851f43dd28f5e8bef0ccdeb208a9c3b18bf7bba6526d47c WatchSource:0}: Error finding container 3f4c84dca460f04aa851f43dd28f5e8bef0ccdeb208a9c3b18bf7bba6526d47c: Status 404 returned error can't find the container with id 3f4c84dca460f04aa851f43dd28f5e8bef0ccdeb208a9c3b18bf7bba6526d47c Apr 24 14:24:35.794137 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:35.794112 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod267d3672_b74b_4241_80b9_1467f130ddd8.slice/crio-8bbf1b8efd77eab1e11bb7da692ea5e2e5a42065000d531c331a04c7519abc18 WatchSource:0}: Error finding container 8bbf1b8efd77eab1e11bb7da692ea5e2e5a42065000d531c331a04c7519abc18: Status 404 returned error can't find the container with id 8bbf1b8efd77eab1e11bb7da692ea5e2e5a42065000d531c331a04c7519abc18 Apr 24 14:24:35.796187 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:35.796068 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafe77287_0ba9_4aaf_865a_6dc077e51a54.slice/crio-5f2dfe8f37aab58bb34f8bebdc0ebca4e74a22befd5156a16731969b2745fcef WatchSource:0}: Error finding container 5f2dfe8f37aab58bb34f8bebdc0ebca4e74a22befd5156a16731969b2745fcef: Status 404 returned error can't find the container with id 5f2dfe8f37aab58bb34f8bebdc0ebca4e74a22befd5156a16731969b2745fcef Apr 24 14:24:35.797059 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:35.797026 2574 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbfdd18f_56c4_4ff4_9933_f7b18e2d71ef.slice/crio-2725c858f681c31a48930557d6de672a6851f591ef89536ff030681ed4aa6f15 WatchSource:0}: Error finding container 2725c858f681c31a48930557d6de672a6851f591ef89536ff030681ed4aa6f15: Status 404 returned error can't find the container with id 2725c858f681c31a48930557d6de672a6851f591ef89536ff030681ed4aa6f15
Apr 24 14:24:35.797698 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:35.797673 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01aa8393_c538_4d6b_a611_6016be7c4a85.slice/crio-bd8978d7769acdef97b76b569852cc5a68bdef51a3452326f8732a198e82eda3 WatchSource:0}: Error finding container bd8978d7769acdef97b76b569852cc5a68bdef51a3452326f8732a198e82eda3: Status 404 returned error can't find the container with id bd8978d7769acdef97b76b569852cc5a68bdef51a3452326f8732a198e82eda3
Apr 24 14:24:35.798507 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:24:35.798477 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod691da68e_c7d6_471f_9dff_22a0097a806d.slice/crio-d7fcb0694bf013de715d81eba9a7abfeb0e97afd72ac05a0e2c518bdc757be52 WatchSource:0}: Error finding container d7fcb0694bf013de715d81eba9a7abfeb0e97afd72ac05a0e2c518bdc757be52: Status 404 returned error can't find the container with id d7fcb0694bf013de715d81eba9a7abfeb0e97afd72ac05a0e2c518bdc757be52
Apr 24 14:24:36.089102 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.088986 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:19:34 +0000 UTC" deadline="2027-11-25 10:26:45.929472846 +0000 UTC"
Apr 24 14:24:36.089102 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.089028 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13916h2m9.840448871s"
Apr 24 14:24:36.177557 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.177497 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-92.ec2.internal" event={"ID":"d4713dfd221608fbf94fe833ef06f7d0","Type":"ContainerStarted","Data":"78209f21dc7f184a13c5c21fa205de01f59f0da08606b68b9987b227fd05313b"}
Apr 24 14:24:36.181864 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.181552 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29" event={"ID":"691da68e-c7d6-471f-9dff-22a0097a806d","Type":"ContainerStarted","Data":"d7fcb0694bf013de715d81eba9a7abfeb0e97afd72ac05a0e2c518bdc757be52"}
Apr 24 14:24:36.184248 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.184219 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqgmq" event={"ID":"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef","Type":"ContainerStarted","Data":"2725c858f681c31a48930557d6de672a6851f591ef89536ff030681ed4aa6f15"}
Apr 24 14:24:36.188541 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.188464 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zzdsp" event={"ID":"afe77287-0ba9-4aaf-865a-6dc077e51a54","Type":"ContainerStarted","Data":"5f2dfe8f37aab58bb34f8bebdc0ebca4e74a22befd5156a16731969b2745fcef"}
Apr 24 14:24:36.191247 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.191160 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-65ggn" event={"ID":"da3f377c-093d-4ab2-8473-7775d7420542","Type":"ContainerStarted","Data":"919904f1157dc91414694cf3d11f7b0710bb061b6e525c8d9dbe48df074f25d5"}
Apr 24 14:24:36.194466 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.194437 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" event={"ID":"01aa8393-c538-4d6b-a611-6016be7c4a85","Type":"ContainerStarted","Data":"bd8978d7769acdef97b76b569852cc5a68bdef51a3452326f8732a198e82eda3"}
Apr 24 14:24:36.196353 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.196326 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dkjpq" event={"ID":"267d3672-b74b-4241-80b9-1467f130ddd8","Type":"ContainerStarted","Data":"8bbf1b8efd77eab1e11bb7da692ea5e2e5a42065000d531c331a04c7519abc18"}
Apr 24 14:24:36.198256 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.198207 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h8gj8" event={"ID":"7b326316-4b11-4b19-9e26-40d9a7795d9b","Type":"ContainerStarted","Data":"3f4c84dca460f04aa851f43dd28f5e8bef0ccdeb208a9c3b18bf7bba6526d47c"}
Apr 24 14:24:36.200128 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.200100 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6xfh9" event={"ID":"96a778f3-6185-45b9-803d-403e973b65b9","Type":"ContainerStarted","Data":"59fb5b08890537a61ecaba72d7f9243f2c9062d27aba58439262a6065f461395"}
Apr 24 14:24:36.203622 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.203597 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hpqs6" event={"ID":"cfa67663-2a88-4a6b-8d07-8d08e626c4f4","Type":"ContainerStarted","Data":"ef140dfe7774d0f32ab35390ffef2cda584e5bf08a240184af12b58fe1eb3e74"}
Apr 24 14:24:36.673323 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.672695 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:36.673323 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:36.672915 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:36.673323 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:36.672977 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs podName:bee39eb9-c473-4f55-a88c-427f97349f6c nodeName:}" failed. No retries permitted until 2026-04-24 14:24:38.672958228 +0000 UTC m=+6.111475084 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs") pod "network-metrics-daemon-ncpbf" (UID: "bee39eb9-c473-4f55-a88c-427f97349f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:36.773557 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:36.773516 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j49gn\" (UniqueName: \"kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn\") pod \"network-check-target-c5dxc\" (UID: \"27ab69b2-98a6-4382-8921-0b4c9485c514\") " pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:36.773752 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:36.773727 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:24:36.773752 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:36.773747 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:24:36.773853 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:36.773760 2574 projected.go:194] Error preparing data for projected volume kube-api-access-j49gn for pod openshift-network-diagnostics/network-check-target-c5dxc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:36.773853 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:36.773820 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn podName:27ab69b2-98a6-4382-8921-0b4c9485c514 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:38.773801331 +0000 UTC m=+6.212318172 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-j49gn" (UniqueName: "kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn") pod "network-check-target-c5dxc" (UID: "27ab69b2-98a6-4382-8921-0b4c9485c514") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:37.167101 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.167067 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:37.167600 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:37.167205 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:24:37.167600 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.167255 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:37.167600 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:37.167378 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:24:37.221308 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.220033 2574 generic.go:358] "Generic (PLEG): container finished" podID="213a7361b06e1d42b8e02764346ec94d" containerID="197cf15662e9f0c257794eabcc240f5fa921bf713e4194025843fa528aad1d9d" exitCode=0
Apr 24 14:24:37.221308 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.221045 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal" event={"ID":"213a7361b06e1d42b8e02764346ec94d","Type":"ContainerDied","Data":"197cf15662e9f0c257794eabcc240f5fa921bf713e4194025843fa528aad1d9d"}
Apr 24 14:24:37.235870 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.235807 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-92.ec2.internal" podStartSLOduration=3.235783949 podStartE2EDuration="3.235783949s" podCreationTimestamp="2026-04-24 14:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:36.190532115 +0000 UTC m=+3.629048975" watchObservedRunningTime="2026-04-24 14:24:37.235783949 +0000 UTC m=+4.674300814"
Apr 24 14:24:37.501508 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.501421 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hpfkc"]
Apr 24 14:24:37.504056 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.504030 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:37.504197 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:37.504114 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed"
Apr 24 14:24:37.581869 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.581767 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e644e633-8a12-412b-b1a2-812f9fe784ed-kubelet-config\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:37.582056 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.582034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:37.582121 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.582101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e644e633-8a12-412b-b1a2-812f9fe784ed-dbus\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:37.682999 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.682958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e644e633-8a12-412b-b1a2-812f9fe784ed-dbus\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:37.683191 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.683024 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e644e633-8a12-412b-b1a2-812f9fe784ed-kubelet-config\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:37.683191 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.683095 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:37.683306 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:37.683240 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:37.683357 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:37.683305 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret podName:e644e633-8a12-412b-b1a2-812f9fe784ed nodeName:}" failed. No retries permitted until 2026-04-24 14:24:38.183286824 +0000 UTC m=+5.621803667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret") pod "global-pull-secret-syncer-hpfkc" (UID: "e644e633-8a12-412b-b1a2-812f9fe784ed") : object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:37.683691 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.683670 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e644e633-8a12-412b-b1a2-812f9fe784ed-dbus\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:37.683759 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:37.683742 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e644e633-8a12-412b-b1a2-812f9fe784ed-kubelet-config\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:38.188395 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:38.188352 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:38.188937 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:38.188509 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:38.188937 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:38.188585 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret podName:e644e633-8a12-412b-b1a2-812f9fe784ed nodeName:}" failed. No retries permitted until 2026-04-24 14:24:39.188565447 +0000 UTC m=+6.627082301 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret") pod "global-pull-secret-syncer-hpfkc" (UID: "e644e633-8a12-412b-b1a2-812f9fe784ed") : object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:38.226642 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:38.226603 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal" event={"ID":"213a7361b06e1d42b8e02764346ec94d","Type":"ContainerStarted","Data":"839fe2a141b97734456fd9c18133ef0a95a7f9246f49c9b89289e08e04acbc94"}
Apr 24 14:24:38.240347 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:38.240287 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-92.ec2.internal" podStartSLOduration=4.240267808 podStartE2EDuration="4.240267808s" podCreationTimestamp="2026-04-24 14:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:38.240150799 +0000 UTC m=+5.678667660" watchObservedRunningTime="2026-04-24 14:24:38.240267808 +0000 UTC m=+5.678784666"
Apr 24 14:24:38.696082 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:38.695428 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:38.696082 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:38.695645 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:38.696082 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:38.695710 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs podName:bee39eb9-c473-4f55-a88c-427f97349f6c nodeName:}" failed. No retries permitted until 2026-04-24 14:24:42.695691726 +0000 UTC m=+10.134208571 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs") pod "network-metrics-daemon-ncpbf" (UID: "bee39eb9-c473-4f55-a88c-427f97349f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:38.796631 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:38.796592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j49gn\" (UniqueName: \"kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn\") pod \"network-check-target-c5dxc\" (UID: \"27ab69b2-98a6-4382-8921-0b4c9485c514\") " pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:38.796809 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:38.796788 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:24:38.796869 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:38.796811 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:24:38.796869 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:38.796824 2574 projected.go:194] Error preparing data for projected volume kube-api-access-j49gn for pod openshift-network-diagnostics/network-check-target-c5dxc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:38.796999 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:38.796911 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn podName:27ab69b2-98a6-4382-8921-0b4c9485c514 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:42.79689127 +0000 UTC m=+10.235408123 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-j49gn" (UniqueName: "kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn") pod "network-check-target-c5dxc" (UID: "27ab69b2-98a6-4382-8921-0b4c9485c514") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:39.165870 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:39.165769 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:39.166055 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:39.165968 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:24:39.166375 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:39.165769 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:39.166375 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:39.166285 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:24:39.166519 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:39.166398 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:39.166519 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:39.166507 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed"
Apr 24 14:24:39.200225 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:39.200170 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:39.200686 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:39.200367 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:39.200686 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:39.200464 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret podName:e644e633-8a12-412b-b1a2-812f9fe784ed nodeName:}" failed. No retries permitted until 2026-04-24 14:24:41.200430855 +0000 UTC m=+8.638947697 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret") pod "global-pull-secret-syncer-hpfkc" (UID: "e644e633-8a12-412b-b1a2-812f9fe784ed") : object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:41.165446 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:41.165409 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:41.165919 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:41.165544 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed"
Apr 24 14:24:41.165919 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:41.165586 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:41.165919 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:41.165652 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:24:41.165919 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:41.165691 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:41.165919 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:41.165768 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:24:41.220029 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:41.219928 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:41.220218 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:41.220099 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:41.220218 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:41.220168 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret podName:e644e633-8a12-412b-b1a2-812f9fe784ed nodeName:}" failed. No retries permitted until 2026-04-24 14:24:45.220148631 +0000 UTC m=+12.658665473 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret") pod "global-pull-secret-syncer-hpfkc" (UID: "e644e633-8a12-412b-b1a2-812f9fe784ed") : object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:42.733055 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:42.733013 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:42.733633 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:42.733168 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:42.733633 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:42.733243 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs podName:bee39eb9-c473-4f55-a88c-427f97349f6c nodeName:}" failed. No retries permitted until 2026-04-24 14:24:50.733226591 +0000 UTC m=+18.171743429 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs") pod "network-metrics-daemon-ncpbf" (UID: "bee39eb9-c473-4f55-a88c-427f97349f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:42.833557 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:42.833511 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j49gn\" (UniqueName: \"kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn\") pod \"network-check-target-c5dxc\" (UID: \"27ab69b2-98a6-4382-8921-0b4c9485c514\") " pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:42.833745 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:42.833727 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:24:42.833808 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:42.833754 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:24:42.833808 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:42.833768 2574 projected.go:194] Error preparing data for projected volume kube-api-access-j49gn for pod openshift-network-diagnostics/network-check-target-c5dxc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:42.833929 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:42.833835 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn podName:27ab69b2-98a6-4382-8921-0b4c9485c514 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:50.833815092 +0000 UTC m=+18.272331934 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-j49gn" (UniqueName: "kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn") pod "network-check-target-c5dxc" (UID: "27ab69b2-98a6-4382-8921-0b4c9485c514") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:43.169655 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:43.169623 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:43.169851 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:43.169745 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed"
Apr 24 14:24:43.169851 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:43.169763 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:43.169851 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:43.169845 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:24:43.170082 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:43.169913 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:43.170082 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:43.169975 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:24:45.166004 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:45.165966 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:45.166469 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:45.166099 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:24:45.166469 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:45.165966 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:45.166469 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:45.166181 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:24:45.166469 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:45.165966 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:45.166469 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:45.166259 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed"
Apr 24 14:24:45.253960 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:45.253923 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:45.254150 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:45.254059 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:45.254150 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:45.254125 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret podName:e644e633-8a12-412b-b1a2-812f9fe784ed nodeName:}" failed. No retries permitted until 2026-04-24 14:24:53.254110928 +0000 UTC m=+20.692627766 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret") pod "global-pull-secret-syncer-hpfkc" (UID: "e644e633-8a12-412b-b1a2-812f9fe784ed") : object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:47.166344 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:47.166255 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:47.166344 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:47.166300 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:47.166833 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:47.166252 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:47.166833 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:47.166386 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:24:47.166833 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:47.166479 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:24:47.166833 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:47.166592 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed"
Apr 24 14:24:49.166217 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:49.166178 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:49.166695 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:49.166178 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc" Apr 24 14:24:49.166695 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:49.166317 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed" Apr 24 14:24:49.166695 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:49.166201 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf" Apr 24 14:24:49.166695 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:49.166389 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514" Apr 24 14:24:49.166695 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:49.166458 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c" Apr 24 14:24:50.787932 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:50.787875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf" Apr 24 14:24:50.788430 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:50.788054 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:50.788430 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:50.788136 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs podName:bee39eb9-c473-4f55-a88c-427f97349f6c nodeName:}" failed. No retries permitted until 2026-04-24 14:25:06.788118945 +0000 UTC m=+34.226635786 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs") pod "network-metrics-daemon-ncpbf" (UID: "bee39eb9-c473-4f55-a88c-427f97349f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:50.889104 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:50.889060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j49gn\" (UniqueName: \"kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn\") pod \"network-check-target-c5dxc\" (UID: \"27ab69b2-98a6-4382-8921-0b4c9485c514\") " pod="openshift-network-diagnostics/network-check-target-c5dxc" Apr 24 14:24:50.889297 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:50.889241 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:50.889297 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:50.889259 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:50.889297 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:50.889271 2574 projected.go:194] Error preparing data for projected volume kube-api-access-j49gn for pod openshift-network-diagnostics/network-check-target-c5dxc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:50.889444 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:50.889327 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn podName:27ab69b2-98a6-4382-8921-0b4c9485c514 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:25:06.889310094 +0000 UTC m=+34.327826936 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-j49gn" (UniqueName: "kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn") pod "network-check-target-c5dxc" (UID: "27ab69b2-98a6-4382-8921-0b4c9485c514") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:51.165995 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:51.165960 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc" Apr 24 14:24:51.166192 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:51.165964 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc" Apr 24 14:24:51.166192 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:51.166087 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514" Apr 24 14:24:51.166302 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:51.166193 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed" Apr 24 14:24:51.166302 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:51.165964 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf" Apr 24 14:24:51.166399 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:51.166309 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c" Apr 24 14:24:53.166765 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.166584 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc" Apr 24 14:24:53.167406 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.166675 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf" Apr 24 14:24:53.167406 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:53.166853 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed" Apr 24 14:24:53.167406 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.166702 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc" Apr 24 14:24:53.167406 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:53.166992 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c" Apr 24 14:24:53.167406 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:53.167016 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514" Apr 24 14:24:53.252519 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.252489 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29" event={"ID":"691da68e-c7d6-471f-9dff-22a0097a806d","Type":"ContainerStarted","Data":"511a658dd8ae77d8a286372b97a284490975905b1fdb21a9a611dd1f9fbbd71b"} Apr 24 14:24:53.253874 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.253849 2574 generic.go:358] "Generic (PLEG): container finished" podID="dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef" containerID="fbf598cdbd33aa06aa9b41089879e89e1e609e1327997a8ff48baa9e47fdda4d" exitCode=0 Apr 24 14:24:53.253984 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.253916 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqgmq" 
event={"ID":"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef","Type":"ContainerDied","Data":"fbf598cdbd33aa06aa9b41089879e89e1e609e1327997a8ff48baa9e47fdda4d"} Apr 24 14:24:53.255272 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.255199 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zzdsp" event={"ID":"afe77287-0ba9-4aaf-865a-6dc077e51a54","Type":"ContainerStarted","Data":"2e1d6ddd7ee208feccd8531e8f9452ce56e2ed0a6b052a207f0f4f40f7d6e6c1"} Apr 24 14:24:53.256414 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.256393 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-65ggn" event={"ID":"da3f377c-093d-4ab2-8473-7775d7420542","Type":"ContainerStarted","Data":"e63a1290943fe79292566bfadc13ced53df5c2720569500612ced31e22fee992"} Apr 24 14:24:53.259014 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.258989 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" event={"ID":"01aa8393-c538-4d6b-a611-6016be7c4a85","Type":"ContainerStarted","Data":"fa9dfe7a7d68e03413554ab1210da05c30e002e2b16ac80942828a8b5b946049"} Apr 24 14:24:53.259090 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.259021 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" event={"ID":"01aa8393-c538-4d6b-a611-6016be7c4a85","Type":"ContainerStarted","Data":"0e28157e9ca9387b08c7d101322f061e28c3835bc3272b4e024eec35a1ab64b3"} Apr 24 14:24:53.259090 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.259035 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" event={"ID":"01aa8393-c538-4d6b-a611-6016be7c4a85","Type":"ContainerStarted","Data":"d8db64603d6029b32b859cf357db0e33d42a0aaa59a2ba51858448f33ace7ab9"} Apr 24 14:24:53.259090 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.259046 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" event={"ID":"01aa8393-c538-4d6b-a611-6016be7c4a85","Type":"ContainerStarted","Data":"c2e67cfd9bc7fdb458a80a375c0ee97f4efaed48b4733abaf61f4769755073e4"} Apr 24 14:24:53.259090 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.259058 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" event={"ID":"01aa8393-c538-4d6b-a611-6016be7c4a85","Type":"ContainerStarted","Data":"9cd7cee41671cec2cbb8e940c57b1322bdb13252b5efe0e36384ccfee54d7ac2"} Apr 24 14:24:53.260386 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.260358 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dkjpq" event={"ID":"267d3672-b74b-4241-80b9-1467f130ddd8","Type":"ContainerStarted","Data":"03497889e7856d74c7a9396cb0e92cda08ef8ffb386db52b74041a64575b3b61"} Apr 24 14:24:53.261562 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.261543 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6xfh9" event={"ID":"96a778f3-6185-45b9-803d-403e973b65b9","Type":"ContainerStarted","Data":"31db7e8cc4837f081b68c31ac32190457b786bae1985c1331ec081c8cdb0eb86"} Apr 24 14:24:53.262673 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.262655 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hpqs6" event={"ID":"cfa67663-2a88-4a6b-8d07-8d08e626c4f4","Type":"ContainerStarted","Data":"d340966b4895d9e081eb9b6031626b3cf200b5c381283f497ce376a18130cdd4"} Apr 24 14:24:53.301392 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.301340 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-hpqs6" podStartSLOduration=3.479697393 podStartE2EDuration="20.301324177s" podCreationTimestamp="2026-04-24 14:24:33 +0000 UTC" firstStartedPulling="2026-04-24 14:24:35.790093691 +0000 UTC m=+3.228610546" lastFinishedPulling="2026-04-24 14:24:52.611720471 +0000 UTC 
m=+20.050237330" observedRunningTime="2026-04-24 14:24:53.300980455 +0000 UTC m=+20.739497317" watchObservedRunningTime="2026-04-24 14:24:53.301324177 +0000 UTC m=+20.739841037" Apr 24 14:24:53.310089 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.310008 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc" Apr 24 14:24:53.310751 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:53.310533 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:53.310928 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:53.310790 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret podName:e644e633-8a12-412b-b1a2-812f9fe784ed nodeName:}" failed. No retries permitted until 2026-04-24 14:25:09.310741267 +0000 UTC m=+36.749258311 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret") pod "global-pull-secret-syncer-hpfkc" (UID: "e644e633-8a12-412b-b1a2-812f9fe784ed") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:53.327503 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.327463 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dkjpq" podStartSLOduration=3.512479076 podStartE2EDuration="20.327448396s" podCreationTimestamp="2026-04-24 14:24:33 +0000 UTC" firstStartedPulling="2026-04-24 14:24:35.796761325 +0000 UTC m=+3.235278164" lastFinishedPulling="2026-04-24 14:24:52.611730641 +0000 UTC m=+20.050247484" observedRunningTime="2026-04-24 14:24:53.326990756 +0000 UTC m=+20.765507617" watchObservedRunningTime="2026-04-24 14:24:53.327448396 +0000 UTC m=+20.765965255" Apr 24 14:24:53.347221 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.347168 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zzdsp" podStartSLOduration=3.494995999 podStartE2EDuration="20.347151882s" podCreationTimestamp="2026-04-24 14:24:33 +0000 UTC" firstStartedPulling="2026-04-24 14:24:35.798000392 +0000 UTC m=+3.236517242" lastFinishedPulling="2026-04-24 14:24:52.650156287 +0000 UTC m=+20.088673125" observedRunningTime="2026-04-24 14:24:53.346845922 +0000 UTC m=+20.785362783" watchObservedRunningTime="2026-04-24 14:24:53.347151882 +0000 UTC m=+20.785668741" Apr 24 14:24:53.363365 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.363313 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-65ggn" podStartSLOduration=3.536757689 podStartE2EDuration="20.363297145s" podCreationTimestamp="2026-04-24 14:24:33 +0000 UTC" firstStartedPulling="2026-04-24 14:24:35.787180984 +0000 UTC m=+3.225697825" 
lastFinishedPulling="2026-04-24 14:24:52.613720439 +0000 UTC m=+20.052237281" observedRunningTime="2026-04-24 14:24:53.363146814 +0000 UTC m=+20.801663675" watchObservedRunningTime="2026-04-24 14:24:53.363297145 +0000 UTC m=+20.801814019" Apr 24 14:24:53.377614 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.377560 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6xfh9" podStartSLOduration=8.066251655 podStartE2EDuration="20.377545693s" podCreationTimestamp="2026-04-24 14:24:33 +0000 UTC" firstStartedPulling="2026-04-24 14:24:35.793350215 +0000 UTC m=+3.231867065" lastFinishedPulling="2026-04-24 14:24:48.104644266 +0000 UTC m=+15.543161103" observedRunningTime="2026-04-24 14:24:53.377085322 +0000 UTC m=+20.815602184" watchObservedRunningTime="2026-04-24 14:24:53.377545693 +0000 UTC m=+20.816062553" Apr 24 14:24:53.894807 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:53.894766 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 14:24:54.109483 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:54.109307 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T14:24:53.894787255Z","UUID":"43948b64-4c08-4c87-8938-6d43750b6840","Handler":null,"Name":"","Endpoint":""} Apr 24 14:24:54.111735 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:54.111701 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 14:24:54.111735 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:54.111733 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 14:24:54.266685 
ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:54.266424 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29" event={"ID":"691da68e-c7d6-471f-9dff-22a0097a806d","Type":"ContainerStarted","Data":"15c68aa89e1b948f6337a5599799fbd4015c2bca55650afd9c5f9d9d981bb493"} Apr 24 14:24:54.269432 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:54.269401 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" event={"ID":"01aa8393-c538-4d6b-a611-6016be7c4a85","Type":"ContainerStarted","Data":"be05b6f936923103a8379e66593cde8437851fd89b875ecad253e3ffac66c36a"} Apr 24 14:24:54.270893 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:54.270855 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h8gj8" event={"ID":"7b326316-4b11-4b19-9e26-40d9a7795d9b","Type":"ContainerStarted","Data":"46cfca4cd4b42985099af901cd6112e0d08d6c4023eae002a6af1ea8ce89fe57"} Apr 24 14:24:54.287074 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:54.287006 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-h8gj8" podStartSLOduration=4.4696038080000005 podStartE2EDuration="21.286985408s" podCreationTimestamp="2026-04-24 14:24:33 +0000 UTC" firstStartedPulling="2026-04-24 14:24:35.7943497 +0000 UTC m=+3.232866554" lastFinishedPulling="2026-04-24 14:24:52.611731309 +0000 UTC m=+20.050248154" observedRunningTime="2026-04-24 14:24:54.286201553 +0000 UTC m=+21.724718415" watchObservedRunningTime="2026-04-24 14:24:54.286985408 +0000 UTC m=+21.725502269" Apr 24 14:24:55.165569 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:55.165533 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc" Apr 24 14:24:55.165771 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:55.165533 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc" Apr 24 14:24:55.165771 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:55.165668 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed" Apr 24 14:24:55.165771 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:55.165533 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf" Apr 24 14:24:55.165771 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:55.165760 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514" Apr 24 14:24:55.165970 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:55.165851 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c" Apr 24 14:24:55.274747 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:55.274706 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29" event={"ID":"691da68e-c7d6-471f-9dff-22a0097a806d","Type":"ContainerStarted","Data":"b662370c95306cc60e481f76923508d055ac226a901032fe6d75cdc6fbeb5aea"} Apr 24 14:24:55.292244 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:55.292188 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g8f29" podStartSLOduration=3.306356299 podStartE2EDuration="22.292169497s" podCreationTimestamp="2026-04-24 14:24:33 +0000 UTC" firstStartedPulling="2026-04-24 14:24:35.800392933 +0000 UTC m=+3.238909773" lastFinishedPulling="2026-04-24 14:24:54.786206133 +0000 UTC m=+22.224722971" observedRunningTime="2026-04-24 14:24:55.291463387 +0000 UTC m=+22.729980272" watchObservedRunningTime="2026-04-24 14:24:55.292169497 +0000 UTC m=+22.730686359" Apr 24 14:24:56.279444 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:56.279403 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" event={"ID":"01aa8393-c538-4d6b-a611-6016be7c4a85","Type":"ContainerStarted","Data":"b116347b5b5447588ceebae5d16fff12311ab0514d3b43725d8962b92f5fdaa1"} Apr 24 14:24:57.166230 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:57.166198 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc" Apr 24 14:24:57.166230 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:57.166228 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf" Apr 24 14:24:57.166429 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:57.166237 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:57.166429 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:57.166307 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed"
Apr 24 14:24:57.166429 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:57.166393 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:24:57.166523 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:57.166452 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:24:57.817874 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:57.817670 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-hpqs6"
Apr 24 14:24:57.818440 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:57.818293 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-hpqs6"
Apr 24 14:24:58.284316 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:58.284285 2574 generic.go:358] "Generic (PLEG): container finished" podID="dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef" containerID="f99d5a8b43380b88dff0c7412d0b1d2f3958a343650e661b7671bc3a204f86c4" exitCode=0
Apr 24 14:24:58.284514 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:58.284365 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqgmq" event={"ID":"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef","Type":"ContainerDied","Data":"f99d5a8b43380b88dff0c7412d0b1d2f3958a343650e661b7671bc3a204f86c4"}
Apr 24 14:24:58.290756 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:58.290725 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" event={"ID":"01aa8393-c538-4d6b-a611-6016be7c4a85","Type":"ContainerStarted","Data":"c23907d6a5601224edd2a92d518e5c7e383b87a6df6e267df7cc3a29a27738bc"}
Apr 24 14:24:58.291020 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:58.290992 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-hpqs6"
Apr 24 14:24:58.291097 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:58.291034 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:58.291097 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:58.291047 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:58.291097 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:58.291059 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:58.291611 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:58.291593 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-hpqs6"
Apr 24 14:24:58.306404 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:58.306379 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:58.306522 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:58.306444 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:24:58.343837 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:58.343777 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf" podStartSLOduration=8.292784351 podStartE2EDuration="25.343759089s" podCreationTimestamp="2026-04-24 14:24:33 +0000 UTC" firstStartedPulling="2026-04-24 14:24:35.799695516 +0000 UTC m=+3.238212356" lastFinishedPulling="2026-04-24 14:24:52.850670242 +0000 UTC m=+20.289187094" observedRunningTime="2026-04-24 14:24:58.343178228 +0000 UTC m=+25.781695089" watchObservedRunningTime="2026-04-24 14:24:58.343759089 +0000 UTC m=+25.782275948"
Apr 24 14:24:59.165586 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:59.165546 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:59.165973 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:59.165546 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:59.165973 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:59.165662 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:24:59.165973 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:59.165546 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:59.165973 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:59.165736 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed"
Apr 24 14:24:59.165973 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:59.165833 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:24:59.531255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:59.530704 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hpfkc"]
Apr 24 14:24:59.531255 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:59.531087 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:24:59.531255 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:59.531226 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed"
Apr 24 14:24:59.534444 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:59.534398 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ncpbf"]
Apr 24 14:24:59.534570 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:59.534555 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:24:59.534699 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:59.534679 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:24:59.535093 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:59.535070 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-c5dxc"]
Apr 24 14:24:59.535165 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:24:59.535154 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:24:59.536912 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:24:59.535611 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:25:00.296281 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:00.296240 2574 generic.go:358] "Generic (PLEG): container finished" podID="dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef" containerID="ddb6161ac1d0d294d3c727e66c13e87d2d1b98b0cd9bd69ba8c20fc508dd3097" exitCode=0
Apr 24 14:25:00.296795 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:00.296318 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqgmq" event={"ID":"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef","Type":"ContainerDied","Data":"ddb6161ac1d0d294d3c727e66c13e87d2d1b98b0cd9bd69ba8c20fc508dd3097"}
Apr 24 14:25:01.165822 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:01.165783 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:25:01.165822 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:01.165827 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:25:01.166037 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:01.165930 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed"
Apr 24 14:25:01.166037 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:01.166020 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:25:01.166099 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:01.166067 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:25:01.166165 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:01.166148 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:25:02.302131 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:02.302087 2574 generic.go:358] "Generic (PLEG): container finished" podID="dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef" containerID="2ad9bf4a453f6f9bf1f5fa6b6ebf17ef781fbf151018b2d91e8eda86b9adc55a" exitCode=0
Apr 24 14:25:02.302705 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:02.302163 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqgmq" event={"ID":"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef","Type":"ContainerDied","Data":"2ad9bf4a453f6f9bf1f5fa6b6ebf17ef781fbf151018b2d91e8eda86b9adc55a"}
Apr 24 14:25:03.166419 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:03.166385 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:25:03.166601 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:03.166497 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:25:03.166601 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:03.166585 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:25:03.166721 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:03.166696 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed"
Apr 24 14:25:03.166776 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:03.166747 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:25:03.166828 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:03.166816 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:25:05.166128 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.166082 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:25:05.166606 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.166085 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:25:05.166606 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:05.166224 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c5dxc" podUID="27ab69b2-98a6-4382-8921-0b4c9485c514"
Apr 24 14:25:05.166606 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.166090 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:25:05.166606 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:05.166294 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hpfkc" podUID="e644e633-8a12-412b-b1a2-812f9fe784ed"
Apr 24 14:25:05.166606 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:05.166395 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:25:05.363430 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.363394 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-92.ec2.internal" event="NodeReady"
Apr 24 14:25:05.363726 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.363563 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 14:25:05.408545 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.408512 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wvb5j"]
Apr 24 14:25:05.430716 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.430628 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-t5b97"]
Apr 24 14:25:05.430868 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.430803 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:05.433270 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.433243 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 14:25:05.433464 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.433432 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7bxm2\""
Apr 24 14:25:05.433538 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.433530 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 14:25:05.458870 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.458836 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wvb5j"]
Apr 24 14:25:05.458870 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.458865 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t5b97"]
Apr 24 14:25:05.459073 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.458995 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:25:05.461184 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.461158 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 14:25:05.461312 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.461193 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xfc9t\""
Apr 24 14:25:05.461312 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.461274 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 14:25:05.461431 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.461318 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 14:25:05.499456 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.499425 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4fm2\" (UniqueName: \"kubernetes.io/projected/5625c3db-794f-4f6c-993d-1477fc0a38b8-kube-api-access-f4fm2\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:05.499629 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.499470 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5625c3db-794f-4f6c-993d-1477fc0a38b8-config-volume\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:05.499629 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.499541 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:05.499629 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.499603 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5625c3db-794f-4f6c-993d-1477fc0a38b8-tmp-dir\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:05.600956 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.600921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4fm2\" (UniqueName: \"kubernetes.io/projected/5625c3db-794f-4f6c-993d-1477fc0a38b8-kube-api-access-f4fm2\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:05.600956 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.600971 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5625c3db-794f-4f6c-993d-1477fc0a38b8-config-volume\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:05.601227 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.601001 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:05.601227 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.601023 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5625c3db-794f-4f6c-993d-1477fc0a38b8-tmp-dir\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:05.601227 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.601072 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:25:05.601227 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.601111 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nc64\" (UniqueName: \"kubernetes.io/projected/3b39242d-971a-47cb-9943-42846bc6d8b6-kube-api-access-6nc64\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:25:05.601423 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:05.601257 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:25:05.601423 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:05.601345 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls podName:5625c3db-794f-4f6c-993d-1477fc0a38b8 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:06.101322615 +0000 UTC m=+33.539839455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls") pod "dns-default-wvb5j" (UID: "5625c3db-794f-4f6c-993d-1477fc0a38b8") : secret "dns-default-metrics-tls" not found
Apr 24 14:25:05.601550 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.601533 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5625c3db-794f-4f6c-993d-1477fc0a38b8-tmp-dir\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:05.613312 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.613260 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5625c3db-794f-4f6c-993d-1477fc0a38b8-config-volume\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:05.613498 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.613476 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4fm2\" (UniqueName: \"kubernetes.io/projected/5625c3db-794f-4f6c-993d-1477fc0a38b8-kube-api-access-f4fm2\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:05.702533 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.702440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:25:05.702533 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.702505 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nc64\" (UniqueName: \"kubernetes.io/projected/3b39242d-971a-47cb-9943-42846bc6d8b6-kube-api-access-6nc64\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:25:05.702760 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:05.702617 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:25:05.702760 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:05.702694 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert podName:3b39242d-971a-47cb-9943-42846bc6d8b6 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:06.202673534 +0000 UTC m=+33.641190387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert") pod "ingress-canary-t5b97" (UID: "3b39242d-971a-47cb-9943-42846bc6d8b6") : secret "canary-serving-cert" not found
Apr 24 14:25:05.716220 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:05.715973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nc64\" (UniqueName: \"kubernetes.io/projected/3b39242d-971a-47cb-9943-42846bc6d8b6-kube-api-access-6nc64\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:25:06.105510 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:06.105457 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:06.105697 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:06.105648 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:25:06.105769 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:06.105728 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls podName:5625c3db-794f-4f6c-993d-1477fc0a38b8 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:07.10570767 +0000 UTC m=+34.544224519 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls") pod "dns-default-wvb5j" (UID: "5625c3db-794f-4f6c-993d-1477fc0a38b8") : secret "dns-default-metrics-tls" not found
Apr 24 14:25:06.206964 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:06.206919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:25:06.207376 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:06.207070 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:25:06.207376 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:06.207164 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert podName:3b39242d-971a-47cb-9943-42846bc6d8b6 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:07.20714245 +0000 UTC m=+34.645659296 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert") pod "ingress-canary-t5b97" (UID: "3b39242d-971a-47cb-9943-42846bc6d8b6") : secret "canary-serving-cert" not found
Apr 24 14:25:06.811531 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:06.811486 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:25:06.811728 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:06.811659 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:25:06.811788 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:06.811743 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs podName:bee39eb9-c473-4f55-a88c-427f97349f6c nodeName:}" failed. No retries permitted until 2026-04-24 14:25:38.811722572 +0000 UTC m=+66.250239440 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs") pod "network-metrics-daemon-ncpbf" (UID: "bee39eb9-c473-4f55-a88c-427f97349f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:25:06.912006 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:06.911957 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j49gn\" (UniqueName: \"kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn\") pod \"network-check-target-c5dxc\" (UID: \"27ab69b2-98a6-4382-8921-0b4c9485c514\") " pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:25:06.912205 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:06.912150 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:25:06.912205 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:06.912175 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:25:06.912205 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:06.912188 2574 projected.go:194] Error preparing data for projected volume kube-api-access-j49gn for pod openshift-network-diagnostics/network-check-target-c5dxc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:25:06.912384 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:06.912252 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn podName:27ab69b2-98a6-4382-8921-0b4c9485c514 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:38.912230113 +0000 UTC m=+66.350746960 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-j49gn" (UniqueName: "kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn") pod "network-check-target-c5dxc" (UID: "27ab69b2-98a6-4382-8921-0b4c9485c514") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:25:07.114013 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:07.113924 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:07.114194 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:07.114093 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:25:07.114194 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:07.114171 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls podName:5625c3db-794f-4f6c-993d-1477fc0a38b8 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:09.114150659 +0000 UTC m=+36.552667520 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls") pod "dns-default-wvb5j" (UID: "5625c3db-794f-4f6c-993d-1477fc0a38b8") : secret "dns-default-metrics-tls" not found
Apr 24 14:25:07.166065 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:07.166024 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:25:07.166248 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:07.166167 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:25:07.166311 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:07.166296 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc"
Apr 24 14:25:07.168951 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:07.168927 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 14:25:07.169191 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:07.169169 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 14:25:07.169191 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:07.169191 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 14:25:07.169398 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:07.169382 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qh9cm\""
Apr 24 14:25:07.169567 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:07.169476 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-crqvq\""
Apr 24 14:25:07.169567 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:07.169480 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 14:25:07.215031 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:07.214998 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:25:07.215441 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:07.215134 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:25:07.215441 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:07.215191 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert podName:3b39242d-971a-47cb-9943-42846bc6d8b6 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:09.21517765 +0000 UTC m=+36.653694488 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert") pod "ingress-canary-t5b97" (UID: "3b39242d-971a-47cb-9943-42846bc6d8b6") : secret "canary-serving-cert" not found
Apr 24 14:25:09.129956 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:09.129918 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:09.130274 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:09.130050 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:25:09.130274 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:09.130117 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls podName:5625c3db-794f-4f6c-993d-1477fc0a38b8 nodeName:}" failed.
No retries permitted until 2026-04-24 14:25:13.130098485 +0000 UTC m=+40.568615327 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls") pod "dns-default-wvb5j" (UID: "5625c3db-794f-4f6c-993d-1477fc0a38b8") : secret "dns-default-metrics-tls" not found Apr 24 14:25:09.230689 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:09.230646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97" Apr 24 14:25:09.230868 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:09.230800 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:09.230942 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:09.230870 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert podName:3b39242d-971a-47cb-9943-42846bc6d8b6 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:13.230853763 +0000 UTC m=+40.669370602 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert") pod "ingress-canary-t5b97" (UID: "3b39242d-971a-47cb-9943-42846bc6d8b6") : secret "canary-serving-cert" not found Apr 24 14:25:09.319077 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:09.319043 2574 generic.go:358] "Generic (PLEG): container finished" podID="dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef" containerID="3f4adb66f8f42a05513c6ffc8a69f0ac6688b00358d7d47cd1dece081befc851" exitCode=0 Apr 24 14:25:09.319231 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:09.319089 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqgmq" event={"ID":"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef","Type":"ContainerDied","Data":"3f4adb66f8f42a05513c6ffc8a69f0ac6688b00358d7d47cd1dece081befc851"} Apr 24 14:25:09.331273 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:09.331242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc" Apr 24 14:25:09.334004 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:09.333981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e644e633-8a12-412b-b1a2-812f9fe784ed-original-pull-secret\") pod \"global-pull-secret-syncer-hpfkc\" (UID: \"e644e633-8a12-412b-b1a2-812f9fe784ed\") " pod="kube-system/global-pull-secret-syncer-hpfkc" Apr 24 14:25:09.591162 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:09.591122 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hpfkc" Apr 24 14:25:09.751073 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:09.751017 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hpfkc"] Apr 24 14:25:09.755593 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:25:09.755563 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode644e633_8a12_412b_b1a2_812f9fe784ed.slice/crio-1806564fb875fdd200394d5e207e6a1def465a7d7b32be67f96c3d1c8b26d60d WatchSource:0}: Error finding container 1806564fb875fdd200394d5e207e6a1def465a7d7b32be67f96c3d1c8b26d60d: Status 404 returned error can't find the container with id 1806564fb875fdd200394d5e207e6a1def465a7d7b32be67f96c3d1c8b26d60d Apr 24 14:25:10.323474 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:10.323441 2574 generic.go:358] "Generic (PLEG): container finished" podID="dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef" containerID="94e73d40bb1949e0039e303286695760a416d1fb6afb8e2e8c72412fa121427b" exitCode=0 Apr 24 14:25:10.323997 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:10.323502 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqgmq" event={"ID":"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef","Type":"ContainerDied","Data":"94e73d40bb1949e0039e303286695760a416d1fb6afb8e2e8c72412fa121427b"} Apr 24 14:25:10.324746 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:10.324723 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hpfkc" event={"ID":"e644e633-8a12-412b-b1a2-812f9fe784ed","Type":"ContainerStarted","Data":"1806564fb875fdd200394d5e207e6a1def465a7d7b32be67f96c3d1c8b26d60d"} Apr 24 14:25:11.330850 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:11.330616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqgmq" 
event={"ID":"dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef","Type":"ContainerStarted","Data":"1600dc024052f67b4831aac2352b9d730fd2a191310328b9612c094c8d246b13"} Apr 24 14:25:11.353033 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:11.352978 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wqgmq" podStartSLOduration=5.9206054980000005 podStartE2EDuration="38.352945675s" podCreationTimestamp="2026-04-24 14:24:33 +0000 UTC" firstStartedPulling="2026-04-24 14:24:35.799023864 +0000 UTC m=+3.237540715" lastFinishedPulling="2026-04-24 14:25:08.231364054 +0000 UTC m=+35.669880892" observedRunningTime="2026-04-24 14:25:11.351941652 +0000 UTC m=+38.790458514" watchObservedRunningTime="2026-04-24 14:25:11.352945675 +0000 UTC m=+38.791462539" Apr 24 14:25:13.159399 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.159366 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j" Apr 24 14:25:13.159988 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:13.159511 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:13.159988 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:13.159563 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls podName:5625c3db-794f-4f6c-993d-1477fc0a38b8 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:21.159548478 +0000 UTC m=+48.598065316 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls") pod "dns-default-wvb5j" (UID: "5625c3db-794f-4f6c-993d-1477fc0a38b8") : secret "dns-default-metrics-tls" not found Apr 24 14:25:13.228427 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.228391 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f"] Apr 24 14:25:13.231567 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.231537 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n"] Apr 24 14:25:13.231723 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.231642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" Apr 24 14:25:13.234573 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.234536 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 14:25:13.234753 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.234736 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 14:25:13.234829 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.234809 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-qgc6f\"" Apr 24 14:25:13.234963 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.234946 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 14:25:13.236184 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.236163 2574 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 14:25:13.238590 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.238570 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.239801 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.239779 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f"] Apr 24 14:25:13.240817 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.240596 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 14:25:13.240817 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.240638 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 14:25:13.240817 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.240654 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 14:25:13.240817 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.240803 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 14:25:13.246387 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.246365 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n"] Apr 24 14:25:13.260194 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.259837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97" Apr 24 14:25:13.260194 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:13.260014 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:13.260194 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:13.260075 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert podName:3b39242d-971a-47cb-9943-42846bc6d8b6 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:21.260056278 +0000 UTC m=+48.698573135 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert") pod "ingress-canary-t5b97" (UID: "3b39242d-971a-47cb-9943-42846bc6d8b6") : secret "canary-serving-cert" not found Apr 24 14:25:13.361059 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.361035 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c398236b-847b-4652-9347-7e74de00123f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.361144 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.361090 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c398236b-847b-4652-9347-7e74de00123f-hub\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 
14:25:13.361180 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.361143 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c398236b-847b-4652-9347-7e74de00123f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.361217 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.361203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ffeb0782-40d4-4ac7-82bd-0a3fdf094363-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f\" (UID: \"ffeb0782-40d4-4ac7-82bd-0a3fdf094363\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" Apr 24 14:25:13.361249 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.361230 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c398236b-847b-4652-9347-7e74de00123f-ca\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.361281 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.361266 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6fmx\" (UniqueName: \"kubernetes.io/projected/c398236b-847b-4652-9347-7e74de00123f-kube-api-access-p6fmx\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.361314 ip-10-0-143-92 kubenswrapper[2574]: I0424 
14:25:13.361298 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd56t\" (UniqueName: \"kubernetes.io/projected/ffeb0782-40d4-4ac7-82bd-0a3fdf094363-kube-api-access-jd56t\") pod \"managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f\" (UID: \"ffeb0782-40d4-4ac7-82bd-0a3fdf094363\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" Apr 24 14:25:13.361348 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.361327 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c398236b-847b-4652-9347-7e74de00123f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.462639 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.462616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6fmx\" (UniqueName: \"kubernetes.io/projected/c398236b-847b-4652-9347-7e74de00123f-kube-api-access-p6fmx\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.462752 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.462662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd56t\" (UniqueName: \"kubernetes.io/projected/ffeb0782-40d4-4ac7-82bd-0a3fdf094363-kube-api-access-jd56t\") pod \"managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f\" (UID: \"ffeb0782-40d4-4ac7-82bd-0a3fdf094363\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" Apr 24 14:25:13.462899 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.462860 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c398236b-847b-4652-9347-7e74de00123f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.462960 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.462921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c398236b-847b-4652-9347-7e74de00123f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.463012 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.462990 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c398236b-847b-4652-9347-7e74de00123f-hub\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.463059 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.463020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c398236b-847b-4652-9347-7e74de00123f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.463114 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.463077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/ffeb0782-40d4-4ac7-82bd-0a3fdf094363-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f\" (UID: \"ffeb0782-40d4-4ac7-82bd-0a3fdf094363\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" Apr 24 14:25:13.463114 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.463107 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c398236b-847b-4652-9347-7e74de00123f-ca\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.464098 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.464064 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c398236b-847b-4652-9347-7e74de00123f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.466962 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.466923 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c398236b-847b-4652-9347-7e74de00123f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.467068 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.466999 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ffeb0782-40d4-4ac7-82bd-0a3fdf094363-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f\" (UID: 
\"ffeb0782-40d4-4ac7-82bd-0a3fdf094363\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" Apr 24 14:25:13.467068 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.467015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c398236b-847b-4652-9347-7e74de00123f-ca\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.467189 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.467097 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c398236b-847b-4652-9347-7e74de00123f-hub\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.467263 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.467239 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c398236b-847b-4652-9347-7e74de00123f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.471144 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.471116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd56t\" (UniqueName: \"kubernetes.io/projected/ffeb0782-40d4-4ac7-82bd-0a3fdf094363-kube-api-access-jd56t\") pod \"managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f\" (UID: \"ffeb0782-40d4-4ac7-82bd-0a3fdf094363\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" Apr 24 14:25:13.471611 ip-10-0-143-92 kubenswrapper[2574]: I0424 
14:25:13.471589 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6fmx\" (UniqueName: \"kubernetes.io/projected/c398236b-847b-4652-9347-7e74de00123f-kube-api-access-p6fmx\") pod \"cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n\" (UID: \"c398236b-847b-4652-9347-7e74de00123f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.557092 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.557052 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" Apr 24 14:25:13.563937 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.563906 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:25:13.717348 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.717273 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f"] Apr 24 14:25:13.720437 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:25:13.720409 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffeb0782_40d4_4ac7_82bd_0a3fdf094363.slice/crio-6affd31768967e2607d78a49c391440a51bb8996cd4e27b38a391b53a1b576da WatchSource:0}: Error finding container 6affd31768967e2607d78a49c391440a51bb8996cd4e27b38a391b53a1b576da: Status 404 returned error can't find the container with id 6affd31768967e2607d78a49c391440a51bb8996cd4e27b38a391b53a1b576da Apr 24 14:25:13.727296 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:13.727268 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n"] Apr 24 14:25:13.731280 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:25:13.731257 2574 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc398236b_847b_4652_9347_7e74de00123f.slice/crio-1d9ebeecec659c5ec5b46d04ef0bb1bd073d2bb1861903f6e03afe35339f09ac WatchSource:0}: Error finding container 1d9ebeecec659c5ec5b46d04ef0bb1bd073d2bb1861903f6e03afe35339f09ac: Status 404 returned error can't find the container with id 1d9ebeecec659c5ec5b46d04ef0bb1bd073d2bb1861903f6e03afe35339f09ac
Apr 24 14:25:14.339405 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:14.339359 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" event={"ID":"ffeb0782-40d4-4ac7-82bd-0a3fdf094363","Type":"ContainerStarted","Data":"6affd31768967e2607d78a49c391440a51bb8996cd4e27b38a391b53a1b576da"}
Apr 24 14:25:14.342527 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:14.342490 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hpfkc" event={"ID":"e644e633-8a12-412b-b1a2-812f9fe784ed","Type":"ContainerStarted","Data":"c8882b49684d727474112aed2aad9566f7ede3bd33a8b82263e70a568467171a"}
Apr 24 14:25:14.345249 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:14.345189 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" event={"ID":"c398236b-847b-4652-9347-7e74de00123f","Type":"ContainerStarted","Data":"1d9ebeecec659c5ec5b46d04ef0bb1bd073d2bb1861903f6e03afe35339f09ac"}
Apr 24 14:25:17.353163 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:17.353123 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" event={"ID":"c398236b-847b-4652-9347-7e74de00123f","Type":"ContainerStarted","Data":"6e82cd9a1176e321c576c0046be8fa85a264f0e4c377793ec5ea3fe3a0b67f84"}
Apr 24 14:25:17.354429 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:17.354406 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" event={"ID":"ffeb0782-40d4-4ac7-82bd-0a3fdf094363","Type":"ContainerStarted","Data":"199a692fafbb27bf00aba07ea02baf49714cc86f8eef26495b32da1676e57e65"}
Apr 24 14:25:17.369574 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:17.369515 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" podStartSLOduration=1.25631499 podStartE2EDuration="4.369498799s" podCreationTimestamp="2026-04-24 14:25:13 +0000 UTC" firstStartedPulling="2026-04-24 14:25:13.7223758 +0000 UTC m=+41.160892638" lastFinishedPulling="2026-04-24 14:25:16.835559605 +0000 UTC m=+44.274076447" observedRunningTime="2026-04-24 14:25:17.369121292 +0000 UTC m=+44.807638152" watchObservedRunningTime="2026-04-24 14:25:17.369498799 +0000 UTC m=+44.808015659"
Apr 24 14:25:17.369753 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:17.369622 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hpfkc" podStartSLOduration=36.774257286 podStartE2EDuration="40.369618221s" podCreationTimestamp="2026-04-24 14:24:37 +0000 UTC" firstStartedPulling="2026-04-24 14:25:09.757495482 +0000 UTC m=+37.196012320" lastFinishedPulling="2026-04-24 14:25:13.352856401 +0000 UTC m=+40.791373255" observedRunningTime="2026-04-24 14:25:14.358601117 +0000 UTC m=+41.797117978" watchObservedRunningTime="2026-04-24 14:25:17.369618221 +0000 UTC m=+44.808135081"
Apr 24 14:25:19.360343 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:19.360246 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" event={"ID":"c398236b-847b-4652-9347-7e74de00123f","Type":"ContainerStarted","Data":"39ccf352a3a63509f933cf072f116631b633c81ac6b32c4d17949db108a5edb5"}
Apr 24 14:25:19.360343 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:19.360285 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" event={"ID":"c398236b-847b-4652-9347-7e74de00123f","Type":"ContainerStarted","Data":"6146a51bed7c2c0205b612164398fa34c5606df137dc569db3dd7b4142f3c696"}
Apr 24 14:25:19.378739 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:19.378684 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" podStartSLOduration=1.030383907 podStartE2EDuration="6.378666663s" podCreationTimestamp="2026-04-24 14:25:13 +0000 UTC" firstStartedPulling="2026-04-24 14:25:13.732870066 +0000 UTC m=+41.171386908" lastFinishedPulling="2026-04-24 14:25:19.08115281 +0000 UTC m=+46.519669664" observedRunningTime="2026-04-24 14:25:19.378023999 +0000 UTC m=+46.816540860" watchObservedRunningTime="2026-04-24 14:25:19.378666663 +0000 UTC m=+46.817183523"
Apr 24 14:25:21.228020 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:21.227968 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:21.228463 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:21.228322 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:25:21.228463 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:21.228429 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls podName:5625c3db-794f-4f6c-993d-1477fc0a38b8 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:37.228404832 +0000 UTC m=+64.666921677 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls") pod "dns-default-wvb5j" (UID: "5625c3db-794f-4f6c-993d-1477fc0a38b8") : secret "dns-default-metrics-tls" not found
Apr 24 14:25:21.329020 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:21.328965 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:25:21.329194 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:21.329130 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:25:21.329240 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:21.329207 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert podName:3b39242d-971a-47cb-9943-42846bc6d8b6 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:37.329189914 +0000 UTC m=+64.767706758 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert") pod "ingress-canary-t5b97" (UID: "3b39242d-971a-47cb-9943-42846bc6d8b6") : secret "canary-serving-cert" not found
Apr 24 14:25:30.313277 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:30.313246 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hb7mf"
Apr 24 14:25:37.241656 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:37.241609 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:25:37.242147 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:37.241766 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:25:37.242147 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:37.241841 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls podName:5625c3db-794f-4f6c-993d-1477fc0a38b8 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:09.241823392 +0000 UTC m=+96.680340230 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls") pod "dns-default-wvb5j" (UID: "5625c3db-794f-4f6c-993d-1477fc0a38b8") : secret "dns-default-metrics-tls" not found
Apr 24 14:25:37.342945 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:37.342896 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:25:37.343123 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:37.342983 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:25:37.343123 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:37.343058 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert podName:3b39242d-971a-47cb-9943-42846bc6d8b6 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:09.343041994 +0000 UTC m=+96.781558831 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert") pod "ingress-canary-t5b97" (UID: "3b39242d-971a-47cb-9943-42846bc6d8b6") : secret "canary-serving-cert" not found
Apr 24 14:25:38.852444 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:38.852400 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:25:38.854659 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:38.854635 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 14:25:38.863172 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:38.863142 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 14:25:38.863250 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:25:38.863243 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs podName:bee39eb9-c473-4f55-a88c-427f97349f6c nodeName:}" failed. No retries permitted until 2026-04-24 14:26:42.863220261 +0000 UTC m=+130.301737099 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs") pod "network-metrics-daemon-ncpbf" (UID: "bee39eb9-c473-4f55-a88c-427f97349f6c") : secret "metrics-daemon-secret" not found
Apr 24 14:25:38.953284 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:38.953241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j49gn\" (UniqueName: \"kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn\") pod \"network-check-target-c5dxc\" (UID: \"27ab69b2-98a6-4382-8921-0b4c9485c514\") " pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:25:38.955462 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:38.955443 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 14:25:38.965761 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:38.965735 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 14:25:38.976587 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:38.976565 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j49gn\" (UniqueName: \"kubernetes.io/projected/27ab69b2-98a6-4382-8921-0b4c9485c514-kube-api-access-j49gn\") pod \"network-check-target-c5dxc\" (UID: \"27ab69b2-98a6-4382-8921-0b4c9485c514\") " pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:25:38.981303 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:38.981282 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qh9cm\""
Apr 24 14:25:38.989945 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:38.989923 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:25:39.105089 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:39.105010 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-c5dxc"]
Apr 24 14:25:39.114201 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:25:39.114173 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27ab69b2_98a6_4382_8921_0b4c9485c514.slice/crio-5381d135dda8e289781dea9d1abfa52cf1c0ca297bd3af942040374d4e503f9b WatchSource:0}: Error finding container 5381d135dda8e289781dea9d1abfa52cf1c0ca297bd3af942040374d4e503f9b: Status 404 returned error can't find the container with id 5381d135dda8e289781dea9d1abfa52cf1c0ca297bd3af942040374d4e503f9b
Apr 24 14:25:39.404031 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:39.403952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-c5dxc" event={"ID":"27ab69b2-98a6-4382-8921-0b4c9485c514","Type":"ContainerStarted","Data":"5381d135dda8e289781dea9d1abfa52cf1c0ca297bd3af942040374d4e503f9b"}
Apr 24 14:25:42.412592 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:42.412552 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-c5dxc" event={"ID":"27ab69b2-98a6-4382-8921-0b4c9485c514","Type":"ContainerStarted","Data":"47f3bd686dfda8dbc6b6e6acd73dfee13c1cd520562fd774ae58308312b47e48"}
Apr 24 14:25:42.413072 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:42.412784 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:25:42.428073 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:25:42.428013 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-c5dxc" podStartSLOduration=66.678752755 podStartE2EDuration="1m9.427996697s" podCreationTimestamp="2026-04-24 14:24:33 +0000 UTC" firstStartedPulling="2026-04-24 14:25:39.116573335 +0000 UTC m=+66.555090175" lastFinishedPulling="2026-04-24 14:25:41.865817269 +0000 UTC m=+69.304334117" observedRunningTime="2026-04-24 14:25:42.427824499 +0000 UTC m=+69.866341358" watchObservedRunningTime="2026-04-24 14:25:42.427996697 +0000 UTC m=+69.866513570"
Apr 24 14:26:09.269084 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:26:09.269038 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:26:09.269616 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:26:09.269233 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:26:09.269616 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:26:09.269323 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls podName:5625c3db-794f-4f6c-993d-1477fc0a38b8 nodeName:}" failed. No retries permitted until 2026-04-24 14:27:13.269301813 +0000 UTC m=+160.707818655 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls") pod "dns-default-wvb5j" (UID: "5625c3db-794f-4f6c-993d-1477fc0a38b8") : secret "dns-default-metrics-tls" not found
Apr 24 14:26:09.370310 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:26:09.370184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:26:09.370452 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:26:09.370335 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:26:09.370452 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:26:09.370408 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert podName:3b39242d-971a-47cb-9943-42846bc6d8b6 nodeName:}" failed. No retries permitted until 2026-04-24 14:27:13.370392867 +0000 UTC m=+160.808909708 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert") pod "ingress-canary-t5b97" (UID: "3b39242d-971a-47cb-9943-42846bc6d8b6") : secret "canary-serving-cert" not found
Apr 24 14:26:13.418267 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:26:13.418226 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-c5dxc"
Apr 24 14:26:42.891671 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:26:42.891636 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dkjpq_267d3672-b74b-4241-80b9-1467f130ddd8/dns-node-resolver/0.log"
Apr 24 14:26:42.913980 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:26:42.913946 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:26:42.914106 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:26:42.914086 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 14:26:42.914165 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:26:42.914155 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs podName:bee39eb9-c473-4f55-a88c-427f97349f6c nodeName:}" failed. No retries permitted until 2026-04-24 14:28:44.914138861 +0000 UTC m=+252.352655702 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs") pod "network-metrics-daemon-ncpbf" (UID: "bee39eb9-c473-4f55-a88c-427f97349f6c") : secret "metrics-daemon-secret" not found
Apr 24 14:26:43.690652 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:26:43.690623 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6xfh9_96a778f3-6185-45b9-803d-403e973b65b9/node-ca/0.log"
Apr 24 14:27:02.703298 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.703267 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-q2c42"]
Apr 24 14:27:02.706352 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.706335 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.708840 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.708812 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kg8l8\""
Apr 24 14:27:02.708840 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.708835 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 14:27:02.709051 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.708819 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 14:27:02.709051 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.708812 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 14:27:02.709051 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.708819 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 14:27:02.716163 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.716139 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q2c42"]
Apr 24 14:27:02.759991 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.759960 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0a903af0-61c5-40c5-b41e-988f215668f9-data-volume\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.760151 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.760000 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0a903af0-61c5-40c5-b41e-988f215668f9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.760151 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.760019 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hlvg\" (UniqueName: \"kubernetes.io/projected/0a903af0-61c5-40c5-b41e-988f215668f9-kube-api-access-7hlvg\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.760151 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.760069 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0a903af0-61c5-40c5-b41e-988f215668f9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.760151 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.760101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0a903af0-61c5-40c5-b41e-988f215668f9-crio-socket\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.860680 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.860644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0a903af0-61c5-40c5-b41e-988f215668f9-data-volume\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.860680 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.860691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0a903af0-61c5-40c5-b41e-988f215668f9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.860928 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.860710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hlvg\" (UniqueName: \"kubernetes.io/projected/0a903af0-61c5-40c5-b41e-988f215668f9-kube-api-access-7hlvg\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.860928 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.860752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0a903af0-61c5-40c5-b41e-988f215668f9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.860928 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.860790 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0a903af0-61c5-40c5-b41e-988f215668f9-crio-socket\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.860928 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.860914 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0a903af0-61c5-40c5-b41e-988f215668f9-crio-socket\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.861699 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.861675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0a903af0-61c5-40c5-b41e-988f215668f9-data-volume\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.861874 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.861798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0a903af0-61c5-40c5-b41e-988f215668f9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.863141 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.863123 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0a903af0-61c5-40c5-b41e-988f215668f9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:02.871826 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:02.871800 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hlvg\" (UniqueName: \"kubernetes.io/projected/0a903af0-61c5-40c5-b41e-988f215668f9-kube-api-access-7hlvg\") pod \"insights-runtime-extractor-q2c42\" (UID: \"0a903af0-61c5-40c5-b41e-988f215668f9\") " pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:03.014914 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:03.014808 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q2c42"
Apr 24 14:27:03.149402 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:03.149370 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q2c42"]
Apr 24 14:27:03.153690 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:27:03.153655 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a903af0_61c5_40c5_b41e_988f215668f9.slice/crio-696af302e89174e08edf1cb96505d1bc087e44f154c6598ec16473cbd88fd588 WatchSource:0}: Error finding container 696af302e89174e08edf1cb96505d1bc087e44f154c6598ec16473cbd88fd588: Status 404 returned error can't find the container with id 696af302e89174e08edf1cb96505d1bc087e44f154c6598ec16473cbd88fd588
Apr 24 14:27:03.612791 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:03.612754 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q2c42" event={"ID":"0a903af0-61c5-40c5-b41e-988f215668f9","Type":"ContainerStarted","Data":"210ba611955fad8ad6127b088eb2c6da07d599bcaca4c3448dabde3b803fe6d5"}
Apr 24 14:27:03.612791 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:03.612794 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q2c42" event={"ID":"0a903af0-61c5-40c5-b41e-988f215668f9","Type":"ContainerStarted","Data":"696af302e89174e08edf1cb96505d1bc087e44f154c6598ec16473cbd88fd588"}
Apr 24 14:27:04.618011 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:04.617968 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q2c42" event={"ID":"0a903af0-61c5-40c5-b41e-988f215668f9","Type":"ContainerStarted","Data":"7580c2eeef283540a27f3e6dcfa4ebf4d02bcd6a2fb012742f2098dc10fd9718"}
Apr 24 14:27:05.623450 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:05.623410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q2c42" event={"ID":"0a903af0-61c5-40c5-b41e-988f215668f9","Type":"ContainerStarted","Data":"9cf142a96a34b6385e134647c5e980f32e3b93efd27953727352d4391765a674"}
Apr 24 14:27:05.642686 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:05.642635 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-q2c42" podStartSLOduration=1.739008529 podStartE2EDuration="3.642619716s" podCreationTimestamp="2026-04-24 14:27:02 +0000 UTC" firstStartedPulling="2026-04-24 14:27:03.216890242 +0000 UTC m=+150.655407095" lastFinishedPulling="2026-04-24 14:27:05.120501441 +0000 UTC m=+152.559018282" observedRunningTime="2026-04-24 14:27:05.642174904 +0000 UTC m=+153.080691787" watchObservedRunningTime="2026-04-24 14:27:05.642619716 +0000 UTC m=+153.081136644"
Apr 24 14:27:08.442622 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:27:08.442563 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wvb5j" podUID="5625c3db-794f-4f6c-993d-1477fc0a38b8"
Apr 24 14:27:08.468777 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:27:08.468744 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-t5b97" podUID="3b39242d-971a-47cb-9943-42846bc6d8b6"
Apr 24 14:27:08.629162 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:08.629128 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:27:10.186663 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:27:10.186619 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-ncpbf" podUID="bee39eb9-c473-4f55-a88c-427f97349f6c"
Apr 24 14:27:10.211474 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:10.211439 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42"]
Apr 24 14:27:10.214324 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:10.214302 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42"
Apr 24 14:27:10.216586 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:10.216567 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 24 14:27:10.216690 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:10.216572 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-z9kgf\""
Apr 24 14:27:10.222755 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:10.222732 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42"]
Apr 24 14:27:10.314425 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:10.314385 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/be9905d6-7fac-4542-8b90-4622fea9cffa-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vbs42\" (UID: \"be9905d6-7fac-4542-8b90-4622fea9cffa\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42"
Apr 24 14:27:10.415465 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:10.415392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/be9905d6-7fac-4542-8b90-4622fea9cffa-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vbs42\" (UID: \"be9905d6-7fac-4542-8b90-4622fea9cffa\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42"
Apr 24 14:27:10.417754 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:10.417732 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/be9905d6-7fac-4542-8b90-4622fea9cffa-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vbs42\" (UID: \"be9905d6-7fac-4542-8b90-4622fea9cffa\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42"
Apr 24 14:27:10.523079 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:10.522976 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42"
Apr 24 14:27:10.638995 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:10.638965 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42"]
Apr 24 14:27:10.643369 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:27:10.643340 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe9905d6_7fac_4542_8b90_4622fea9cffa.slice/crio-3fa76d4b420401b93c484a38730533aae961c447e3aa8ea260a24acbb44cd38c WatchSource:0}: Error finding container 3fa76d4b420401b93c484a38730533aae961c447e3aa8ea260a24acbb44cd38c: Status 404 returned error can't find the container with id 3fa76d4b420401b93c484a38730533aae961c447e3aa8ea260a24acbb44cd38c
Apr 24 14:27:11.636301 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:11.636265 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42" event={"ID":"be9905d6-7fac-4542-8b90-4622fea9cffa","Type":"ContainerStarted","Data":"3fa76d4b420401b93c484a38730533aae961c447e3aa8ea260a24acbb44cd38c"}
Apr 24 14:27:12.640019 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:12.639983 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42" event={"ID":"be9905d6-7fac-4542-8b90-4622fea9cffa","Type":"ContainerStarted","Data":"af742416f20eb5d5702ab4ae913beb178a539530b5a82964f12ec4ed086b1be1"}
Apr 24 14:27:12.640488 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:12.640188 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42"
Apr 24 14:27:12.644781 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:12.644756 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42"
Apr 24 14:27:12.655099 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:12.655042 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vbs42" podStartSLOduration=1.648444664 podStartE2EDuration="2.655025918s" podCreationTimestamp="2026-04-24 14:27:10 +0000 UTC" firstStartedPulling="2026-04-24 14:27:10.645077911 +0000 UTC m=+158.083594749" lastFinishedPulling="2026-04-24 14:27:11.651659151 +0000 UTC m=+159.090176003" observedRunningTime="2026-04-24 14:27:12.653876713 +0000 UTC m=+160.092393573" watchObservedRunningTime="2026-04-24 14:27:12.655025918 +0000 UTC m=+160.093542809"
Apr 24 14:27:13.339471 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:13.339415 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:27:13.341776 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:13.341743 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5625c3db-794f-4f6c-993d-1477fc0a38b8-metrics-tls\") pod \"dns-default-wvb5j\" (UID: \"5625c3db-794f-4f6c-993d-1477fc0a38b8\") " pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:27:13.431457 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:13.431428 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7bxm2\""
Apr 24 14:27:13.440260 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:13.440228 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:27:13.440429 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:13.440408 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:27:13.443541 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:13.443517 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b39242d-971a-47cb-9943-42846bc6d8b6-cert\") pod \"ingress-canary-t5b97\" (UID: \"3b39242d-971a-47cb-9943-42846bc6d8b6\") " pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:27:13.557033 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:13.557000 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wvb5j"]
Apr 24 14:27:13.560522 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:27:13.560488 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5625c3db_794f_4f6c_993d_1477fc0a38b8.slice/crio-3ad537b996bf035f8f345198962399d5078e67455494b698bec28101c8c10034 WatchSource:0}: Error finding container 3ad537b996bf035f8f345198962399d5078e67455494b698bec28101c8c10034: Status 404 returned error can't find the container with id 3ad537b996bf035f8f345198962399d5078e67455494b698bec28101c8c10034
Apr 24 14:27:13.643691
ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:13.643596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wvb5j" event={"ID":"5625c3db-794f-4f6c-993d-1477fc0a38b8","Type":"ContainerStarted","Data":"3ad537b996bf035f8f345198962399d5078e67455494b698bec28101c8c10034"} Apr 24 14:27:15.649989 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:15.649951 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wvb5j" event={"ID":"5625c3db-794f-4f6c-993d-1477fc0a38b8","Type":"ContainerStarted","Data":"c13dd968e09eb93d6324fd198b77ebbbd385661b355da99bc043770b6d9d83e1"} Apr 24 14:27:15.649989 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:15.649992 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wvb5j" event={"ID":"5625c3db-794f-4f6c-993d-1477fc0a38b8","Type":"ContainerStarted","Data":"a1f1728d4bb6d5012c96d3a68163fda7f794c5cec3b890d376fd713c5bbf55d9"} Apr 24 14:27:15.650424 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:15.650108 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wvb5j" Apr 24 14:27:15.667578 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:15.667529 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wvb5j" podStartSLOduration=129.456759026 podStartE2EDuration="2m10.667515357s" podCreationTimestamp="2026-04-24 14:25:05 +0000 UTC" firstStartedPulling="2026-04-24 14:27:13.562449347 +0000 UTC m=+161.000966198" lastFinishedPulling="2026-04-24 14:27:14.773205681 +0000 UTC m=+162.211722529" observedRunningTime="2026-04-24 14:27:15.666742848 +0000 UTC m=+163.105259709" watchObservedRunningTime="2026-04-24 14:27:15.667515357 +0000 UTC m=+163.106032217" Apr 24 14:27:17.656158 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.656128 2574 generic.go:358] "Generic (PLEG): container finished" podID="ffeb0782-40d4-4ac7-82bd-0a3fdf094363" 
containerID="199a692fafbb27bf00aba07ea02baf49714cc86f8eef26495b32da1676e57e65" exitCode=255 Apr 24 14:27:17.656537 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.656177 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" event={"ID":"ffeb0782-40d4-4ac7-82bd-0a3fdf094363","Type":"ContainerDied","Data":"199a692fafbb27bf00aba07ea02baf49714cc86f8eef26495b32da1676e57e65"} Apr 24 14:27:17.656537 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.656455 2574 scope.go:117] "RemoveContainer" containerID="199a692fafbb27bf00aba07ea02baf49714cc86f8eef26495b32da1676e57e65" Apr 24 14:27:17.688678 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.688651 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-p8d6x"] Apr 24 14:27:17.693317 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.693295 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.695117 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.695094 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 14:27:17.695469 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.695449 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6zzcr\"" Apr 24 14:27:17.695469 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.695457 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 14:27:17.695636 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.695566 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 14:27:17.695636 ip-10-0-143-92 kubenswrapper[2574]: I0424 
14:27:17.695580 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 14:27:17.695769 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.695738 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 14:27:17.695834 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.695788 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 14:27:17.771218 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.771189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-textfile\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.771399 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.771267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3dcd29a6-adb7-427f-be07-62c84201acc0-root\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.771399 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.771292 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-tls\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.771399 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.771314 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-wtmp\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.771399 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.771348 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sxvz\" (UniqueName: \"kubernetes.io/projected/3dcd29a6-adb7-427f-be07-62c84201acc0-kube-api-access-5sxvz\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.771563 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.771480 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dcd29a6-adb7-427f-be07-62c84201acc0-sys\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.771563 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.771510 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.771659 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.771608 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dcd29a6-adb7-427f-be07-62c84201acc0-metrics-client-ca\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " 
pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.771715 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.771664 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-accelerators-collector-config\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.872647 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.872606 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dcd29a6-adb7-427f-be07-62c84201acc0-metrics-client-ca\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.872647 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.872647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-accelerators-collector-config\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.872941 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.872671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-textfile\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.872941 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.872753 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/3dcd29a6-adb7-427f-be07-62c84201acc0-root\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.872941 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.872783 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-tls\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.872941 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.872801 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-wtmp\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.872941 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.872828 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sxvz\" (UniqueName: \"kubernetes.io/projected/3dcd29a6-adb7-427f-be07-62c84201acc0-kube-api-access-5sxvz\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.872941 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.872870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dcd29a6-adb7-427f-be07-62c84201acc0-sys\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.872941 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.872898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/3dcd29a6-adb7-427f-be07-62c84201acc0-root\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.872941 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.872920 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.873325 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:27:17.872955 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 14:27:17.873325 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.872999 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-textfile\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.873325 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.873013 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-wtmp\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.873325 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.873003 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dcd29a6-adb7-427f-be07-62c84201acc0-sys\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " 
pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.873325 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:27:17.873018 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-tls podName:3dcd29a6-adb7-427f-be07-62c84201acc0 nodeName:}" failed. No retries permitted until 2026-04-24 14:27:18.372997788 +0000 UTC m=+165.811514626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-tls") pod "node-exporter-p8d6x" (UID: "3dcd29a6-adb7-427f-be07-62c84201acc0") : secret "node-exporter-tls" not found Apr 24 14:27:17.873496 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.873324 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-accelerators-collector-config\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.873496 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.873369 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dcd29a6-adb7-427f-be07-62c84201acc0-metrics-client-ca\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:17.875231 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.875211 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 
14:27:17.883423 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:17.883403 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sxvz\" (UniqueName: \"kubernetes.io/projected/3dcd29a6-adb7-427f-be07-62c84201acc0-kube-api-access-5sxvz\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:18.375165 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:18.375127 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-tls\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:18.377434 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:18.377408 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3dcd29a6-adb7-427f-be07-62c84201acc0-node-exporter-tls\") pod \"node-exporter-p8d6x\" (UID: \"3dcd29a6-adb7-427f-be07-62c84201acc0\") " pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:18.602280 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:18.602239 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-p8d6x" Apr 24 14:27:18.610632 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:27:18.610588 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd29a6_adb7_427f_be07_62c84201acc0.slice/crio-5f703ac465f1fe2a52aebabd621c975f8b9f9c2c32b3fb798d0f07cdb6067815 WatchSource:0}: Error finding container 5f703ac465f1fe2a52aebabd621c975f8b9f9c2c32b3fb798d0f07cdb6067815: Status 404 returned error can't find the container with id 5f703ac465f1fe2a52aebabd621c975f8b9f9c2c32b3fb798d0f07cdb6067815 Apr 24 14:27:18.662417 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:18.660963 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c7f6c5d79-bq98f" event={"ID":"ffeb0782-40d4-4ac7-82bd-0a3fdf094363","Type":"ContainerStarted","Data":"e5eb50d00ed47a4c940ea5cd0933200805b230b1955ea981d30f0b4e933fde63"} Apr 24 14:27:18.663740 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:18.663707 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8d6x" event={"ID":"3dcd29a6-adb7-427f-be07-62c84201acc0","Type":"ContainerStarted","Data":"5f703ac465f1fe2a52aebabd621c975f8b9f9c2c32b3fb798d0f07cdb6067815"} Apr 24 14:27:19.669893 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:19.669786 2574 generic.go:358] "Generic (PLEG): container finished" podID="3dcd29a6-adb7-427f-be07-62c84201acc0" containerID="6f52b6f3f59d734971bf48af90fe908d1dc22a8d01bd012f4dcba7193199a2f3" exitCode=0 Apr 24 14:27:19.670332 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:19.669870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8d6x" event={"ID":"3dcd29a6-adb7-427f-be07-62c84201acc0","Type":"ContainerDied","Data":"6f52b6f3f59d734971bf48af90fe908d1dc22a8d01bd012f4dcba7193199a2f3"} Apr 24 14:27:20.674844 ip-10-0-143-92 
kubenswrapper[2574]: I0424 14:27:20.674808 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8d6x" event={"ID":"3dcd29a6-adb7-427f-be07-62c84201acc0","Type":"ContainerStarted","Data":"e8fab1648a4f3c375f8abaaca8683472c99e3b1591cdf04c88933ca2f77d916d"} Apr 24 14:27:20.674844 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.674844 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8d6x" event={"ID":"3dcd29a6-adb7-427f-be07-62c84201acc0","Type":"ContainerStarted","Data":"c14dbaaaa9172989cccc00c2b4ed43b77326a9f5308dfe7f2f0097905c4fec43"} Apr 24 14:27:20.693565 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.693515 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-p8d6x" podStartSLOduration=2.922451358 podStartE2EDuration="3.693499938s" podCreationTimestamp="2026-04-24 14:27:17 +0000 UTC" firstStartedPulling="2026-04-24 14:27:18.612396712 +0000 UTC m=+166.050913550" lastFinishedPulling="2026-04-24 14:27:19.383445292 +0000 UTC m=+166.821962130" observedRunningTime="2026-04-24 14:27:20.692155113 +0000 UTC m=+168.130671974" watchObservedRunningTime="2026-04-24 14:27:20.693499938 +0000 UTC m=+168.132016798" Apr 24 14:27:20.753076 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.753043 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7c496764cd-nnbcv"] Apr 24 14:27:20.756451 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.756433 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" Apr 24 14:27:20.758554 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.758534 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-qdpddcq3aof0\"" Apr 24 14:27:20.758684 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.758563 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 14:27:20.758814 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.758774 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 14:27:20.758912 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.758817 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 14:27:20.758912 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.758813 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 14:27:20.758912 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.758904 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-ncq9n\"" Apr 24 14:27:20.759058 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.758951 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 14:27:20.767947 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.767925 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7c496764cd-nnbcv"] Apr 24 14:27:20.794327 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.794285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" Apr 24 14:27:20.794327 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.794331 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" Apr 24 14:27:20.794655 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.794642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xndcd\" (UniqueName: \"kubernetes.io/projected/cd322932-bc7a-4baa-b093-8310b1da5e93-kube-api-access-xndcd\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" Apr 24 14:27:20.794705 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.794684 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-tls\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" Apr 24 14:27:20.794745 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.794719 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.794745 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.794739 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-grpc-tls\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.794815 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.794776 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd322932-bc7a-4baa-b093-8310b1da5e93-metrics-client-ca\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.794815 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.794795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.896213 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.896174 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xndcd\" (UniqueName: \"kubernetes.io/projected/cd322932-bc7a-4baa-b093-8310b1da5e93-kube-api-access-xndcd\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.896359 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.896224 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-tls\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.896359 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.896262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.896460 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.896345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-grpc-tls\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.896460 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.896417 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd322932-bc7a-4baa-b093-8310b1da5e93-metrics-client-ca\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.896563 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.896462 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.896563 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.896514 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.896688 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.896660 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.897231 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.897197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd322932-bc7a-4baa-b093-8310b1da5e93-metrics-client-ca\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.899106 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.899078 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.899218 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.899079 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.899321 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.899303 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.899414 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.899391 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.899469 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.899421 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-grpc-tls\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.899469 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.899446 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cd322932-bc7a-4baa-b093-8310b1da5e93-secret-thanos-querier-tls\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:20.904847 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:20.904826 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xndcd\" (UniqueName: \"kubernetes.io/projected/cd322932-bc7a-4baa-b093-8310b1da5e93-kube-api-access-xndcd\") pod \"thanos-querier-7c496764cd-nnbcv\" (UID: \"cd322932-bc7a-4baa-b093-8310b1da5e93\") " pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:21.065959 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:21.065917 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:21.185799 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:21.185765 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7c496764cd-nnbcv"]
Apr 24 14:27:21.189715 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:27:21.189684 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd322932_bc7a_4baa_b093_8310b1da5e93.slice/crio-288d3089f1105b0f6c212228d8b03c0cca76bcbeb99dc6da2b85ceffc6158085 WatchSource:0}: Error finding container 288d3089f1105b0f6c212228d8b03c0cca76bcbeb99dc6da2b85ceffc6158085: Status 404 returned error can't find the container with id 288d3089f1105b0f6c212228d8b03c0cca76bcbeb99dc6da2b85ceffc6158085
Apr 24 14:27:21.678197 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:21.678159 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" event={"ID":"cd322932-bc7a-4baa-b093-8310b1da5e93","Type":"ContainerStarted","Data":"288d3089f1105b0f6c212228d8b03c0cca76bcbeb99dc6da2b85ceffc6158085"}
Apr 24 14:27:22.166363 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.166326 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:27:22.168438 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.168416 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xfc9t\""
Apr 24 14:27:22.177085 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.177047 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t5b97"
Apr 24 14:27:22.183745 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.183718 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"]
Apr 24 14:27:22.187498 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.187474 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.189561 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.189534 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 24 14:27:22.189672 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.189622 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-ft5qtimrbbubp\""
Apr 24 14:27:22.190044 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.189989 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 14:27:22.190315 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.190147 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-nkkrg\""
Apr 24 14:27:22.190315 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.190220 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 24 14:27:22.190315 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.190225 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 24 14:27:22.198591 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.198565 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"]
Apr 24 14:27:22.308532 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.308496 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0a95f466-fd08-40e3-9338-b67203aa3373-secret-metrics-server-client-certs\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.308707 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.308547 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95f466-fd08-40e3-9338-b67203aa3373-client-ca-bundle\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.308707 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.308640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0a95f466-fd08-40e3-9338-b67203aa3373-audit-log\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.308818 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.308716 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0a95f466-fd08-40e3-9338-b67203aa3373-metrics-server-audit-profiles\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.308818 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.308754 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0a95f466-fd08-40e3-9338-b67203aa3373-secret-metrics-server-tls\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.308906 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.308827 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a95f466-fd08-40e3-9338-b67203aa3373-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.308906 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.308896 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kz28\" (UniqueName: \"kubernetes.io/projected/0a95f466-fd08-40e3-9338-b67203aa3373-kube-api-access-2kz28\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.317101 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.317074 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t5b97"]
Apr 24 14:27:22.409630 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.409595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0a95f466-fd08-40e3-9338-b67203aa3373-secret-metrics-server-client-certs\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.409792 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.409655 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95f466-fd08-40e3-9338-b67203aa3373-client-ca-bundle\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.409792 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.409712 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0a95f466-fd08-40e3-9338-b67203aa3373-audit-log\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.409792 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.409760 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0a95f466-fd08-40e3-9338-b67203aa3373-metrics-server-audit-profiles\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.409792 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.409787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0a95f466-fd08-40e3-9338-b67203aa3373-secret-metrics-server-tls\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.410012 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.409830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a95f466-fd08-40e3-9338-b67203aa3373-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.410012 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.409861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kz28\" (UniqueName: \"kubernetes.io/projected/0a95f466-fd08-40e3-9338-b67203aa3373-kube-api-access-2kz28\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.410324 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.410298 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0a95f466-fd08-40e3-9338-b67203aa3373-audit-log\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.410702 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.410656 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a95f466-fd08-40e3-9338-b67203aa3373-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.411241 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.411206 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0a95f466-fd08-40e3-9338-b67203aa3373-metrics-server-audit-profiles\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.412724 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.412684 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0a95f466-fd08-40e3-9338-b67203aa3373-secret-metrics-server-client-certs\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.412818 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.412744 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95f466-fd08-40e3-9338-b67203aa3373-client-ca-bundle\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.412901 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.412867 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0a95f466-fd08-40e3-9338-b67203aa3373-secret-metrics-server-tls\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.419353 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.419288 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kz28\" (UniqueName: \"kubernetes.io/projected/0a95f466-fd08-40e3-9338-b67203aa3373-kube-api-access-2kz28\") pod \"metrics-server-7fcdb4cbc7-jhkk7\" (UID: \"0a95f466-fd08-40e3-9338-b67203aa3373\") " pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.511932 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.511894 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:22.779637 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:27:22.779559 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b39242d_971a_47cb_9943_42846bc6d8b6.slice/crio-83c7748ca5c914166f858b4267fc75b305779d96b9af37ed9851bf9faf2007c2 WatchSource:0}: Error finding container 83c7748ca5c914166f858b4267fc75b305779d96b9af37ed9851bf9faf2007c2: Status 404 returned error can't find the container with id 83c7748ca5c914166f858b4267fc75b305779d96b9af37ed9851bf9faf2007c2
Apr 24 14:27:22.912586 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:22.912537 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"]
Apr 24 14:27:22.917132 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:27:22.917098 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a95f466_fd08_40e3_9338_b67203aa3373.slice/crio-e91f1d95ccda9c5adc3f33838d36e0cbb3de9544c83a06a88a89effcfc0ca0f5 WatchSource:0}: Error finding container e91f1d95ccda9c5adc3f33838d36e0cbb3de9544c83a06a88a89effcfc0ca0f5: Status 404 returned error can't find the container with id e91f1d95ccda9c5adc3f33838d36e0cbb3de9544c83a06a88a89effcfc0ca0f5
Apr 24 14:27:23.690645 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:23.690606 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" event={"ID":"cd322932-bc7a-4baa-b093-8310b1da5e93","Type":"ContainerStarted","Data":"9a9aeb98aa2775ec7820e586cf4e0c53983b623d5f2c8070ba83fd62812040a9"}
Apr 24 14:27:23.690912 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:23.690652 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" event={"ID":"cd322932-bc7a-4baa-b093-8310b1da5e93","Type":"ContainerStarted","Data":"b4bf59fbdff4d3bdfe33da279e3364c5e18f99ce280bc823b092e5aca733151c"}
Apr 24 14:27:23.690912 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:23.690677 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" event={"ID":"cd322932-bc7a-4baa-b093-8310b1da5e93","Type":"ContainerStarted","Data":"5f3b108627e138d0eebaf214a3ca41229c4d6112f4cacd665ec9e7acc61c4058"}
Apr 24 14:27:23.692364 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:23.692328 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7" event={"ID":"0a95f466-fd08-40e3-9338-b67203aa3373","Type":"ContainerStarted","Data":"e91f1d95ccda9c5adc3f33838d36e0cbb3de9544c83a06a88a89effcfc0ca0f5"}
Apr 24 14:27:23.693812 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:23.693768 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t5b97" event={"ID":"3b39242d-971a-47cb-9943-42846bc6d8b6","Type":"ContainerStarted","Data":"83c7748ca5c914166f858b4267fc75b305779d96b9af37ed9851bf9faf2007c2"}
Apr 24 14:27:24.165926 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:24.165870 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf"
Apr 24 14:27:24.700264 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:24.700170 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" event={"ID":"cd322932-bc7a-4baa-b093-8310b1da5e93","Type":"ContainerStarted","Data":"a394e45bd99cf1ddfd33b61740ff887e8e666a2e9f4e34df43338fd9395b5a99"}
Apr 24 14:27:24.700264 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:24.700213 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" event={"ID":"cd322932-bc7a-4baa-b093-8310b1da5e93","Type":"ContainerStarted","Data":"686433b96f48633f8490b2f06eb59d08c9f6a9e1422828e8f3e6c5e51d3f3c61"}
Apr 24 14:27:25.655065 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:25.655033 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wvb5j"
Apr 24 14:27:25.706009 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:25.705969 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" event={"ID":"cd322932-bc7a-4baa-b093-8310b1da5e93","Type":"ContainerStarted","Data":"151e9eefb47656d50b148f2e582a42aaeb88cbb24747fdf071ddfce7200d27ea"}
Apr 24 14:27:25.706212 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:25.706189 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:25.707411 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:25.707382 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7" event={"ID":"0a95f466-fd08-40e3-9338-b67203aa3373","Type":"ContainerStarted","Data":"4252b733c0e33e36bc819e947f19c573fe5a98af712cc0dd6a13d62b1fbe296a"}
Apr 24 14:27:25.708757 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:25.708732 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t5b97" event={"ID":"3b39242d-971a-47cb-9943-42846bc6d8b6","Type":"ContainerStarted","Data":"fdab68a7d34cc4c6616449f59748c79c04603fe7010353489166b15aa1061892"}
Apr 24 14:27:25.729148 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:25.729102 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv" podStartSLOduration=3.019612699 podStartE2EDuration="5.729085391s" podCreationTimestamp="2026-04-24 14:27:20 +0000 UTC" firstStartedPulling="2026-04-24 14:27:21.191590919 +0000 UTC m=+168.630107757" lastFinishedPulling="2026-04-24 14:27:23.901063597 +0000 UTC m=+171.339580449" observedRunningTime="2026-04-24 14:27:25.728785322 +0000 UTC m=+173.167302182" watchObservedRunningTime="2026-04-24 14:27:25.729085391 +0000 UTC m=+173.167602251"
Apr 24 14:27:25.749042 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:25.748990 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-t5b97" podStartSLOduration=138.648179888 podStartE2EDuration="2m20.748975494s" podCreationTimestamp="2026-04-24 14:25:05 +0000 UTC" firstStartedPulling="2026-04-24 14:27:22.781718318 +0000 UTC m=+170.220235156" lastFinishedPulling="2026-04-24 14:27:24.88251391 +0000 UTC m=+172.321030762" observedRunningTime="2026-04-24 14:27:25.748538336 +0000 UTC m=+173.187055198" watchObservedRunningTime="2026-04-24 14:27:25.748975494 +0000 UTC m=+173.187492353"
Apr 24 14:27:25.767739 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:25.767686 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7" podStartSLOduration=1.8073340390000001 podStartE2EDuration="3.76766556s" podCreationTimestamp="2026-04-24 14:27:22 +0000 UTC" firstStartedPulling="2026-04-24 14:27:22.919176842 +0000 UTC m=+170.357693686" lastFinishedPulling="2026-04-24 14:27:24.879508366 +0000 UTC m=+172.318025207" observedRunningTime="2026-04-24 14:27:25.766935942 +0000 UTC m=+173.205452805" watchObservedRunningTime="2026-04-24 14:27:25.76766556 +0000 UTC m=+173.206182425"
Apr 24 14:27:31.717535 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:31.717506 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7c496764cd-nnbcv"
Apr 24 14:27:40.720079 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.720042 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55c95fffb5-jgmqf"]
Apr 24 14:27:40.722801 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.722784 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.725420 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.725389 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 14:27:40.725541 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.725471 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 14:27:40.725541 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.725478 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gw62q\""
Apr 24 14:27:40.725541 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.725504 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 14:27:40.725541 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.725522 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 14:27:40.725541 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.725540 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 14:27:40.725802 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.725470 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 14:27:40.725802 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.725654 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 14:27:40.734783 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.734763 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55c95fffb5-jgmqf"]
Apr 24 14:27:40.875291 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.875252 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-oauth-serving-cert\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.875467 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.875302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-service-ca\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.875467 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.875352 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-serving-cert\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.875467 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.875373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-oauth-config\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.875467 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.875416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-config\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.875467 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.875449 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4l9k\" (UniqueName: \"kubernetes.io/projected/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-kube-api-access-d4l9k\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.976826 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.976735 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-oauth-serving-cert\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.976826 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.976780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-service-ca\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.977031 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.976963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-serving-cert\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.977031 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.976996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-oauth-config\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.977031 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.977022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-config\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.977177 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.977059 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4l9k\" (UniqueName: \"kubernetes.io/projected/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-kube-api-access-d4l9k\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.977646 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.977620 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-config\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.978521 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.978492 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-oauth-serving-cert\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.979436 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.979416 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-oauth-config\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.979521 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.979505 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-serving-cert\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.985975 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.985949 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-service-ca\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:40.986510 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:40.986492 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4l9k\" (UniqueName: \"kubernetes.io/projected/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-kube-api-access-d4l9k\") pod \"console-55c95fffb5-jgmqf\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:41.032652 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:41.032611 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55c95fffb5-jgmqf"
Apr 24 14:27:41.165812 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:41.165780 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55c95fffb5-jgmqf"]
Apr 24 14:27:41.169240 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:27:41.169213 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde6e9176_74bc_4ca7_b62f_a50843fc8cb7.slice/crio-46e701455b9bef1feb339d2a2ad238693331d0f7e1167871178ddb346902a4fe WatchSource:0}: Error finding container 46e701455b9bef1feb339d2a2ad238693331d0f7e1167871178ddb346902a4fe: Status 404 returned error can't find the container with id 46e701455b9bef1feb339d2a2ad238693331d0f7e1167871178ddb346902a4fe
Apr 24 14:27:41.751279 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:41.751237 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c95fffb5-jgmqf" event={"ID":"de6e9176-74bc-4ca7-b62f-a50843fc8cb7","Type":"ContainerStarted","Data":"46e701455b9bef1feb339d2a2ad238693331d0f7e1167871178ddb346902a4fe"}
Apr 24 14:27:42.512446 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:42.512405 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24 14:27:42.512446 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:42.512448 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7"
Apr 24
14:27:43.565704 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:43.565642 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" podUID="c398236b-847b-4652-9347-7e74de00123f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 14:27:44.766795 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:44.766755 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c95fffb5-jgmqf" event={"ID":"de6e9176-74bc-4ca7-b62f-a50843fc8cb7","Type":"ContainerStarted","Data":"11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9"} Apr 24 14:27:44.784708 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:44.784656 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55c95fffb5-jgmqf" podStartSLOduration=2.190841552 podStartE2EDuration="4.784641713s" podCreationTimestamp="2026-04-24 14:27:40 +0000 UTC" firstStartedPulling="2026-04-24 14:27:41.170869861 +0000 UTC m=+188.609386699" lastFinishedPulling="2026-04-24 14:27:43.764670019 +0000 UTC m=+191.203186860" observedRunningTime="2026-04-24 14:27:44.783177973 +0000 UTC m=+192.221694833" watchObservedRunningTime="2026-04-24 14:27:44.784641713 +0000 UTC m=+192.223158573" Apr 24 14:27:51.033041 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:51.033004 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55c95fffb5-jgmqf" Apr 24 14:27:51.033041 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:51.033050 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55c95fffb5-jgmqf" Apr 24 14:27:51.038423 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:51.038396 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55c95fffb5-jgmqf" Apr 24 14:27:51.791897 ip-10-0-143-92 
kubenswrapper[2574]: I0424 14:27:51.791854 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55c95fffb5-jgmqf" Apr 24 14:27:53.565312 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:27:53.565271 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" podUID="c398236b-847b-4652-9347-7e74de00123f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 14:28:00.184545 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:00.184505 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55c95fffb5-jgmqf"] Apr 24 14:28:02.517555 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:02.517526 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7" Apr 24 14:28:02.521940 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:02.521909 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7fcdb4cbc7-jhkk7" Apr 24 14:28:03.564754 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:03.564718 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" podUID="c398236b-847b-4652-9347-7e74de00123f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 14:28:03.565112 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:03.564791 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" Apr 24 14:28:03.565304 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:03.565273 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" 
containerStatusID={"Type":"cri-o","ID":"39ccf352a3a63509f933cf072f116631b633c81ac6b32c4d17949db108a5edb5"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 14:28:03.565341 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:03.565325 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" podUID="c398236b-847b-4652-9347-7e74de00123f" containerName="service-proxy" containerID="cri-o://39ccf352a3a63509f933cf072f116631b633c81ac6b32c4d17949db108a5edb5" gracePeriod=30 Apr 24 14:28:03.820236 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:03.820149 2574 generic.go:358] "Generic (PLEG): container finished" podID="c398236b-847b-4652-9347-7e74de00123f" containerID="39ccf352a3a63509f933cf072f116631b633c81ac6b32c4d17949db108a5edb5" exitCode=2 Apr 24 14:28:03.820236 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:03.820217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" event={"ID":"c398236b-847b-4652-9347-7e74de00123f","Type":"ContainerDied","Data":"39ccf352a3a63509f933cf072f116631b633c81ac6b32c4d17949db108a5edb5"} Apr 24 14:28:03.820413 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:03.820257 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7fcbb8bfdb-gtg6n" event={"ID":"c398236b-847b-4652-9347-7e74de00123f","Type":"ContainerStarted","Data":"b3e990df0ea1f32fb56789602d14e58ad99aa1ef0dbc48bfe9a36b45afa3ffa5"} Apr 24 14:28:25.203779 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.203708 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55c95fffb5-jgmqf" podUID="de6e9176-74bc-4ca7-b62f-a50843fc8cb7" containerName="console" 
containerID="cri-o://11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9" gracePeriod=15 Apr 24 14:28:25.445208 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.445185 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55c95fffb5-jgmqf_de6e9176-74bc-4ca7-b62f-a50843fc8cb7/console/0.log" Apr 24 14:28:25.445312 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.445261 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55c95fffb5-jgmqf" Apr 24 14:28:25.550399 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.550299 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-serving-cert\") pod \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " Apr 24 14:28:25.550399 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.550373 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4l9k\" (UniqueName: \"kubernetes.io/projected/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-kube-api-access-d4l9k\") pod \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " Apr 24 14:28:25.550399 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.550396 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-config\") pod \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " Apr 24 14:28:25.550689 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.550428 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-oauth-serving-cert\") pod 
\"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " Apr 24 14:28:25.550689 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.550452 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-oauth-config\") pod \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " Apr 24 14:28:25.550689 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.550487 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-service-ca\") pod \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\" (UID: \"de6e9176-74bc-4ca7-b62f-a50843fc8cb7\") " Apr 24 14:28:25.550857 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.550818 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-config" (OuterVolumeSpecName: "console-config") pod "de6e9176-74bc-4ca7-b62f-a50843fc8cb7" (UID: "de6e9176-74bc-4ca7-b62f-a50843fc8cb7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:28:25.550933 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.550847 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "de6e9176-74bc-4ca7-b62f-a50843fc8cb7" (UID: "de6e9176-74bc-4ca7-b62f-a50843fc8cb7"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:28:25.551002 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.550950 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-service-ca" (OuterVolumeSpecName: "service-ca") pod "de6e9176-74bc-4ca7-b62f-a50843fc8cb7" (UID: "de6e9176-74bc-4ca7-b62f-a50843fc8cb7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:28:25.552725 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.552699 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-kube-api-access-d4l9k" (OuterVolumeSpecName: "kube-api-access-d4l9k") pod "de6e9176-74bc-4ca7-b62f-a50843fc8cb7" (UID: "de6e9176-74bc-4ca7-b62f-a50843fc8cb7"). InnerVolumeSpecName "kube-api-access-d4l9k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:28:25.552933 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.552726 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "de6e9176-74bc-4ca7-b62f-a50843fc8cb7" (UID: "de6e9176-74bc-4ca7-b62f-a50843fc8cb7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:25.552933 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.552783 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "de6e9176-74bc-4ca7-b62f-a50843fc8cb7" (UID: "de6e9176-74bc-4ca7-b62f-a50843fc8cb7"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:25.651064 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.651022 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4l9k\" (UniqueName: \"kubernetes.io/projected/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-kube-api-access-d4l9k\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:28:25.651064 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.651052 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-config\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:28:25.651064 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.651062 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-oauth-serving-cert\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:28:25.651064 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.651071 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-oauth-config\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:28:25.651318 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.651081 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-service-ca\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:28:25.651318 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.651090 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de6e9176-74bc-4ca7-b62f-a50843fc8cb7-console-serving-cert\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:28:25.878368 ip-10-0-143-92 
kubenswrapper[2574]: I0424 14:28:25.878340 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55c95fffb5-jgmqf_de6e9176-74bc-4ca7-b62f-a50843fc8cb7/console/0.log" Apr 24 14:28:25.878540 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.878379 2574 generic.go:358] "Generic (PLEG): container finished" podID="de6e9176-74bc-4ca7-b62f-a50843fc8cb7" containerID="11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9" exitCode=2 Apr 24 14:28:25.878540 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.878408 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c95fffb5-jgmqf" event={"ID":"de6e9176-74bc-4ca7-b62f-a50843fc8cb7","Type":"ContainerDied","Data":"11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9"} Apr 24 14:28:25.878540 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.878442 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55c95fffb5-jgmqf" Apr 24 14:28:25.878540 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.878455 2574 scope.go:117] "RemoveContainer" containerID="11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9" Apr 24 14:28:25.878540 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.878445 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c95fffb5-jgmqf" event={"ID":"de6e9176-74bc-4ca7-b62f-a50843fc8cb7","Type":"ContainerDied","Data":"46e701455b9bef1feb339d2a2ad238693331d0f7e1167871178ddb346902a4fe"} Apr 24 14:28:25.886504 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.886490 2574 scope.go:117] "RemoveContainer" containerID="11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9" Apr 24 14:28:25.886769 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:28:25.886745 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9\": container with ID starting with 11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9 not found: ID does not exist" containerID="11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9" Apr 24 14:28:25.886853 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.886776 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9"} err="failed to get container status \"11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9\": rpc error: code = NotFound desc = could not find container \"11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9\": container with ID starting with 11be3ec218056622fdb3d00c3f0354d49ca13fac4f87bfbf15b9bcd857fbb2a9 not found: ID does not exist" Apr 24 14:28:25.901932 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.899580 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55c95fffb5-jgmqf"] Apr 24 14:28:25.903631 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:25.903608 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55c95fffb5-jgmqf"] Apr 24 14:28:27.170439 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:27.170402 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6e9176-74bc-4ca7-b62f-a50843fc8cb7" path="/var/lib/kubelet/pods/de6e9176-74bc-4ca7-b62f-a50843fc8cb7/volumes" Apr 24 14:28:44.917384 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:44.917345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf" Apr 24 14:28:44.919647 ip-10-0-143-92 kubenswrapper[2574]: 
I0424 14:28:44.919622 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee39eb9-c473-4f55-a88c-427f97349f6c-metrics-certs\") pod \"network-metrics-daemon-ncpbf\" (UID: \"bee39eb9-c473-4f55-a88c-427f97349f6c\") " pod="openshift-multus/network-metrics-daemon-ncpbf" Apr 24 14:28:45.172539 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:45.169428 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-crqvq\"" Apr 24 14:28:45.176767 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:45.176743 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ncpbf" Apr 24 14:28:45.298037 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:45.298000 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ncpbf"] Apr 24 14:28:45.301648 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:28:45.301615 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbee39eb9_c473_4f55_a88c_427f97349f6c.slice/crio-af91c3c9998ca2454508d48c1db4421e6ecd9e62d16887806ea86f20bd2f994c WatchSource:0}: Error finding container af91c3c9998ca2454508d48c1db4421e6ecd9e62d16887806ea86f20bd2f994c: Status 404 returned error can't find the container with id af91c3c9998ca2454508d48c1db4421e6ecd9e62d16887806ea86f20bd2f994c Apr 24 14:28:45.936654 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:45.936612 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ncpbf" event={"ID":"bee39eb9-c473-4f55-a88c-427f97349f6c","Type":"ContainerStarted","Data":"af91c3c9998ca2454508d48c1db4421e6ecd9e62d16887806ea86f20bd2f994c"} Apr 24 14:28:46.941532 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:46.941490 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-ncpbf" event={"ID":"bee39eb9-c473-4f55-a88c-427f97349f6c","Type":"ContainerStarted","Data":"5d3a0644ffdc3f71e5dc66c17937b4184ede032547b85f0ef954fb347d312e0e"} Apr 24 14:28:46.941532 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:46.941530 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ncpbf" event={"ID":"bee39eb9-c473-4f55-a88c-427f97349f6c","Type":"ContainerStarted","Data":"36e0b9cc47df35965f458822c142bd93d775aeb52adb0f90a3fe51c2966f6693"} Apr 24 14:28:46.959470 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:28:46.959417 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ncpbf" podStartSLOduration=252.99614315 podStartE2EDuration="4m13.959398122s" podCreationTimestamp="2026-04-24 14:24:33 +0000 UTC" firstStartedPulling="2026-04-24 14:28:45.303497997 +0000 UTC m=+252.742014842" lastFinishedPulling="2026-04-24 14:28:46.266752976 +0000 UTC m=+253.705269814" observedRunningTime="2026-04-24 14:28:46.957983759 +0000 UTC m=+254.396500620" watchObservedRunningTime="2026-04-24 14:28:46.959398122 +0000 UTC m=+254.397914981" Apr 24 14:29:33.063822 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:29:33.063795 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 14:31:12.767377 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.767338 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld"] Apr 24 14:31:12.767810 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.767588 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de6e9176-74bc-4ca7-b62f-a50843fc8cb7" containerName="console" Apr 24 14:31:12.767810 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.767599 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6e9176-74bc-4ca7-b62f-a50843fc8cb7" containerName="console" Apr 24 
14:31:12.767810 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.767653 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="de6e9176-74bc-4ca7-b62f-a50843fc8cb7" containerName="console" Apr 24 14:31:12.770443 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.770426 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:12.773082 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.773055 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-jlr2j\"" Apr 24 14:31:12.773281 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.773250 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 14:31:12.773348 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.773237 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 14:31:12.779994 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.779946 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld"] Apr 24 14:31:12.830860 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.830818 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mhfs\" (UniqueName: \"kubernetes.io/projected/66c1ead7-7fe4-40c5-b775-f89621aa42fa-kube-api-access-5mhfs\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld\" (UID: \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:12.830860 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.830864 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66c1ead7-7fe4-40c5-b775-f89621aa42fa-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld\" (UID: \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:12.831129 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.830992 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66c1ead7-7fe4-40c5-b775-f89621aa42fa-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld\" (UID: \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:12.932107 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.932064 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66c1ead7-7fe4-40c5-b775-f89621aa42fa-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld\" (UID: \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:12.932288 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.932124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mhfs\" (UniqueName: \"kubernetes.io/projected/66c1ead7-7fe4-40c5-b775-f89621aa42fa-kube-api-access-5mhfs\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld\" (UID: \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:12.932288 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.932153 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66c1ead7-7fe4-40c5-b775-f89621aa42fa-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld\" (UID: \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:12.932490 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.932471 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66c1ead7-7fe4-40c5-b775-f89621aa42fa-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld\" (UID: \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:12.932582 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.932562 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66c1ead7-7fe4-40c5-b775-f89621aa42fa-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld\" (UID: \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:12.942183 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:12.942153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mhfs\" (UniqueName: \"kubernetes.io/projected/66c1ead7-7fe4-40c5-b775-f89621aa42fa-kube-api-access-5mhfs\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld\" (UID: \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:13.083410 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:13.083376 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:13.207118 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:13.207093 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld"] Apr 24 14:31:13.209632 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:31:13.209605 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66c1ead7_7fe4_40c5_b775_f89621aa42fa.slice/crio-36d8324fe2b692afe88aa6ad63e00baf34257eb927d35a1d0e685562c419ae03 WatchSource:0}: Error finding container 36d8324fe2b692afe88aa6ad63e00baf34257eb927d35a1d0e685562c419ae03: Status 404 returned error can't find the container with id 36d8324fe2b692afe88aa6ad63e00baf34257eb927d35a1d0e685562c419ae03 Apr 24 14:31:13.211361 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:13.211346 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:31:13.341904 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:13.341797 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" event={"ID":"66c1ead7-7fe4-40c5-b775-f89621aa42fa","Type":"ContainerStarted","Data":"36d8324fe2b692afe88aa6ad63e00baf34257eb927d35a1d0e685562c419ae03"} Apr 24 14:31:20.362585 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:20.362545 2574 generic.go:358] "Generic (PLEG): container finished" podID="66c1ead7-7fe4-40c5-b775-f89621aa42fa" containerID="e5884d25399b3dccd313cd1b28825834523416370af55a23b8d0fa7dc932624a" exitCode=0 Apr 24 14:31:20.363117 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:20.362637 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" 
event={"ID":"66c1ead7-7fe4-40c5-b775-f89621aa42fa","Type":"ContainerDied","Data":"e5884d25399b3dccd313cd1b28825834523416370af55a23b8d0fa7dc932624a"} Apr 24 14:31:24.375639 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:24.375603 2574 generic.go:358] "Generic (PLEG): container finished" podID="66c1ead7-7fe4-40c5-b775-f89621aa42fa" containerID="e1583dabb738b0c84cdd3e16e10aec6ebfa8d35811359d5f4eeb8f43afb8f41a" exitCode=0 Apr 24 14:31:24.376039 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:24.375653 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" event={"ID":"66c1ead7-7fe4-40c5-b775-f89621aa42fa","Type":"ContainerDied","Data":"e1583dabb738b0c84cdd3e16e10aec6ebfa8d35811359d5f4eeb8f43afb8f41a"} Apr 24 14:31:31.397949 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:31.397837 2574 generic.go:358] "Generic (PLEG): container finished" podID="66c1ead7-7fe4-40c5-b775-f89621aa42fa" containerID="c1e3604be34034b623b343a16afcc753ad549e2c61c387d20ec5cd3d1836194e" exitCode=0 Apr 24 14:31:31.397949 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:31.397932 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" event={"ID":"66c1ead7-7fe4-40c5-b775-f89621aa42fa","Type":"ContainerDied","Data":"c1e3604be34034b623b343a16afcc753ad549e2c61c387d20ec5cd3d1836194e"} Apr 24 14:31:32.518979 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:32.518957 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:32.594211 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:32.594173 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66c1ead7-7fe4-40c5-b775-f89621aa42fa-bundle\") pod \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\" (UID: \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\") " Apr 24 14:31:32.594211 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:32.594212 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66c1ead7-7fe4-40c5-b775-f89621aa42fa-util\") pod \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\" (UID: \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\") " Apr 24 14:31:32.594465 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:32.594284 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mhfs\" (UniqueName: \"kubernetes.io/projected/66c1ead7-7fe4-40c5-b775-f89621aa42fa-kube-api-access-5mhfs\") pod \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\" (UID: \"66c1ead7-7fe4-40c5-b775-f89621aa42fa\") " Apr 24 14:31:32.594776 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:32.594754 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c1ead7-7fe4-40c5-b775-f89621aa42fa-bundle" (OuterVolumeSpecName: "bundle") pod "66c1ead7-7fe4-40c5-b775-f89621aa42fa" (UID: "66c1ead7-7fe4-40c5-b775-f89621aa42fa"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:31:32.596531 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:32.596489 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c1ead7-7fe4-40c5-b775-f89621aa42fa-kube-api-access-5mhfs" (OuterVolumeSpecName: "kube-api-access-5mhfs") pod "66c1ead7-7fe4-40c5-b775-f89621aa42fa" (UID: "66c1ead7-7fe4-40c5-b775-f89621aa42fa"). InnerVolumeSpecName "kube-api-access-5mhfs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:31:32.599120 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:32.599095 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c1ead7-7fe4-40c5-b775-f89621aa42fa-util" (OuterVolumeSpecName: "util") pod "66c1ead7-7fe4-40c5-b775-f89621aa42fa" (UID: "66c1ead7-7fe4-40c5-b775-f89621aa42fa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:31:32.695532 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:32.695443 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66c1ead7-7fe4-40c5-b775-f89621aa42fa-bundle\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:31:32.695532 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:32.695472 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66c1ead7-7fe4-40c5-b775-f89621aa42fa-util\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:31:32.695532 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:32.695483 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5mhfs\" (UniqueName: \"kubernetes.io/projected/66c1ead7-7fe4-40c5-b775-f89621aa42fa-kube-api-access-5mhfs\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:31:33.405451 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:33.405409 2574 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" event={"ID":"66c1ead7-7fe4-40c5-b775-f89621aa42fa","Type":"ContainerDied","Data":"36d8324fe2b692afe88aa6ad63e00baf34257eb927d35a1d0e685562c419ae03"} Apr 24 14:31:33.405451 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:33.405451 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36d8324fe2b692afe88aa6ad63e00baf34257eb927d35a1d0e685562c419ae03" Apr 24 14:31:33.405652 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:33.405483 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpvkld" Apr 24 14:31:44.523921 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.523870 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9"] Apr 24 14:31:44.524376 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.524130 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66c1ead7-7fe4-40c5-b775-f89621aa42fa" containerName="pull" Apr 24 14:31:44.524376 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.524140 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c1ead7-7fe4-40c5-b775-f89621aa42fa" containerName="pull" Apr 24 14:31:44.524376 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.524149 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66c1ead7-7fe4-40c5-b775-f89621aa42fa" containerName="util" Apr 24 14:31:44.524376 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.524155 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c1ead7-7fe4-40c5-b775-f89621aa42fa" containerName="util" Apr 24 14:31:44.524376 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.524167 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66c1ead7-7fe4-40c5-b775-f89621aa42fa" 
containerName="extract" Apr 24 14:31:44.524376 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.524175 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c1ead7-7fe4-40c5-b775-f89621aa42fa" containerName="extract" Apr 24 14:31:44.524376 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.524223 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="66c1ead7-7fe4-40c5-b775-f89621aa42fa" containerName="extract" Apr 24 14:31:44.526813 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.526797 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:44.528850 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.528813 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 14:31:44.529023 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.528863 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 14:31:44.529464 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.529442 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 14:31:44.529568 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.529482 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-c4j4f\"" Apr 24 14:31:44.529568 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.529513 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 14:31:44.529568 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.529524 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 14:31:44.538146 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.538126 2574 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9"] Apr 24 14:31:44.686189 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.686151 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x4bv\" (UniqueName: \"kubernetes.io/projected/cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c-kube-api-access-2x4bv\") pod \"keda-metrics-apiserver-7c9f485588-f6kg9\" (UID: \"cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:44.686363 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.686223 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-f6kg9\" (UID: \"cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:44.686363 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.686301 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-f6kg9\" (UID: \"cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:44.787512 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.787402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-f6kg9\" (UID: \"cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:44.787512 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.787471 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2x4bv\" (UniqueName: \"kubernetes.io/projected/cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c-kube-api-access-2x4bv\") pod \"keda-metrics-apiserver-7c9f485588-f6kg9\" (UID: \"cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:44.787729 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.787520 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-f6kg9\" (UID: \"cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:44.787729 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:31:44.787627 2574 secret.go:281] references non-existent secret key: tls.crt Apr 24 14:31:44.787729 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:31:44.787643 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 14:31:44.787729 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:31:44.787661 2574 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 24 14:31:44.787729 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:31:44.787683 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 24 14:31:44.787998 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:31:44.787759 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c-certificates podName:cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c nodeName:}" failed. 
No retries permitted until 2026-04-24 14:31:45.2877385 +0000 UTC m=+432.726255345 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c-certificates") pod "keda-metrics-apiserver-7c9f485588-f6kg9" (UID: "cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 24 14:31:44.787998 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.787789 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-f6kg9\" (UID: \"cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:44.807474 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.807442 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x4bv\" (UniqueName: \"kubernetes.io/projected/cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c-kube-api-access-2x4bv\") pod \"keda-metrics-apiserver-7c9f485588-f6kg9\" (UID: \"cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:44.815393 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.815356 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-t986v"] Apr 24 14:31:44.819474 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.819449 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-t986v" Apr 24 14:31:44.821561 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.821537 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 14:31:44.827442 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.827418 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-t986v"] Apr 24 14:31:44.888426 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.888388 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a5014df6-dcbf-480e-9980-bcd387e81135-certificates\") pod \"keda-admission-cf49989db-t986v\" (UID: \"a5014df6-dcbf-480e-9980-bcd387e81135\") " pod="openshift-keda/keda-admission-cf49989db-t986v" Apr 24 14:31:44.888598 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.888437 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxd58\" (UniqueName: \"kubernetes.io/projected/a5014df6-dcbf-480e-9980-bcd387e81135-kube-api-access-cxd58\") pod \"keda-admission-cf49989db-t986v\" (UID: \"a5014df6-dcbf-480e-9980-bcd387e81135\") " pod="openshift-keda/keda-admission-cf49989db-t986v" Apr 24 14:31:44.989693 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.989657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a5014df6-dcbf-480e-9980-bcd387e81135-certificates\") pod \"keda-admission-cf49989db-t986v\" (UID: \"a5014df6-dcbf-480e-9980-bcd387e81135\") " pod="openshift-keda/keda-admission-cf49989db-t986v" Apr 24 14:31:44.989864 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.989705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxd58\" (UniqueName: 
\"kubernetes.io/projected/a5014df6-dcbf-480e-9980-bcd387e81135-kube-api-access-cxd58\") pod \"keda-admission-cf49989db-t986v\" (UID: \"a5014df6-dcbf-480e-9980-bcd387e81135\") " pod="openshift-keda/keda-admission-cf49989db-t986v" Apr 24 14:31:44.992184 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:44.992161 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a5014df6-dcbf-480e-9980-bcd387e81135-certificates\") pod \"keda-admission-cf49989db-t986v\" (UID: \"a5014df6-dcbf-480e-9980-bcd387e81135\") " pod="openshift-keda/keda-admission-cf49989db-t986v" Apr 24 14:31:45.000283 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:45.000254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxd58\" (UniqueName: \"kubernetes.io/projected/a5014df6-dcbf-480e-9980-bcd387e81135-kube-api-access-cxd58\") pod \"keda-admission-cf49989db-t986v\" (UID: \"a5014df6-dcbf-480e-9980-bcd387e81135\") " pod="openshift-keda/keda-admission-cf49989db-t986v" Apr 24 14:31:45.129668 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:45.129632 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-t986v" Apr 24 14:31:45.250752 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:45.250722 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-t986v"] Apr 24 14:31:45.253814 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:31:45.253782 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5014df6_dcbf_480e_9980_bcd387e81135.slice/crio-ccfe8f8987946c9a073299c8fd4d17ded6cdd32cba11723e488048e700c0a016 WatchSource:0}: Error finding container ccfe8f8987946c9a073299c8fd4d17ded6cdd32cba11723e488048e700c0a016: Status 404 returned error can't find the container with id ccfe8f8987946c9a073299c8fd4d17ded6cdd32cba11723e488048e700c0a016 Apr 24 14:31:45.292770 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:45.292723 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-f6kg9\" (UID: \"cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:45.295327 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:45.295301 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-f6kg9\" (UID: \"cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:45.436672 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:45.436574 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:45.438171 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:45.438143 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-t986v" event={"ID":"a5014df6-dcbf-480e-9980-bcd387e81135","Type":"ContainerStarted","Data":"ccfe8f8987946c9a073299c8fd4d17ded6cdd32cba11723e488048e700c0a016"} Apr 24 14:31:45.567593 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:45.567561 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9"] Apr 24 14:31:45.571572 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:31:45.571537 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc75ad02_8fc2_4e3f_b1ca_07e08edc3e1c.slice/crio-ce8fa57acddd293e988bbaf38a5a6cb8e59923a089cc99d2b9bf7921192efe0c WatchSource:0}: Error finding container ce8fa57acddd293e988bbaf38a5a6cb8e59923a089cc99d2b9bf7921192efe0c: Status 404 returned error can't find the container with id ce8fa57acddd293e988bbaf38a5a6cb8e59923a089cc99d2b9bf7921192efe0c Apr 24 14:31:46.442895 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:46.442840 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" event={"ID":"cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c","Type":"ContainerStarted","Data":"ce8fa57acddd293e988bbaf38a5a6cb8e59923a089cc99d2b9bf7921192efe0c"} Apr 24 14:31:47.448006 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:47.447967 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-t986v" event={"ID":"a5014df6-dcbf-480e-9980-bcd387e81135","Type":"ContainerStarted","Data":"af86331ee30179bc9f7e4db145ea7fc45838f17eb1fab466d217c8857744cbd4"} Apr 24 14:31:47.448476 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:47.448101 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-t986v" Apr 24 14:31:47.465376 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:47.465318 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-t986v" podStartSLOduration=1.738999632 podStartE2EDuration="3.465303709s" podCreationTimestamp="2026-04-24 14:31:44 +0000 UTC" firstStartedPulling="2026-04-24 14:31:45.255003943 +0000 UTC m=+432.693520781" lastFinishedPulling="2026-04-24 14:31:46.981308014 +0000 UTC m=+434.419824858" observedRunningTime="2026-04-24 14:31:47.464054467 +0000 UTC m=+434.902571328" watchObservedRunningTime="2026-04-24 14:31:47.465303709 +0000 UTC m=+434.903820563" Apr 24 14:31:48.452298 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:48.452250 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" event={"ID":"cc75ad02-8fc2-4e3f-b1ca-07e08edc3e1c","Type":"ContainerStarted","Data":"1dddde595025531f3cebf4486b899b0614e29b54e59d31893aa6cd256691bb1b"} Apr 24 14:31:48.452298 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:48.452302 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:31:48.470726 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:31:48.470675 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" podStartSLOduration=2.129248794 podStartE2EDuration="4.470661219s" podCreationTimestamp="2026-04-24 14:31:44 +0000 UTC" firstStartedPulling="2026-04-24 14:31:45.572894182 +0000 UTC m=+433.011411034" lastFinishedPulling="2026-04-24 14:31:47.914306609 +0000 UTC m=+435.352823459" observedRunningTime="2026-04-24 14:31:48.46893701 +0000 UTC m=+435.907453871" watchObservedRunningTime="2026-04-24 14:31:48.470661219 +0000 UTC m=+435.909178080" Apr 24 14:31:59.459729 ip-10-0-143-92 
kubenswrapper[2574]: I0424 14:31:59.459693 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f6kg9" Apr 24 14:32:08.455158 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:08.455124 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-t986v" Apr 24 14:32:51.256316 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.256283 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-d9zql"] Apr 24 14:32:51.259746 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.259724 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" Apr 24 14:32:51.262510 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.262477 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-fp6cb\"" Apr 24 14:32:51.262510 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.262480 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 14:32:51.262673 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.262477 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 14:32:51.262673 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.262478 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 14:32:51.271486 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.271463 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-d9zql"] Apr 24 14:32:51.295895 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.295857 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-t7lp5"] Apr 24 14:32:51.298065 
ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.298049 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-t7lp5" Apr 24 14:32:51.300609 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.300589 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 14:32:51.300706 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.300594 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-kcg4b\"" Apr 24 14:32:51.309936 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.309911 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-t7lp5"] Apr 24 14:32:51.366548 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.366510 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57cb6aff-815f-4ab5-8936-6b4977f3c8bf-cert\") pod \"kserve-controller-manager-b7dc77d59-d9zql\" (UID: \"57cb6aff-815f-4ab5-8936-6b4977f3c8bf\") " pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" Apr 24 14:32:51.366718 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.366586 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8px6\" (UniqueName: \"kubernetes.io/projected/57cb6aff-815f-4ab5-8936-6b4977f3c8bf-kube-api-access-h8px6\") pod \"kserve-controller-manager-b7dc77d59-d9zql\" (UID: \"57cb6aff-815f-4ab5-8936-6b4977f3c8bf\") " pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" Apr 24 14:32:51.366718 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.366607 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s46m5\" (UniqueName: \"kubernetes.io/projected/a5ec2e7d-4ab3-4b53-95f2-95e438354d70-kube-api-access-s46m5\") pod \"seaweedfs-86cc847c5c-t7lp5\" (UID: 
\"a5ec2e7d-4ab3-4b53-95f2-95e438354d70\") " pod="kserve/seaweedfs-86cc847c5c-t7lp5" Apr 24 14:32:51.366718 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.366627 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a5ec2e7d-4ab3-4b53-95f2-95e438354d70-data\") pod \"seaweedfs-86cc847c5c-t7lp5\" (UID: \"a5ec2e7d-4ab3-4b53-95f2-95e438354d70\") " pod="kserve/seaweedfs-86cc847c5c-t7lp5" Apr 24 14:32:51.467659 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.467630 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8px6\" (UniqueName: \"kubernetes.io/projected/57cb6aff-815f-4ab5-8936-6b4977f3c8bf-kube-api-access-h8px6\") pod \"kserve-controller-manager-b7dc77d59-d9zql\" (UID: \"57cb6aff-815f-4ab5-8936-6b4977f3c8bf\") " pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" Apr 24 14:32:51.467659 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.467663 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s46m5\" (UniqueName: \"kubernetes.io/projected/a5ec2e7d-4ab3-4b53-95f2-95e438354d70-kube-api-access-s46m5\") pod \"seaweedfs-86cc847c5c-t7lp5\" (UID: \"a5ec2e7d-4ab3-4b53-95f2-95e438354d70\") " pod="kserve/seaweedfs-86cc847c5c-t7lp5" Apr 24 14:32:51.467859 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.467685 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a5ec2e7d-4ab3-4b53-95f2-95e438354d70-data\") pod \"seaweedfs-86cc847c5c-t7lp5\" (UID: \"a5ec2e7d-4ab3-4b53-95f2-95e438354d70\") " pod="kserve/seaweedfs-86cc847c5c-t7lp5" Apr 24 14:32:51.467941 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.467917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57cb6aff-815f-4ab5-8936-6b4977f3c8bf-cert\") pod 
\"kserve-controller-manager-b7dc77d59-d9zql\" (UID: \"57cb6aff-815f-4ab5-8936-6b4977f3c8bf\") " pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" Apr 24 14:32:51.468018 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.468004 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a5ec2e7d-4ab3-4b53-95f2-95e438354d70-data\") pod \"seaweedfs-86cc847c5c-t7lp5\" (UID: \"a5ec2e7d-4ab3-4b53-95f2-95e438354d70\") " pod="kserve/seaweedfs-86cc847c5c-t7lp5" Apr 24 14:32:51.470165 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.470141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57cb6aff-815f-4ab5-8936-6b4977f3c8bf-cert\") pod \"kserve-controller-manager-b7dc77d59-d9zql\" (UID: \"57cb6aff-815f-4ab5-8936-6b4977f3c8bf\") " pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" Apr 24 14:32:51.476977 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.476950 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s46m5\" (UniqueName: \"kubernetes.io/projected/a5ec2e7d-4ab3-4b53-95f2-95e438354d70-kube-api-access-s46m5\") pod \"seaweedfs-86cc847c5c-t7lp5\" (UID: \"a5ec2e7d-4ab3-4b53-95f2-95e438354d70\") " pod="kserve/seaweedfs-86cc847c5c-t7lp5" Apr 24 14:32:51.477462 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.477441 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8px6\" (UniqueName: \"kubernetes.io/projected/57cb6aff-815f-4ab5-8936-6b4977f3c8bf-kube-api-access-h8px6\") pod \"kserve-controller-manager-b7dc77d59-d9zql\" (UID: \"57cb6aff-815f-4ab5-8936-6b4977f3c8bf\") " pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" Apr 24 14:32:51.569473 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.569438 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" Apr 24 14:32:51.607281 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.607243 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-t7lp5" Apr 24 14:32:51.703798 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.703632 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-d9zql"] Apr 24 14:32:51.705954 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:32:51.705922 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57cb6aff_815f_4ab5_8936_6b4977f3c8bf.slice/crio-0d40490859b9ee0ca4a51a20a9515cccf1b5dcf0033c8fb61319edfead335e6f WatchSource:0}: Error finding container 0d40490859b9ee0ca4a51a20a9515cccf1b5dcf0033c8fb61319edfead335e6f: Status 404 returned error can't find the container with id 0d40490859b9ee0ca4a51a20a9515cccf1b5dcf0033c8fb61319edfead335e6f Apr 24 14:32:51.740413 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:51.740385 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-t7lp5"] Apr 24 14:32:51.742805 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:32:51.742776 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5ec2e7d_4ab3_4b53_95f2_95e438354d70.slice/crio-7b96ce62e175ef11cc2158cc689704ba3b79a25270b2c3d2af70503cd63c35c4 WatchSource:0}: Error finding container 7b96ce62e175ef11cc2158cc689704ba3b79a25270b2c3d2af70503cd63c35c4: Status 404 returned error can't find the container with id 7b96ce62e175ef11cc2158cc689704ba3b79a25270b2c3d2af70503cd63c35c4 Apr 24 14:32:52.632009 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:52.631970 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-t7lp5" 
event={"ID":"a5ec2e7d-4ab3-4b53-95f2-95e438354d70","Type":"ContainerStarted","Data":"7b96ce62e175ef11cc2158cc689704ba3b79a25270b2c3d2af70503cd63c35c4"} Apr 24 14:32:52.632914 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:52.632845 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" event={"ID":"57cb6aff-815f-4ab5-8936-6b4977f3c8bf","Type":"ContainerStarted","Data":"0d40490859b9ee0ca4a51a20a9515cccf1b5dcf0033c8fb61319edfead335e6f"} Apr 24 14:32:55.642462 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:55.642430 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-t7lp5" event={"ID":"a5ec2e7d-4ab3-4b53-95f2-95e438354d70","Type":"ContainerStarted","Data":"156cf7d49cc6691dcbae4325c0bb9cf680b6d79dd79f2efdb8baefcd03f16092"} Apr 24 14:32:55.642932 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:55.642519 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-t7lp5" Apr 24 14:32:55.643698 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:55.643677 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" event={"ID":"57cb6aff-815f-4ab5-8936-6b4977f3c8bf","Type":"ContainerStarted","Data":"2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627"} Apr 24 14:32:55.643808 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:55.643789 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" Apr 24 14:32:55.658959 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:55.658911 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-t7lp5" podStartSLOduration=1.121458275 podStartE2EDuration="4.658876247s" podCreationTimestamp="2026-04-24 14:32:51 +0000 UTC" firstStartedPulling="2026-04-24 14:32:51.744069073 +0000 UTC m=+499.182585911" 
lastFinishedPulling="2026-04-24 14:32:55.281487041 +0000 UTC m=+502.720003883" observedRunningTime="2026-04-24 14:32:55.658084169 +0000 UTC m=+503.096601027" watchObservedRunningTime="2026-04-24 14:32:55.658876247 +0000 UTC m=+503.097393109" Apr 24 14:32:55.675668 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:32:55.675609 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" podStartSLOduration=1.171545682 podStartE2EDuration="4.675594713s" podCreationTimestamp="2026-04-24 14:32:51 +0000 UTC" firstStartedPulling="2026-04-24 14:32:51.7074087 +0000 UTC m=+499.145925538" lastFinishedPulling="2026-04-24 14:32:55.211457731 +0000 UTC m=+502.649974569" observedRunningTime="2026-04-24 14:32:55.674823106 +0000 UTC m=+503.113339969" watchObservedRunningTime="2026-04-24 14:32:55.675594713 +0000 UTC m=+503.114111574" Apr 24 14:33:01.649069 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:01.649034 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-t7lp5" Apr 24 14:33:26.351108 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.351024 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-d9zql"] Apr 24 14:33:26.351604 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.351343 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" podUID="57cb6aff-815f-4ab5-8936-6b4977f3c8bf" containerName="manager" containerID="cri-o://2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627" gracePeriod=10 Apr 24 14:33:26.356088 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.356063 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" Apr 24 14:33:26.373657 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.373634 2574 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-qp9rk"] Apr 24 14:33:26.375687 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.375595 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk" Apr 24 14:33:26.386634 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.386611 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-qp9rk"] Apr 24 14:33:26.452186 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.452157 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmbmd\" (UniqueName: \"kubernetes.io/projected/4c49537d-7a5d-4934-9cc6-7f1853e0a401-kube-api-access-pmbmd\") pod \"kserve-controller-manager-b7dc77d59-qp9rk\" (UID: \"4c49537d-7a5d-4934-9cc6-7f1853e0a401\") " pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk" Apr 24 14:33:26.452321 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.452231 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c49537d-7a5d-4934-9cc6-7f1853e0a401-cert\") pod \"kserve-controller-manager-b7dc77d59-qp9rk\" (UID: \"4c49537d-7a5d-4934-9cc6-7f1853e0a401\") " pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk" Apr 24 14:33:26.553617 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.553575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmbmd\" (UniqueName: \"kubernetes.io/projected/4c49537d-7a5d-4934-9cc6-7f1853e0a401-kube-api-access-pmbmd\") pod \"kserve-controller-manager-b7dc77d59-qp9rk\" (UID: \"4c49537d-7a5d-4934-9cc6-7f1853e0a401\") " pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk" Apr 24 14:33:26.553779 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.553679 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/4c49537d-7a5d-4934-9cc6-7f1853e0a401-cert\") pod \"kserve-controller-manager-b7dc77d59-qp9rk\" (UID: \"4c49537d-7a5d-4934-9cc6-7f1853e0a401\") " pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk" Apr 24 14:33:26.556192 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.556163 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c49537d-7a5d-4934-9cc6-7f1853e0a401-cert\") pod \"kserve-controller-manager-b7dc77d59-qp9rk\" (UID: \"4c49537d-7a5d-4934-9cc6-7f1853e0a401\") " pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk" Apr 24 14:33:26.562933 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.562876 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmbmd\" (UniqueName: \"kubernetes.io/projected/4c49537d-7a5d-4934-9cc6-7f1853e0a401-kube-api-access-pmbmd\") pod \"kserve-controller-manager-b7dc77d59-qp9rk\" (UID: \"4c49537d-7a5d-4934-9cc6-7f1853e0a401\") " pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk" Apr 24 14:33:26.585706 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.585683 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" Apr 24 14:33:26.654313 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.654232 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57cb6aff-815f-4ab5-8936-6b4977f3c8bf-cert\") pod \"57cb6aff-815f-4ab5-8936-6b4977f3c8bf\" (UID: \"57cb6aff-815f-4ab5-8936-6b4977f3c8bf\") " Apr 24 14:33:26.654483 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.654331 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8px6\" (UniqueName: \"kubernetes.io/projected/57cb6aff-815f-4ab5-8936-6b4977f3c8bf-kube-api-access-h8px6\") pod \"57cb6aff-815f-4ab5-8936-6b4977f3c8bf\" (UID: \"57cb6aff-815f-4ab5-8936-6b4977f3c8bf\") " Apr 24 14:33:26.656416 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.656388 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57cb6aff-815f-4ab5-8936-6b4977f3c8bf-kube-api-access-h8px6" (OuterVolumeSpecName: "kube-api-access-h8px6") pod "57cb6aff-815f-4ab5-8936-6b4977f3c8bf" (UID: "57cb6aff-815f-4ab5-8936-6b4977f3c8bf"). InnerVolumeSpecName "kube-api-access-h8px6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:33:26.656517 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.656430 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57cb6aff-815f-4ab5-8936-6b4977f3c8bf-cert" (OuterVolumeSpecName: "cert") pod "57cb6aff-815f-4ab5-8936-6b4977f3c8bf" (UID: "57cb6aff-815f-4ab5-8936-6b4977f3c8bf"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:33:26.738433 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.738402 2574 generic.go:358] "Generic (PLEG): container finished" podID="57cb6aff-815f-4ab5-8936-6b4977f3c8bf" containerID="2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627" exitCode=0 Apr 24 14:33:26.738628 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.738451 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" event={"ID":"57cb6aff-815f-4ab5-8936-6b4977f3c8bf","Type":"ContainerDied","Data":"2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627"} Apr 24 14:33:26.738628 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.738474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" event={"ID":"57cb6aff-815f-4ab5-8936-6b4977f3c8bf","Type":"ContainerDied","Data":"0d40490859b9ee0ca4a51a20a9515cccf1b5dcf0033c8fb61319edfead335e6f"} Apr 24 14:33:26.738628 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.738473 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-d9zql" Apr 24 14:33:26.738628 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.738552 2574 scope.go:117] "RemoveContainer" containerID="2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627" Apr 24 14:33:26.739216 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.739195 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk" Apr 24 14:33:26.746912 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.746867 2574 scope.go:117] "RemoveContainer" containerID="2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627" Apr 24 14:33:26.747206 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:33:26.747148 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627\": container with ID starting with 2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627 not found: ID does not exist" containerID="2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627" Apr 24 14:33:26.747206 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.747176 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627"} err="failed to get container status \"2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627\": rpc error: code = NotFound desc = could not find container \"2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627\": container with ID starting with 2ea5275eb2a9064be9b5a9d9eb0ae23b57372843767411feb60b8cd5c5792627 not found: ID does not exist" Apr 24 14:33:26.755408 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.755386 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h8px6\" (UniqueName: \"kubernetes.io/projected/57cb6aff-815f-4ab5-8936-6b4977f3c8bf-kube-api-access-h8px6\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:33:26.755488 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.755413 2574 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57cb6aff-815f-4ab5-8936-6b4977f3c8bf-cert\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 
14:33:26.761299 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.761273 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-d9zql"] Apr 24 14:33:26.765997 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.765978 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-d9zql"] Apr 24 14:33:26.864350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:26.864324 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-qp9rk"] Apr 24 14:33:26.866829 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:33:26.866801 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c49537d_7a5d_4934_9cc6_7f1853e0a401.slice/crio-d49f08d1150edb2b32001913da0030995b439f85284279de5522bb6ea93331af WatchSource:0}: Error finding container d49f08d1150edb2b32001913da0030995b439f85284279de5522bb6ea93331af: Status 404 returned error can't find the container with id d49f08d1150edb2b32001913da0030995b439f85284279de5522bb6ea93331af Apr 24 14:33:27.170225 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:27.170191 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57cb6aff-815f-4ab5-8936-6b4977f3c8bf" path="/var/lib/kubelet/pods/57cb6aff-815f-4ab5-8936-6b4977f3c8bf/volumes" Apr 24 14:33:27.743303 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:27.743261 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk" event={"ID":"4c49537d-7a5d-4934-9cc6-7f1853e0a401","Type":"ContainerStarted","Data":"2ec9a4475960c6074c83d36acbd47902dedc5889724866c633c041d19b8e356f"} Apr 24 14:33:27.743756 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:27.743388 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk" 
event={"ID":"4c49537d-7a5d-4934-9cc6-7f1853e0a401","Type":"ContainerStarted","Data":"d49f08d1150edb2b32001913da0030995b439f85284279de5522bb6ea93331af"} Apr 24 14:33:27.743756 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:27.743515 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk" Apr 24 14:33:27.759512 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:27.759458 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk" podStartSLOduration=1.426164859 podStartE2EDuration="1.75944002s" podCreationTimestamp="2026-04-24 14:33:26 +0000 UTC" firstStartedPulling="2026-04-24 14:33:26.86804765 +0000 UTC m=+534.306564491" lastFinishedPulling="2026-04-24 14:33:27.201322799 +0000 UTC m=+534.639839652" observedRunningTime="2026-04-24 14:33:27.758469526 +0000 UTC m=+535.196986385" watchObservedRunningTime="2026-04-24 14:33:27.75944002 +0000 UTC m=+535.197956880" Apr 24 14:33:48.793310 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.793267 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6dbd598d65-vqcxc"] Apr 24 14:33:48.793725 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.793565 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57cb6aff-815f-4ab5-8936-6b4977f3c8bf" containerName="manager" Apr 24 14:33:48.793725 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.793577 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="57cb6aff-815f-4ab5-8936-6b4977f3c8bf" containerName="manager" Apr 24 14:33:48.793725 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.793631 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="57cb6aff-815f-4ab5-8936-6b4977f3c8bf" containerName="manager" Apr 24 14:33:48.796451 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.796434 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.798855 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.798832 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 14:33:48.798972 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.798874 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 14:33:48.799432 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.799419 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gw62q\"" Apr 24 14:33:48.799740 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.799723 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 14:33:48.799815 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.799720 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 14:33:48.799815 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.799766 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 14:33:48.800234 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.800219 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 14:33:48.800280 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.800239 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 14:33:48.809129 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.809090 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 14:33:48.812352 ip-10-0-143-92 kubenswrapper[2574]: 
I0424 14:33:48.812328 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dbd598d65-vqcxc"] Apr 24 14:33:48.826220 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.826191 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4747dcd2-3b22-48a3-8276-3b1b093a007e-oauth-serving-cert\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.826383 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.826243 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4747dcd2-3b22-48a3-8276-3b1b093a007e-trusted-ca-bundle\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.826383 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.826273 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbhzn\" (UniqueName: \"kubernetes.io/projected/4747dcd2-3b22-48a3-8276-3b1b093a007e-kube-api-access-nbhzn\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.826383 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.826329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4747dcd2-3b22-48a3-8276-3b1b093a007e-console-serving-cert\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.826383 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.826345 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4747dcd2-3b22-48a3-8276-3b1b093a007e-console-oauth-config\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.826383 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.826374 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4747dcd2-3b22-48a3-8276-3b1b093a007e-service-ca\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.826620 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.826407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4747dcd2-3b22-48a3-8276-3b1b093a007e-console-config\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.927363 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.927324 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4747dcd2-3b22-48a3-8276-3b1b093a007e-service-ca\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.927521 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.927375 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4747dcd2-3b22-48a3-8276-3b1b093a007e-console-config\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.927521 
ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.927404 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4747dcd2-3b22-48a3-8276-3b1b093a007e-oauth-serving-cert\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.927521 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.927435 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4747dcd2-3b22-48a3-8276-3b1b093a007e-trusted-ca-bundle\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.927521 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.927451 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbhzn\" (UniqueName: \"kubernetes.io/projected/4747dcd2-3b22-48a3-8276-3b1b093a007e-kube-api-access-nbhzn\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.927521 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.927473 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4747dcd2-3b22-48a3-8276-3b1b093a007e-console-serving-cert\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.927784 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.927655 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4747dcd2-3b22-48a3-8276-3b1b093a007e-console-oauth-config\") pod \"console-6dbd598d65-vqcxc\" (UID: 
\"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.928217 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.928181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4747dcd2-3b22-48a3-8276-3b1b093a007e-oauth-serving-cert\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.928325 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.928181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4747dcd2-3b22-48a3-8276-3b1b093a007e-service-ca\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.928325 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.928239 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4747dcd2-3b22-48a3-8276-3b1b093a007e-console-config\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.928406 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.928361 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4747dcd2-3b22-48a3-8276-3b1b093a007e-trusted-ca-bundle\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.930474 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.930450 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4747dcd2-3b22-48a3-8276-3b1b093a007e-console-serving-cert\") pod 
\"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.930474 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.930465 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4747dcd2-3b22-48a3-8276-3b1b093a007e-console-oauth-config\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:48.937002 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:48.936974 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbhzn\" (UniqueName: \"kubernetes.io/projected/4747dcd2-3b22-48a3-8276-3b1b093a007e-kube-api-access-nbhzn\") pod \"console-6dbd598d65-vqcxc\" (UID: \"4747dcd2-3b22-48a3-8276-3b1b093a007e\") " pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:49.112851 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:49.112817 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dbd598d65-vqcxc" Apr 24 14:33:49.235012 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:49.234974 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dbd598d65-vqcxc"] Apr 24 14:33:49.238992 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:33:49.238951 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4747dcd2_3b22_48a3_8276_3b1b093a007e.slice/crio-1abed6cfd42c7fc99fbbb228609e81cdd1b9516fb60d1c2fe686281469d967d4 WatchSource:0}: Error finding container 1abed6cfd42c7fc99fbbb228609e81cdd1b9516fb60d1c2fe686281469d967d4: Status 404 returned error can't find the container with id 1abed6cfd42c7fc99fbbb228609e81cdd1b9516fb60d1c2fe686281469d967d4 Apr 24 14:33:49.811290 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:49.811254 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dbd598d65-vqcxc" event={"ID":"4747dcd2-3b22-48a3-8276-3b1b093a007e","Type":"ContainerStarted","Data":"a5b1bc7fd9b0cc96bf57ce2c44032f50e962bff2b8f29152ee545a146c05240e"} Apr 24 14:33:49.811290 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:49.811291 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dbd598d65-vqcxc" event={"ID":"4747dcd2-3b22-48a3-8276-3b1b093a007e","Type":"ContainerStarted","Data":"1abed6cfd42c7fc99fbbb228609e81cdd1b9516fb60d1c2fe686281469d967d4"} Apr 24 14:33:49.834661 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:49.834607 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6dbd598d65-vqcxc" podStartSLOduration=1.8345912389999999 podStartE2EDuration="1.834591239s" podCreationTimestamp="2026-04-24 14:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:33:49.833191731 +0000 UTC 
m=+557.271708593" watchObservedRunningTime="2026-04-24 14:33:49.834591239 +0000 UTC m=+557.273108098"
Apr 24 14:33:58.752601 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:58.752564 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b7dc77d59-qp9rk"
Apr 24 14:33:59.113140 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:59.113103 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6dbd598d65-vqcxc"
Apr 24 14:33:59.113344 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:59.113262 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6dbd598d65-vqcxc"
Apr 24 14:33:59.118183 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:59.118157 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6dbd598d65-vqcxc"
Apr 24 14:33:59.846900 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:33:59.846850 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6dbd598d65-vqcxc"
Apr 24 14:34:38.476224 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:38.476181 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"]
Apr 24 14:34:38.479646 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:38.479624 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:34:38.481668 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:38.481646 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qcgw2\""
Apr 24 14:34:38.488472 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:38.488443 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"]
Apr 24 14:34:38.639842 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:38.639804 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8573857b-afac-4a29-90a2-a368527c028b-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7\" (UID: \"8573857b-afac-4a29-90a2-a368527c028b\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:34:38.741277 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:38.741181 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8573857b-afac-4a29-90a2-a368527c028b-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7\" (UID: \"8573857b-afac-4a29-90a2-a368527c028b\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:34:38.741514 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:38.741497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8573857b-afac-4a29-90a2-a368527c028b-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7\" (UID: \"8573857b-afac-4a29-90a2-a368527c028b\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:34:38.790871 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:38.790830 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:34:38.909415 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:38.909252 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"]
Apr 24 14:34:38.912286 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:34:38.912256 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8573857b_afac_4a29_90a2_a368527c028b.slice/crio-efa14c1fd0170edb9d2a2fb77a4d4913c1bf45461866fdfab5e54d33b394af00 WatchSource:0}: Error finding container efa14c1fd0170edb9d2a2fb77a4d4913c1bf45461866fdfab5e54d33b394af00: Status 404 returned error can't find the container with id efa14c1fd0170edb9d2a2fb77a4d4913c1bf45461866fdfab5e54d33b394af00
Apr 24 14:34:38.954519 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:38.954478 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" event={"ID":"8573857b-afac-4a29-90a2-a368527c028b","Type":"ContainerStarted","Data":"efa14c1fd0170edb9d2a2fb77a4d4913c1bf45461866fdfab5e54d33b394af00"}
Apr 24 14:34:42.966626 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:42.966583 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" event={"ID":"8573857b-afac-4a29-90a2-a368527c028b","Type":"ContainerStarted","Data":"adfe41e6f74c1b98c14913c985c6655e3a7c225a92ce8fd02251ad97e4ace282"}
Apr 24 14:34:46.979414 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:46.979379 2574 generic.go:358] "Generic (PLEG): container finished" podID="8573857b-afac-4a29-90a2-a368527c028b" containerID="adfe41e6f74c1b98c14913c985c6655e3a7c225a92ce8fd02251ad97e4ace282" exitCode=0
Apr 24 14:34:46.979821 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:34:46.979441 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" event={"ID":"8573857b-afac-4a29-90a2-a368527c028b","Type":"ContainerDied","Data":"adfe41e6f74c1b98c14913c985c6655e3a7c225a92ce8fd02251ad97e4ace282"}
Apr 24 14:35:01.029050 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:01.029009 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" event={"ID":"8573857b-afac-4a29-90a2-a368527c028b","Type":"ContainerStarted","Data":"d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d"}
Apr 24 14:35:03.036686 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:03.036649 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" event={"ID":"8573857b-afac-4a29-90a2-a368527c028b","Type":"ContainerStarted","Data":"292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c"}
Apr 24 14:35:03.037131 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:03.037098 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:35:03.038573 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:03.038536 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 24 14:35:03.060023 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:03.059962 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podStartSLOduration=1.07984781 podStartE2EDuration="25.059944945s" podCreationTimestamp="2026-04-24 14:34:38 +0000 UTC" firstStartedPulling="2026-04-24 14:34:38.914255968 +0000 UTC m=+606.352772806" lastFinishedPulling="2026-04-24 14:35:02.894353104 +0000 UTC m=+630.332869941" observedRunningTime="2026-04-24 14:35:03.057866268 +0000 UTC m=+630.496383140" watchObservedRunningTime="2026-04-24 14:35:03.059944945 +0000 UTC m=+630.498461810"
Apr 24 14:35:04.039585 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:04.039540 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:35:04.040047 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:04.039649 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 24 14:35:04.040573 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:04.040552 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:35:05.042648 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:05.042606 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 24 14:35:05.043110 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:05.042930 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:35:15.043629 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:15.043576 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 24 14:35:15.044118 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:15.044078 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:35:25.042632 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:25.042582 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 24 14:35:25.043114 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:25.043079 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:35:35.043449 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:35.043394 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 24 14:35:35.043943 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:35.043917 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:35:45.043552 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:45.043496 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 24 14:35:45.044014 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:45.043927 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:35:55.043333 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:55.043284 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 24 14:35:55.043859 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:35:55.043833 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:05.042821 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:05.042767 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 24 14:36:05.043347 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:05.043252 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:08.166865 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:08.166830 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:36:08.167268 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:08.167010 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:36:23.655049 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:23.655009 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"]
Apr 24 14:36:23.655480 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:23.655318 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" containerID="cri-o://d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d" gracePeriod=30
Apr 24 14:36:23.655480 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:23.655401 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" containerID="cri-o://292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c" gracePeriod=30
Apr 24 14:36:23.816028 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:23.815997 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"]
Apr 24 14:36:23.819212 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:23.819195 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"
Apr 24 14:36:23.830330 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:23.830302 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"]
Apr 24 14:36:23.850976 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:23.850947 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px"]
Apr 24 14:36:23.854139 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:23.854120 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px"
Apr 24 14:36:23.863075 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:23.863052 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px"]
Apr 24 14:36:23.900205 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:23.900170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0829893-cbda-4b55-88b3-077ae1ea9f09-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz\" (UID: \"b0829893-cbda-4b55-88b3-077ae1ea9f09\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"
Apr 24 14:36:24.000685 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:24.000595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0829893-cbda-4b55-88b3-077ae1ea9f09-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz\" (UID: \"b0829893-cbda-4b55-88b3-077ae1ea9f09\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"
Apr 24 14:36:24.000685 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:24.000664 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0388cf9-12c9-4ee2-ac85-53ef90abe77c-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px\" (UID: \"c0388cf9-12c9-4ee2-ac85-53ef90abe77c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px"
Apr 24 14:36:24.000997 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:24.000977 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0829893-cbda-4b55-88b3-077ae1ea9f09-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz\" (UID: \"b0829893-cbda-4b55-88b3-077ae1ea9f09\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"
Apr 24 14:36:24.102032 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:24.101993 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0388cf9-12c9-4ee2-ac85-53ef90abe77c-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px\" (UID: \"c0388cf9-12c9-4ee2-ac85-53ef90abe77c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px"
Apr 24 14:36:24.102379 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:24.102346 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0388cf9-12c9-4ee2-ac85-53ef90abe77c-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px\" (UID: \"c0388cf9-12c9-4ee2-ac85-53ef90abe77c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px"
Apr 24 14:36:24.129965 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:24.129917 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"
Apr 24 14:36:24.165023 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:24.164988 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px"
Apr 24 14:36:24.259742 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:24.259709 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"]
Apr 24 14:36:24.260281 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:36:24.260247 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0829893_cbda_4b55_88b3_077ae1ea9f09.slice/crio-639cfa9e75b0eeefc4f45b64986dc1a5c5399794470bacd70d4580597d732505 WatchSource:0}: Error finding container 639cfa9e75b0eeefc4f45b64986dc1a5c5399794470bacd70d4580597d732505: Status 404 returned error can't find the container with id 639cfa9e75b0eeefc4f45b64986dc1a5c5399794470bacd70d4580597d732505
Apr 24 14:36:24.262362 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:24.262339 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:36:24.270043 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:24.269608 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" event={"ID":"b0829893-cbda-4b55-88b3-077ae1ea9f09","Type":"ContainerStarted","Data":"639cfa9e75b0eeefc4f45b64986dc1a5c5399794470bacd70d4580597d732505"}
Apr 24 14:36:24.298344 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:24.298316 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px"]
Apr 24 14:36:24.301491 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:36:24.301458 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0388cf9_12c9_4ee2_ac85_53ef90abe77c.slice/crio-4ee0702c1ee312c7b32a714a5985c160f2180b270efe3489d7219aa06445e6ca WatchSource:0}: Error finding container 4ee0702c1ee312c7b32a714a5985c160f2180b270efe3489d7219aa06445e6ca: Status 404 returned error can't find the container with id 4ee0702c1ee312c7b32a714a5985c160f2180b270efe3489d7219aa06445e6ca
Apr 24 14:36:25.273665 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:25.273626 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" event={"ID":"b0829893-cbda-4b55-88b3-077ae1ea9f09","Type":"ContainerStarted","Data":"8c2c1be4074007379e4d16adb06e2693afe992cec33f6d77bfabaf5a3f3b5446"}
Apr 24 14:36:25.274929 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:25.274901 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" event={"ID":"c0388cf9-12c9-4ee2-ac85-53ef90abe77c","Type":"ContainerStarted","Data":"66fc354fc922d6bbbfd1e57b8ea3c3a31d1d8e30e71cb262b5478dfa8e88c63a"}
Apr 24 14:36:25.274929 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:25.274933 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" event={"ID":"c0388cf9-12c9-4ee2-ac85-53ef90abe77c","Type":"ContainerStarted","Data":"4ee0702c1ee312c7b32a714a5985c160f2180b270efe3489d7219aa06445e6ca"}
Apr 24 14:36:28.166437 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:28.166389 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 24 14:36:28.166864 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:28.166725 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:28.284392 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:28.284307 2574 generic.go:358] "Generic (PLEG): container finished" podID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerID="8c2c1be4074007379e4d16adb06e2693afe992cec33f6d77bfabaf5a3f3b5446" exitCode=0
Apr 24 14:36:28.284577 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:28.284380 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" event={"ID":"b0829893-cbda-4b55-88b3-077ae1ea9f09","Type":"ContainerDied","Data":"8c2c1be4074007379e4d16adb06e2693afe992cec33f6d77bfabaf5a3f3b5446"}
Apr 24 14:36:28.285845 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:28.285814 2574 generic.go:358] "Generic (PLEG): container finished" podID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerID="66fc354fc922d6bbbfd1e57b8ea3c3a31d1d8e30e71cb262b5478dfa8e88c63a" exitCode=0
Apr 24 14:36:28.285946 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:28.285903 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" event={"ID":"c0388cf9-12c9-4ee2-ac85-53ef90abe77c","Type":"ContainerDied","Data":"66fc354fc922d6bbbfd1e57b8ea3c3a31d1d8e30e71cb262b5478dfa8e88c63a"}
Apr 24 14:36:28.287910 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:28.287863 2574 generic.go:358] "Generic (PLEG): container finished" podID="8573857b-afac-4a29-90a2-a368527c028b" containerID="d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d" exitCode=0
Apr 24 14:36:28.287980 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:28.287909 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" event={"ID":"8573857b-afac-4a29-90a2-a368527c028b","Type":"ContainerDied","Data":"d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d"}
Apr 24 14:36:29.293520 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:29.293477 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" event={"ID":"b0829893-cbda-4b55-88b3-077ae1ea9f09","Type":"ContainerStarted","Data":"d45c18ddf162ad98c93a4807d86f81a95845d77978aa42e72243bd1fdf7f6d4c"}
Apr 24 14:36:29.293994 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:29.293805 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"
Apr 24 14:36:29.295061 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:29.295029 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 24 14:36:29.311516 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:29.311458 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" podStartSLOduration=6.311437926 podStartE2EDuration="6.311437926s" podCreationTimestamp="2026-04-24 14:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:36:29.30995713 +0000 UTC m=+716.748474190" watchObservedRunningTime="2026-04-24 14:36:29.311437926 +0000 UTC m=+716.749954789"
Apr 24 14:36:30.297552 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:30.297507 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 24 14:36:38.167173 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:38.167108 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 24 14:36:38.167566 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:38.167446 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:40.298520 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:40.298468 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 24 14:36:48.166496 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:48.166446 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 24 14:36:48.166995 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:48.166593 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:36:48.166995 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:48.166813 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:48.166995 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:48.166950 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:36:48.361840 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:48.361801 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" event={"ID":"c0388cf9-12c9-4ee2-ac85-53ef90abe77c","Type":"ContainerStarted","Data":"420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96"}
Apr 24 14:36:48.362127 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:48.362108 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px"
Apr 24 14:36:48.363242 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:48.363215 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 24 14:36:48.378492 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:48.378431 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" podStartSLOduration=6.048901544 podStartE2EDuration="25.378414679s" podCreationTimestamp="2026-04-24 14:36:23 +0000 UTC" firstStartedPulling="2026-04-24 14:36:28.287112752 +0000 UTC m=+715.725629593" lastFinishedPulling="2026-04-24 14:36:47.616625876 +0000 UTC m=+735.055142728" observedRunningTime="2026-04-24 14:36:48.377552774 +0000 UTC m=+735.816069635" watchObservedRunningTime="2026-04-24 14:36:48.378414679 +0000 UTC m=+735.816931539"
Apr 24 14:36:49.365571 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:49.365534 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 24 14:36:50.298592 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:50.298546 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 24 14:36:54.314171 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.314145 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:36:54.365239 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.365205 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8573857b-afac-4a29-90a2-a368527c028b-kserve-provision-location\") pod \"8573857b-afac-4a29-90a2-a368527c028b\" (UID: \"8573857b-afac-4a29-90a2-a368527c028b\") "
Apr 24 14:36:54.365520 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.365498 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8573857b-afac-4a29-90a2-a368527c028b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8573857b-afac-4a29-90a2-a368527c028b" (UID: "8573857b-afac-4a29-90a2-a368527c028b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:36:54.380245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.380212 2574 generic.go:358] "Generic (PLEG): container finished" podID="8573857b-afac-4a29-90a2-a368527c028b" containerID="292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c" exitCode=0
Apr 24 14:36:54.380368 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.380261 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" event={"ID":"8573857b-afac-4a29-90a2-a368527c028b","Type":"ContainerDied","Data":"292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c"}
Apr 24 14:36:54.380368 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.380296 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"
Apr 24 14:36:54.380368 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.380305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7" event={"ID":"8573857b-afac-4a29-90a2-a368527c028b","Type":"ContainerDied","Data":"efa14c1fd0170edb9d2a2fb77a4d4913c1bf45461866fdfab5e54d33b394af00"}
Apr 24 14:36:54.380368 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.380328 2574 scope.go:117] "RemoveContainer" containerID="292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c"
Apr 24 14:36:54.388329 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.388313 2574 scope.go:117] "RemoveContainer" containerID="d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d"
Apr 24 14:36:54.395612 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.395594 2574 scope.go:117] "RemoveContainer" containerID="adfe41e6f74c1b98c14913c985c6655e3a7c225a92ce8fd02251ad97e4ace282"
Apr 24 14:36:54.402709 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.402689 2574 scope.go:117] "RemoveContainer" containerID="292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c"
Apr 24 14:36:54.402983 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:36:54.402952 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c\": container with ID starting with 292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c not found: ID does not exist" containerID="292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c"
Apr 24 14:36:54.403046 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.402981 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c"} err="failed to get container status \"292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c\": rpc error: code = NotFound desc = could not find container \"292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c\": container with ID starting with 292350e8d6888d4ef0a62373600036d6cc876969616adc6f6b57f213f91f059c not found: ID does not exist"
Apr 24 14:36:54.403046 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.403003 2574 scope.go:117] "RemoveContainer" containerID="d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d"
Apr 24 14:36:54.403266 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:36:54.403242 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d\": container with ID starting with d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d not found: ID does not exist" containerID="d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d"
Apr 24 14:36:54.403351 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.403276 2574 pod_container_deletor.go:53]
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d"} err="failed to get container status \"d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d\": rpc error: code = NotFound desc = could not find container \"d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d\": container with ID starting with d843818ff43eb6f5bd1b0d1dc7ecd4218899ac54ca3fb996a891d5f1a2ea902d not found: ID does not exist" Apr 24 14:36:54.403351 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.403300 2574 scope.go:117] "RemoveContainer" containerID="adfe41e6f74c1b98c14913c985c6655e3a7c225a92ce8fd02251ad97e4ace282" Apr 24 14:36:54.403555 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:36:54.403538 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adfe41e6f74c1b98c14913c985c6655e3a7c225a92ce8fd02251ad97e4ace282\": container with ID starting with adfe41e6f74c1b98c14913c985c6655e3a7c225a92ce8fd02251ad97e4ace282 not found: ID does not exist" containerID="adfe41e6f74c1b98c14913c985c6655e3a7c225a92ce8fd02251ad97e4ace282" Apr 24 14:36:54.403620 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.403560 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adfe41e6f74c1b98c14913c985c6655e3a7c225a92ce8fd02251ad97e4ace282"} err="failed to get container status \"adfe41e6f74c1b98c14913c985c6655e3a7c225a92ce8fd02251ad97e4ace282\": rpc error: code = NotFound desc = could not find container \"adfe41e6f74c1b98c14913c985c6655e3a7c225a92ce8fd02251ad97e4ace282\": container with ID starting with adfe41e6f74c1b98c14913c985c6655e3a7c225a92ce8fd02251ad97e4ace282 not found: ID does not exist" Apr 24 14:36:54.404039 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.404020 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"] Apr 24 14:36:54.409506 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.409486 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-26556-predictor-54fd76b6d5-c2xf7"] Apr 24 14:36:54.465957 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:54.465923 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8573857b-afac-4a29-90a2-a368527c028b-kserve-provision-location\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:36:55.170355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:55.170324 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8573857b-afac-4a29-90a2-a368527c028b" path="/var/lib/kubelet/pods/8573857b-afac-4a29-90a2-a368527c028b/volumes" Apr 24 14:36:59.366298 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:36:59.366248 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 14:37:00.297622 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:37:00.297572 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 14:37:09.365622 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:37:09.365570 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 14:37:10.297797 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:37:10.297743 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 14:37:19.366124 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:37:19.366081 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 14:37:20.297802 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:37:20.297756 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 14:37:29.365641 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:37:29.365593 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 14:37:30.297637 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:37:30.297595 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 
14:37:39.366230 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:37:39.366187 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 14:37:40.298771 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:37:40.298739 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" Apr 24 14:37:49.367090 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:37:49.367053 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" Apr 24 14:38:04.024859 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.024822 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"] Apr 24 14:38:04.025312 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.025134 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="kserve-container" containerID="cri-o://d45c18ddf162ad98c93a4807d86f81a95845d77978aa42e72243bd1fdf7f6d4c" gracePeriod=30 Apr 24 14:38:04.064548 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.064497 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8"] Apr 24 14:38:04.064920 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.064901 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="storage-initializer" Apr 24 14:38:04.064920 ip-10-0-143-92 
kubenswrapper[2574]: I0424 14:38:04.064921 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="storage-initializer" Apr 24 14:38:04.065072 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.064957 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" Apr 24 14:38:04.065072 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.064966 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" Apr 24 14:38:04.065072 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.064987 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" Apr 24 14:38:04.065072 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.064996 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" Apr 24 14:38:04.065257 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.065079 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="agent" Apr 24 14:38:04.065257 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.065093 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8573857b-afac-4a29-90a2-a368527c028b" containerName="kserve-container" Apr 24 14:38:04.068189 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.068168 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" Apr 24 14:38:04.074955 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.074931 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8"] Apr 24 14:38:04.108245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.108208 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8\" (UID: \"e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" Apr 24 14:38:04.121675 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.121636 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t"] Apr 24 14:38:04.124784 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.124765 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" Apr 24 14:38:04.135959 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.135936 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t"] Apr 24 14:38:04.209215 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.209178 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5ada30a-c8f3-421f-b2a7-e901ed97b1d7-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t\" (UID: \"a5ada30a-c8f3-421f-b2a7-e901ed97b1d7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" Apr 24 14:38:04.209378 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.209291 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8\" (UID: \"e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" Apr 24 14:38:04.209642 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.209620 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8\" (UID: \"e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" Apr 24 14:38:04.235915 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.235862 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px"] Apr 24 14:38:04.236175 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.236151 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="kserve-container" containerID="cri-o://420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96" gracePeriod=30 Apr 24 14:38:04.310233 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.310202 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5ada30a-c8f3-421f-b2a7-e901ed97b1d7-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t\" (UID: \"a5ada30a-c8f3-421f-b2a7-e901ed97b1d7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" Apr 24 14:38:04.310685 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.310659 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5ada30a-c8f3-421f-b2a7-e901ed97b1d7-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t\" (UID: \"a5ada30a-c8f3-421f-b2a7-e901ed97b1d7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" Apr 24 14:38:04.379864 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.379830 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" Apr 24 14:38:04.435021 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.434981 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" Apr 24 14:38:04.511615 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.511559 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8"] Apr 24 14:38:04.515644 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:38:04.515619 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3c95b37_9d70_4ff8_bf3b_9fc158bf3b2c.slice/crio-30d97662db0b6ceedd67a5f39f6b07e1d6ce049e84f0b33b2272008217a19203 WatchSource:0}: Error finding container 30d97662db0b6ceedd67a5f39f6b07e1d6ce049e84f0b33b2272008217a19203: Status 404 returned error can't find the container with id 30d97662db0b6ceedd67a5f39f6b07e1d6ce049e84f0b33b2272008217a19203 Apr 24 14:38:04.572175 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.572150 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t"] Apr 24 14:38:04.575721 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:38:04.575694 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5ada30a_c8f3_421f_b2a7_e901ed97b1d7.slice/crio-745cf8301cb7ff17352bb4b83309267a177efaea193ae6a3cd9bb16d220b9732 WatchSource:0}: Error finding container 745cf8301cb7ff17352bb4b83309267a177efaea193ae6a3cd9bb16d220b9732: Status 404 returned error can't find the container with id 745cf8301cb7ff17352bb4b83309267a177efaea193ae6a3cd9bb16d220b9732 Apr 24 14:38:04.582327 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.582299 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" 
event={"ID":"e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c","Type":"ContainerStarted","Data":"30d97662db0b6ceedd67a5f39f6b07e1d6ce049e84f0b33b2272008217a19203"} Apr 24 14:38:04.583413 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:04.583391 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" event={"ID":"a5ada30a-c8f3-421f-b2a7-e901ed97b1d7","Type":"ContainerStarted","Data":"745cf8301cb7ff17352bb4b83309267a177efaea193ae6a3cd9bb16d220b9732"} Apr 24 14:38:05.587285 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:05.587244 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" event={"ID":"e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c","Type":"ContainerStarted","Data":"1db9dd4c46dbd99559f4e848f8da4a20ab90f59299129e5f76da1e45b212f5a9"} Apr 24 14:38:05.588701 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:05.588672 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" event={"ID":"a5ada30a-c8f3-421f-b2a7-e901ed97b1d7","Type":"ContainerStarted","Data":"2fcb3af98c781e8039fe01b78ca9edadafe76f27262e7855ff5cc19f6e93b0d1"} Apr 24 14:38:08.183192 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.183166 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" Apr 24 14:38:08.239562 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.239532 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0388cf9-12c9-4ee2-ac85-53ef90abe77c-kserve-provision-location\") pod \"c0388cf9-12c9-4ee2-ac85-53ef90abe77c\" (UID: \"c0388cf9-12c9-4ee2-ac85-53ef90abe77c\") " Apr 24 14:38:08.239923 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.239856 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0388cf9-12c9-4ee2-ac85-53ef90abe77c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c0388cf9-12c9-4ee2-ac85-53ef90abe77c" (UID: "c0388cf9-12c9-4ee2-ac85-53ef90abe77c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:38:08.341064 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.341027 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0388cf9-12c9-4ee2-ac85-53ef90abe77c-kserve-provision-location\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:38:08.600364 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.600335 2574 generic.go:358] "Generic (PLEG): container finished" podID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerID="d45c18ddf162ad98c93a4807d86f81a95845d77978aa42e72243bd1fdf7f6d4c" exitCode=0 Apr 24 14:38:08.600485 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.600412 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" event={"ID":"b0829893-cbda-4b55-88b3-077ae1ea9f09","Type":"ContainerDied","Data":"d45c18ddf162ad98c93a4807d86f81a95845d77978aa42e72243bd1fdf7f6d4c"} Apr 24 14:38:08.602001 
ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.601973 2574 generic.go:358] "Generic (PLEG): container finished" podID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerID="420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96" exitCode=0 Apr 24 14:38:08.602102 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.602053 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" Apr 24 14:38:08.602153 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.602050 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" event={"ID":"c0388cf9-12c9-4ee2-ac85-53ef90abe77c","Type":"ContainerDied","Data":"420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96"} Apr 24 14:38:08.602188 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.602171 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px" event={"ID":"c0388cf9-12c9-4ee2-ac85-53ef90abe77c","Type":"ContainerDied","Data":"4ee0702c1ee312c7b32a714a5985c160f2180b270efe3489d7219aa06445e6ca"} Apr 24 14:38:08.602230 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.602196 2574 scope.go:117] "RemoveContainer" containerID="420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96" Apr 24 14:38:08.603648 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.603628 2574 generic.go:358] "Generic (PLEG): container finished" podID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerID="1db9dd4c46dbd99559f4e848f8da4a20ab90f59299129e5f76da1e45b212f5a9" exitCode=0 Apr 24 14:38:08.603754 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.603690 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" 
event={"ID":"e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c","Type":"ContainerDied","Data":"1db9dd4c46dbd99559f4e848f8da4a20ab90f59299129e5f76da1e45b212f5a9"} Apr 24 14:38:08.605360 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.605330 2574 generic.go:358] "Generic (PLEG): container finished" podID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerID="2fcb3af98c781e8039fe01b78ca9edadafe76f27262e7855ff5cc19f6e93b0d1" exitCode=0 Apr 24 14:38:08.605472 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.605364 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" event={"ID":"a5ada30a-c8f3-421f-b2a7-e901ed97b1d7","Type":"ContainerDied","Data":"2fcb3af98c781e8039fe01b78ca9edadafe76f27262e7855ff5cc19f6e93b0d1"} Apr 24 14:38:08.616166 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.616144 2574 scope.go:117] "RemoveContainer" containerID="66fc354fc922d6bbbfd1e57b8ea3c3a31d1d8e30e71cb262b5478dfa8e88c63a" Apr 24 14:38:08.628999 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.628978 2574 scope.go:117] "RemoveContainer" containerID="420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96" Apr 24 14:38:08.629492 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:38:08.629463 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96\": container with ID starting with 420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96 not found: ID does not exist" containerID="420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96" Apr 24 14:38:08.629640 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.629505 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96"} err="failed to get container status 
\"420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96\": rpc error: code = NotFound desc = could not find container \"420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96\": container with ID starting with 420eae57e2ca526352c702cf3821f9417a5b1275e68e8831860b60e657c53c96 not found: ID does not exist" Apr 24 14:38:08.629640 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.629532 2574 scope.go:117] "RemoveContainer" containerID="66fc354fc922d6bbbfd1e57b8ea3c3a31d1d8e30e71cb262b5478dfa8e88c63a" Apr 24 14:38:08.629834 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:38:08.629809 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66fc354fc922d6bbbfd1e57b8ea3c3a31d1d8e30e71cb262b5478dfa8e88c63a\": container with ID starting with 66fc354fc922d6bbbfd1e57b8ea3c3a31d1d8e30e71cb262b5478dfa8e88c63a not found: ID does not exist" containerID="66fc354fc922d6bbbfd1e57b8ea3c3a31d1d8e30e71cb262b5478dfa8e88c63a" Apr 24 14:38:08.629977 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.629841 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66fc354fc922d6bbbfd1e57b8ea3c3a31d1d8e30e71cb262b5478dfa8e88c63a"} err="failed to get container status \"66fc354fc922d6bbbfd1e57b8ea3c3a31d1d8e30e71cb262b5478dfa8e88c63a\": rpc error: code = NotFound desc = could not find container \"66fc354fc922d6bbbfd1e57b8ea3c3a31d1d8e30e71cb262b5478dfa8e88c63a\": container with ID starting with 66fc354fc922d6bbbfd1e57b8ea3c3a31d1d8e30e71cb262b5478dfa8e88c63a not found: ID does not exist" Apr 24 14:38:08.651292 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.651262 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px"] Apr 24 14:38:08.655398 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.655372 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-a4202-predictor-548565d6f5-5m8px"] Apr 24 14:38:08.671771 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.671674 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" Apr 24 14:38:08.745106 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.745069 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0829893-cbda-4b55-88b3-077ae1ea9f09-kserve-provision-location\") pod \"b0829893-cbda-4b55-88b3-077ae1ea9f09\" (UID: \"b0829893-cbda-4b55-88b3-077ae1ea9f09\") " Apr 24 14:38:08.745399 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.745373 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0829893-cbda-4b55-88b3-077ae1ea9f09-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b0829893-cbda-4b55-88b3-077ae1ea9f09" (UID: "b0829893-cbda-4b55-88b3-077ae1ea9f09"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:38:08.845909 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:08.845798 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0829893-cbda-4b55-88b3-077ae1ea9f09-kserve-provision-location\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\""
Apr 24 14:38:09.170291 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.170209 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" path="/var/lib/kubelet/pods/c0388cf9-12c9-4ee2-ac85-53ef90abe77c/volumes"
Apr 24 14:38:09.615189 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.615150 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" event={"ID":"e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c","Type":"ContainerStarted","Data":"3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf"}
Apr 24 14:38:09.615674 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.615489 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8"
Apr 24 14:38:09.616935 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.616906 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" event={"ID":"a5ada30a-c8f3-421f-b2a7-e901ed97b1d7","Type":"ContainerStarted","Data":"ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e"}
Apr 24 14:38:09.617049 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.616988 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 24 14:38:09.617237 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.617215 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t"
Apr 24 14:38:09.618295 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.618258 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 14:38:09.618413 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.618377 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz" event={"ID":"b0829893-cbda-4b55-88b3-077ae1ea9f09","Type":"ContainerDied","Data":"639cfa9e75b0eeefc4f45b64986dc1a5c5399794470bacd70d4580597d732505"}
Apr 24 14:38:09.618480 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.618417 2574 scope.go:117] "RemoveContainer" containerID="d45c18ddf162ad98c93a4807d86f81a95845d77978aa42e72243bd1fdf7f6d4c"
Apr 24 14:38:09.618480 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.618431 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"
Apr 24 14:38:09.626181 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.626154 2574 scope.go:117] "RemoveContainer" containerID="8c2c1be4074007379e4d16adb06e2693afe992cec33f6d77bfabaf5a3f3b5446"
Apr 24 14:38:09.632765 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.632716 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" podStartSLOduration=5.632701158 podStartE2EDuration="5.632701158s" podCreationTimestamp="2026-04-24 14:38:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:38:09.630659553 +0000 UTC m=+817.069176415" watchObservedRunningTime="2026-04-24 14:38:09.632701158 +0000 UTC m=+817.071218018"
Apr 24 14:38:09.645897 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.645832 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" podStartSLOduration=5.645821031 podStartE2EDuration="5.645821031s" podCreationTimestamp="2026-04-24 14:38:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:38:09.644349992 +0000 UTC m=+817.082866852" watchObservedRunningTime="2026-04-24 14:38:09.645821031 +0000 UTC m=+817.084337890"
Apr 24 14:38:09.657022 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.657000 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"]
Apr 24 14:38:09.660705 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:09.660686 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-a4202-predictor-dfbfb6fcf-9fqpz"]
Apr 24 14:38:10.622612 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:10.622570 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 24 14:38:10.623045 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:10.622570 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 14:38:11.170633 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:11.170593 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" path="/var/lib/kubelet/pods/b0829893-cbda-4b55-88b3-077ae1ea9f09/volumes"
Apr 24 14:38:20.623336 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:20.623278 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 24 14:38:20.623719 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:20.623278 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 14:38:30.623511 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:30.623462 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 24 14:38:30.623914 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:30.623462 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 14:38:40.623441 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:40.623396 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 24 14:38:40.623948 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:40.623402 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 14:38:50.622927 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:50.622849 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 14:38:50.622927 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:38:50.622849 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 24 14:39:00.623542 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:00.623495 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 14:39:00.623999 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:00.623497 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 24 14:39:10.623503 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:10.623404 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 24 14:39:10.624026 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:10.624003 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t"
Apr 24 14:39:20.623716 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:20.623679 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8"
Apr 24 14:39:44.319067 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:44.319033 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t"]
Apr 24 14:39:44.319567 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:44.319286 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="kserve-container" containerID="cri-o://ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e" gracePeriod=30
Apr 24 14:39:44.370145 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:44.370107 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8"]
Apr 24 14:39:44.370415 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:44.370371 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="kserve-container" containerID="cri-o://3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf" gracePeriod=30
Apr 24 14:39:48.144407 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.144384 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t"
Apr 24 14:39:48.265165 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.265082 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5ada30a-c8f3-421f-b2a7-e901ed97b1d7-kserve-provision-location\") pod \"a5ada30a-c8f3-421f-b2a7-e901ed97b1d7\" (UID: \"a5ada30a-c8f3-421f-b2a7-e901ed97b1d7\") "
Apr 24 14:39:48.265432 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.265405 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ada30a-c8f3-421f-b2a7-e901ed97b1d7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" (UID: "a5ada30a-c8f3-421f-b2a7-e901ed97b1d7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:39:48.365777 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.365725 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5ada30a-c8f3-421f-b2a7-e901ed97b1d7-kserve-provision-location\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\""
Apr 24 14:39:48.903747 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.903725 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8"
Apr 24 14:39:48.914066 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.914036 2574 generic.go:358] "Generic (PLEG): container finished" podID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerID="3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf" exitCode=0
Apr 24 14:39:48.914174 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.914096 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8"
Apr 24 14:39:48.914174 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.914106 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" event={"ID":"e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c","Type":"ContainerDied","Data":"3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf"}
Apr 24 14:39:48.914174 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.914134 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8" event={"ID":"e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c","Type":"ContainerDied","Data":"30d97662db0b6ceedd67a5f39f6b07e1d6ce049e84f0b33b2272008217a19203"}
Apr 24 14:39:48.914174 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.914149 2574 scope.go:117] "RemoveContainer" containerID="3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf"
Apr 24 14:39:48.915563 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.915541 2574 generic.go:358] "Generic (PLEG): container finished" podID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerID="ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e" exitCode=0
Apr 24 14:39:48.915652 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.915576 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" event={"ID":"a5ada30a-c8f3-421f-b2a7-e901ed97b1d7","Type":"ContainerDied","Data":"ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e"}
Apr 24 14:39:48.915652 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.915597 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t" event={"ID":"a5ada30a-c8f3-421f-b2a7-e901ed97b1d7","Type":"ContainerDied","Data":"745cf8301cb7ff17352bb4b83309267a177efaea193ae6a3cd9bb16d220b9732"}
Apr 24 14:39:48.915652 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.915610 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t"
Apr 24 14:39:48.923434 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.923418 2574 scope.go:117] "RemoveContainer" containerID="1db9dd4c46dbd99559f4e848f8da4a20ab90f59299129e5f76da1e45b212f5a9"
Apr 24 14:39:48.931114 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.931092 2574 scope.go:117] "RemoveContainer" containerID="3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf"
Apr 24 14:39:48.931399 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:39:48.931381 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf\": container with ID starting with 3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf not found: ID does not exist" containerID="3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf"
Apr 24 14:39:48.931456 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.931406 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf"} err="failed to get container status \"3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf\": rpc error: code = NotFound desc = could not find container \"3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf\": container with ID starting with 3669797046ff54969d4c770f23baa55278865b330132f51f49ff9a34dfb044bf not found: ID does not exist"
Apr 24 14:39:48.931456 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.931423 2574 scope.go:117] "RemoveContainer" containerID="1db9dd4c46dbd99559f4e848f8da4a20ab90f59299129e5f76da1e45b212f5a9"
Apr 24 14:39:48.931673 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:39:48.931654 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1db9dd4c46dbd99559f4e848f8da4a20ab90f59299129e5f76da1e45b212f5a9\": container with ID starting with 1db9dd4c46dbd99559f4e848f8da4a20ab90f59299129e5f76da1e45b212f5a9 not found: ID does not exist" containerID="1db9dd4c46dbd99559f4e848f8da4a20ab90f59299129e5f76da1e45b212f5a9"
Apr 24 14:39:48.931726 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.931678 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db9dd4c46dbd99559f4e848f8da4a20ab90f59299129e5f76da1e45b212f5a9"} err="failed to get container status \"1db9dd4c46dbd99559f4e848f8da4a20ab90f59299129e5f76da1e45b212f5a9\": rpc error: code = NotFound desc = could not find container \"1db9dd4c46dbd99559f4e848f8da4a20ab90f59299129e5f76da1e45b212f5a9\": container with ID starting with 1db9dd4c46dbd99559f4e848f8da4a20ab90f59299129e5f76da1e45b212f5a9 not found: ID does not exist"
Apr 24 14:39:48.931726 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.931695 2574 scope.go:117] "RemoveContainer" containerID="ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e"
Apr 24 14:39:48.938367 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.938348 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t"]
Apr 24 14:39:48.944851 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.944766 2574 scope.go:117] "RemoveContainer" containerID="2fcb3af98c781e8039fe01b78ca9edadafe76f27262e7855ff5cc19f6e93b0d1"
Apr 24 14:39:48.944851 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.944813 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-29d74-predictor-5bf5b4664c-k6j4t"]
Apr 24 14:39:48.952582 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.952551 2574 scope.go:117] "RemoveContainer" containerID="ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e"
Apr 24 14:39:48.952814 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:39:48.952796 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e\": container with ID starting with ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e not found: ID does not exist" containerID="ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e"
Apr 24 14:39:48.952862 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.952823 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e"} err="failed to get container status \"ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e\": rpc error: code = NotFound desc = could not find container \"ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e\": container with ID starting with ae1dc8f65a4112c927ba604fbb588b5f4d96bf7bac2144f333667a893da8452e not found: ID does not exist"
Apr 24 14:39:48.952862 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.952842 2574 scope.go:117] "RemoveContainer" containerID="2fcb3af98c781e8039fe01b78ca9edadafe76f27262e7855ff5cc19f6e93b0d1"
Apr 24 14:39:48.953109 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:39:48.953090 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fcb3af98c781e8039fe01b78ca9edadafe76f27262e7855ff5cc19f6e93b0d1\": container with ID starting with 2fcb3af98c781e8039fe01b78ca9edadafe76f27262e7855ff5cc19f6e93b0d1 not found: ID does not exist" containerID="2fcb3af98c781e8039fe01b78ca9edadafe76f27262e7855ff5cc19f6e93b0d1"
Apr 24 14:39:48.953153 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:48.953116 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fcb3af98c781e8039fe01b78ca9edadafe76f27262e7855ff5cc19f6e93b0d1"} err="failed to get container status \"2fcb3af98c781e8039fe01b78ca9edadafe76f27262e7855ff5cc19f6e93b0d1\": rpc error: code = NotFound desc = could not find container \"2fcb3af98c781e8039fe01b78ca9edadafe76f27262e7855ff5cc19f6e93b0d1\": container with ID starting with 2fcb3af98c781e8039fe01b78ca9edadafe76f27262e7855ff5cc19f6e93b0d1 not found: ID does not exist"
Apr 24 14:39:49.070091 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:49.070057 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c-kserve-provision-location\") pod \"e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c\" (UID: \"e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c\") "
Apr 24 14:39:49.070377 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:49.070352 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" (UID: "e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:39:49.170187 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:49.170157 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" path="/var/lib/kubelet/pods/a5ada30a-c8f3-421f-b2a7-e901ed97b1d7/volumes"
Apr 24 14:39:49.170780 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:49.170764 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c-kserve-provision-location\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\""
Apr 24 14:39:49.231509 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:49.231477 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8"]
Apr 24 14:39:49.235683 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:49.235658 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-29d74-predictor-7895bbd4dc-8b2t8"]
Apr 24 14:39:51.169656 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:51.169616 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" path="/var/lib/kubelet/pods/e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c/volumes"
Apr 24 14:39:54.437792 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.437751 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"]
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438094 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="storage-initializer"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438107 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="storage-initializer"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438122 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="storage-initializer"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438127 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="storage-initializer"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438135 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="storage-initializer"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438141 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="storage-initializer"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438149 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="kserve-container"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438154 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="kserve-container"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438164 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="kserve-container"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438169 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="kserve-container"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438182 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="kserve-container"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438187 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="kserve-container"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438192 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="storage-initializer"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438197 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="storage-initializer"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438202 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="kserve-container"
Apr 24 14:39:54.438245 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438207 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="kserve-container"
Apr 24 14:39:54.438726 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438254 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5ada30a-c8f3-421f-b2a7-e901ed97b1d7" containerName="kserve-container"
Apr 24 14:39:54.438726 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438264 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e3c95b37-9d70-4ff8-bf3b-9fc158bf3b2c" containerName="kserve-container"
Apr 24 14:39:54.438726 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438270 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0829893-cbda-4b55-88b3-077ae1ea9f09" containerName="kserve-container"
Apr 24 14:39:54.438726 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.438277 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0388cf9-12c9-4ee2-ac85-53ef90abe77c" containerName="kserve-container"
Apr 24 14:39:54.443009 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.442987 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"
Apr 24 14:39:54.444924 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.444862 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qcgw2\""
Apr 24 14:39:54.452751 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.452720 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"]
Apr 24 14:39:54.609408 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.609374 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/620113a7-5642-4e53-8146-88ebb8525670-kserve-provision-location\") pod \"isvc-logger-raw-cf657-predictor-5749968549-bdvcx\" (UID: \"620113a7-5642-4e53-8146-88ebb8525670\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"
Apr 24 14:39:54.710149 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.710061 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/620113a7-5642-4e53-8146-88ebb8525670-kserve-provision-location\") pod \"isvc-logger-raw-cf657-predictor-5749968549-bdvcx\" (UID: \"620113a7-5642-4e53-8146-88ebb8525670\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"
Apr 24 14:39:54.710449 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.710428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/620113a7-5642-4e53-8146-88ebb8525670-kserve-provision-location\") pod \"isvc-logger-raw-cf657-predictor-5749968549-bdvcx\" (UID: \"620113a7-5642-4e53-8146-88ebb8525670\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"
Apr 24 14:39:54.753151 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.753117 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"
Apr 24 14:39:54.873017 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.872983 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"]
Apr 24 14:39:54.876250 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:39:54.876224 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod620113a7_5642_4e53_8146_88ebb8525670.slice/crio-3050ea19638a0649c7d427fcb8fbcc264184b981133f736c517758c5b5726463 WatchSource:0}: Error finding container 3050ea19638a0649c7d427fcb8fbcc264184b981133f736c517758c5b5726463: Status 404 returned error can't find the container with id 3050ea19638a0649c7d427fcb8fbcc264184b981133f736c517758c5b5726463
Apr 24 14:39:54.935174 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:54.935150 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" event={"ID":"620113a7-5642-4e53-8146-88ebb8525670","Type":"ContainerStarted","Data":"3050ea19638a0649c7d427fcb8fbcc264184b981133f736c517758c5b5726463"}
Apr 24 14:39:55.939203 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:55.939161 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" event={"ID":"620113a7-5642-4e53-8146-88ebb8525670","Type":"ContainerStarted","Data":"117bef53c04d09a2717200bd200d59579b7283f0f1663703c54388d72c2988b3"}
Apr 24 14:39:58.949959 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:58.949847 2574 generic.go:358] "Generic (PLEG): container finished" podID="620113a7-5642-4e53-8146-88ebb8525670" containerID="117bef53c04d09a2717200bd200d59579b7283f0f1663703c54388d72c2988b3" exitCode=0
Apr 24 14:39:58.949959 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:58.949913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" event={"ID":"620113a7-5642-4e53-8146-88ebb8525670","Type":"ContainerDied","Data":"117bef53c04d09a2717200bd200d59579b7283f0f1663703c54388d72c2988b3"}
Apr 24 14:39:59.955121 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:59.955086 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" event={"ID":"620113a7-5642-4e53-8146-88ebb8525670","Type":"ContainerStarted","Data":"f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac"}
Apr 24 14:39:59.955121 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:59.955128 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" event={"ID":"620113a7-5642-4e53-8146-88ebb8525670","Type":"ContainerStarted","Data":"bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1"}
Apr 24 14:39:59.955561 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:59.955540 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"
Apr 24 14:39:59.955615 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:59.955572 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"
Apr 24 14:39:59.956996 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:59.956968 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 24 14:39:59.957662 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:59.957633 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:39:59.973808 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:39:59.973770 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podStartSLOduration=5.973757321 podStartE2EDuration="5.973757321s" podCreationTimestamp="2026-04-24 14:39:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:39:59.972271003 +0000 UTC m=+927.410787861" watchObservedRunningTime="2026-04-24 14:39:59.973757321 +0000 UTC m=+927.412274179"
Apr 24 14:40:00.957990 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:40:00.957947 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 24 14:40:00.958431 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:40:00.958318 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:40:10.957968 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:40:10.957915 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 24 14:40:10.960492 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:40:10.958367 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:40:20.959049 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:40:20.958931 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 24 14:40:20.959567 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:40:20.959537 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:40:30.958359 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:40:30.958304 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 24 14:40:30.958758 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:40:30.958727 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"
podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:40:40.958246 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:40:40.958146 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 14:40:40.958667 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:40:40.958604 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:40:50.957977 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:40:50.957923 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 14:40:50.958426 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:40:50.958399 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:41:00.958218 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:00.958171 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 14:41:00.958767 
ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:00.958637 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:41:10.959104 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:10.959070 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" Apr 24 14:41:10.959521 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:10.959455 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" Apr 24 14:41:19.650842 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:19.650805 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"] Apr 24 14:41:19.651230 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:19.651166 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" containerID="cri-o://bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1" gracePeriod=30 Apr 24 14:41:19.651292 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:19.651242 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" containerID="cri-o://f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac" gracePeriod=30 Apr 24 14:41:19.695010 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:19.694977 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778"] Apr 24 14:41:19.698187 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:19.698166 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" Apr 24 14:41:19.706096 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:19.706066 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778"] Apr 24 14:41:19.781052 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:19.781016 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3ebb39c-aa31-41ed-9dea-5d2234505328-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778\" (UID: \"a3ebb39c-aa31-41ed-9dea-5d2234505328\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" Apr 24 14:41:19.882283 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:19.882245 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3ebb39c-aa31-41ed-9dea-5d2234505328-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778\" (UID: \"a3ebb39c-aa31-41ed-9dea-5d2234505328\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" Apr 24 14:41:19.882619 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:19.882597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3ebb39c-aa31-41ed-9dea-5d2234505328-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778\" (UID: \"a3ebb39c-aa31-41ed-9dea-5d2234505328\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" Apr 24 14:41:20.008760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:20.008637 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" Apr 24 14:41:20.127628 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:20.127600 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778"] Apr 24 14:41:20.130232 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:41:20.130190 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3ebb39c_aa31_41ed_9dea_5d2234505328.slice/crio-cb46e6735797035ce687bc3e18e167074feeea9a788e90d1282d0108b91f7221 WatchSource:0}: Error finding container cb46e6735797035ce687bc3e18e167074feeea9a788e90d1282d0108b91f7221: Status 404 returned error can't find the container with id cb46e6735797035ce687bc3e18e167074feeea9a788e90d1282d0108b91f7221 Apr 24 14:41:20.189012 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:20.188984 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" event={"ID":"a3ebb39c-aa31-41ed-9dea-5d2234505328","Type":"ContainerStarted","Data":"cb46e6735797035ce687bc3e18e167074feeea9a788e90d1282d0108b91f7221"} Apr 24 14:41:20.958477 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:20.958433 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 14:41:20.960019 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:20.959991 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:41:21.193523 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:21.193482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" event={"ID":"a3ebb39c-aa31-41ed-9dea-5d2234505328","Type":"ContainerStarted","Data":"7eb0efecf1da6e1133c597efb6fe77cd1875bc1b29a1048722927a0c4e56a2c0"} Apr 24 14:41:24.203040 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:24.202937 2574 generic.go:358] "Generic (PLEG): container finished" podID="620113a7-5642-4e53-8146-88ebb8525670" containerID="bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1" exitCode=0 Apr 24 14:41:24.203040 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:24.203012 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" event={"ID":"620113a7-5642-4e53-8146-88ebb8525670","Type":"ContainerDied","Data":"bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1"} Apr 24 14:41:24.204277 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:24.204253 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerID="7eb0efecf1da6e1133c597efb6fe77cd1875bc1b29a1048722927a0c4e56a2c0" exitCode=0 Apr 24 14:41:24.204388 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:24.204315 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" event={"ID":"a3ebb39c-aa31-41ed-9dea-5d2234505328","Type":"ContainerDied","Data":"7eb0efecf1da6e1133c597efb6fe77cd1875bc1b29a1048722927a0c4e56a2c0"} Apr 24 14:41:25.209341 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:25.209293 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" event={"ID":"a3ebb39c-aa31-41ed-9dea-5d2234505328","Type":"ContainerStarted","Data":"9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74"} Apr 24 14:41:25.209763 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:25.209590 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" Apr 24 14:41:25.210713 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:25.210690 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:41:25.225916 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:25.225849 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podStartSLOduration=6.22583653 podStartE2EDuration="6.22583653s" podCreationTimestamp="2026-04-24 14:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:41:25.223680407 +0000 UTC m=+1012.662197267" watchObservedRunningTime="2026-04-24 14:41:25.22583653 +0000 UTC m=+1012.664353478" Apr 24 14:41:26.212724 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:26.212683 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:41:30.958090 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:30.958044 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 14:41:30.960741 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:30.960712 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:41:36.213186 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:36.213139 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:41:40.958639 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:40.958586 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 14:41:40.959113 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:40.958787 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" Apr 24 14:41:40.960178 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:40.960153 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:41:40.960273 ip-10-0-143-92 kubenswrapper[2574]: I0424 
14:41:40.960251 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" Apr 24 14:41:46.213658 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:46.213602 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:41:49.826320 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:49.826293 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" Apr 24 14:41:49.917798 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:49.917701 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/620113a7-5642-4e53-8146-88ebb8525670-kserve-provision-location\") pod \"620113a7-5642-4e53-8146-88ebb8525670\" (UID: \"620113a7-5642-4e53-8146-88ebb8525670\") " Apr 24 14:41:49.918040 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:49.918016 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/620113a7-5642-4e53-8146-88ebb8525670-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "620113a7-5642-4e53-8146-88ebb8525670" (UID: "620113a7-5642-4e53-8146-88ebb8525670"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:41:50.018895 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.018851 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/620113a7-5642-4e53-8146-88ebb8525670-kserve-provision-location\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:41:50.290303 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.290219 2574 generic.go:358] "Generic (PLEG): container finished" podID="620113a7-5642-4e53-8146-88ebb8525670" containerID="f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac" exitCode=137 Apr 24 14:41:50.290303 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.290276 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" event={"ID":"620113a7-5642-4e53-8146-88ebb8525670","Type":"ContainerDied","Data":"f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac"} Apr 24 14:41:50.290303 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.290292 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" Apr 24 14:41:50.290303 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.290303 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx" event={"ID":"620113a7-5642-4e53-8146-88ebb8525670","Type":"ContainerDied","Data":"3050ea19638a0649c7d427fcb8fbcc264184b981133f736c517758c5b5726463"} Apr 24 14:41:50.290646 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.290320 2574 scope.go:117] "RemoveContainer" containerID="f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac" Apr 24 14:41:50.298326 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.298303 2574 scope.go:117] "RemoveContainer" containerID="bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1" Apr 24 14:41:50.305441 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.305423 2574 scope.go:117] "RemoveContainer" containerID="117bef53c04d09a2717200bd200d59579b7283f0f1663703c54388d72c2988b3" Apr 24 14:41:50.311804 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.311778 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"] Apr 24 14:41:50.313334 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.313312 2574 scope.go:117] "RemoveContainer" containerID="f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac" Apr 24 14:41:50.313614 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:41:50.313589 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac\": container with ID starting with f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac not found: ID does not exist" containerID="f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac" Apr 24 14:41:50.313667 ip-10-0-143-92 kubenswrapper[2574]: 
I0424 14:41:50.313629 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac"} err="failed to get container status \"f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac\": rpc error: code = NotFound desc = could not find container \"f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac\": container with ID starting with f0bff37597505abbe6bf37f65ffd529b230d0a78e819ae29c0f67f44c0e396ac not found: ID does not exist" Apr 24 14:41:50.313667 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.313658 2574 scope.go:117] "RemoveContainer" containerID="bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1" Apr 24 14:41:50.313926 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:41:50.313902 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1\": container with ID starting with bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1 not found: ID does not exist" containerID="bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1" Apr 24 14:41:50.314049 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.313934 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1"} err="failed to get container status \"bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1\": rpc error: code = NotFound desc = could not find container \"bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1\": container with ID starting with bf6967cbea658d4a0a5279c10c323199d4f60ad0f681de110ef45691d13d06e1 not found: ID does not exist" Apr 24 14:41:50.314049 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.313957 2574 scope.go:117] "RemoveContainer" 
containerID="117bef53c04d09a2717200bd200d59579b7283f0f1663703c54388d72c2988b3" Apr 24 14:41:50.314225 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:41:50.314202 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"117bef53c04d09a2717200bd200d59579b7283f0f1663703c54388d72c2988b3\": container with ID starting with 117bef53c04d09a2717200bd200d59579b7283f0f1663703c54388d72c2988b3 not found: ID does not exist" containerID="117bef53c04d09a2717200bd200d59579b7283f0f1663703c54388d72c2988b3" Apr 24 14:41:50.314292 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.314236 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"117bef53c04d09a2717200bd200d59579b7283f0f1663703c54388d72c2988b3"} err="failed to get container status \"117bef53c04d09a2717200bd200d59579b7283f0f1663703c54388d72c2988b3\": rpc error: code = NotFound desc = could not find container \"117bef53c04d09a2717200bd200d59579b7283f0f1663703c54388d72c2988b3\": container with ID starting with 117bef53c04d09a2717200bd200d59579b7283f0f1663703c54388d72c2988b3 not found: ID does not exist" Apr 24 14:41:50.316016 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:50.315995 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-cf657-predictor-5749968549-bdvcx"] Apr 24 14:41:51.170148 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:51.170112 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="620113a7-5642-4e53-8146-88ebb8525670" path="/var/lib/kubelet/pods/620113a7-5642-4e53-8146-88ebb8525670/volumes" Apr 24 14:41:56.213348 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:41:56.213296 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.28:8080: connect: connection refused" Apr 24 14:42:06.213654 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:42:06.213609 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:42:16.212773 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:42:16.212675 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:42:26.213052 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:42:26.212999 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:42:27.166183 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:42:27.166142 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:42:37.166825 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:42:37.166775 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:42:47.166460 ip-10-0-143-92 
kubenswrapper[2574]: I0424 14:42:47.166408 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:42:57.167176 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:42:57.167133 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:43:07.166551 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:07.166498 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:43:17.166405 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:17.166359 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:43:27.166252 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:27.166202 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:43:28.166229 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:28.166181 2574 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:43:38.166386 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:38.166336 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:43:48.167400 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:48.167357 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" Apr 24 14:43:49.854614 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.854580 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778"] Apr 24 14:43:49.855047 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.854850 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" containerID="cri-o://9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74" gracePeriod=30 Apr 24 14:43:49.950095 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.950061 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65"] Apr 24 14:43:49.950384 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.950371 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" Apr 24 14:43:49.950430 ip-10-0-143-92 
kubenswrapper[2574]: I0424 14:43:49.950385 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" Apr 24 14:43:49.950430 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.950401 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" Apr 24 14:43:49.950430 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.950407 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" Apr 24 14:43:49.950430 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.950423 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="storage-initializer" Apr 24 14:43:49.950430 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.950430 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="storage-initializer" Apr 24 14:43:49.950587 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.950477 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="kserve-container" Apr 24 14:43:49.950587 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.950484 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="620113a7-5642-4e53-8146-88ebb8525670" containerName="agent" Apr 24 14:43:49.953413 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.953394 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" Apr 24 14:43:49.961115 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.961088 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65"] Apr 24 14:43:49.961220 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:49.961185 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29-kserve-provision-location\") pod \"isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65\" (UID: \"3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29\") " pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" Apr 24 14:43:50.062594 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:50.062548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29-kserve-provision-location\") pod \"isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65\" (UID: \"3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29\") " pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" Apr 24 14:43:50.062991 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:50.062967 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29-kserve-provision-location\") pod \"isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65\" (UID: \"3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29\") " pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" Apr 24 14:43:50.264077 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:50.263974 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" Apr 24 14:43:50.393051 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:50.393016 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65"] Apr 24 14:43:50.396289 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:43:50.396257 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ed1b3fd_1a1c_4ab8_a44d_bd6efbf52a29.slice/crio-e8157c18c5d5078cacd45349ab2e09df714f62378ab755707ec2ba8d2db8b334 WatchSource:0}: Error finding container e8157c18c5d5078cacd45349ab2e09df714f62378ab755707ec2ba8d2db8b334: Status 404 returned error can't find the container with id e8157c18c5d5078cacd45349ab2e09df714f62378ab755707ec2ba8d2db8b334 Apr 24 14:43:50.398079 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:50.398061 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:43:50.628281 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:50.628237 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" event={"ID":"3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29","Type":"ContainerStarted","Data":"0e7e285fbcda97566cbdc26f7cae19e93764ab67cee0494baeb9e1ffa2993399"} Apr 24 14:43:50.628281 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:50.628282 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" event={"ID":"3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29","Type":"ContainerStarted","Data":"e8157c18c5d5078cacd45349ab2e09df714f62378ab755707ec2ba8d2db8b334"} Apr 24 14:43:54.641045 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:54.641012 2574 generic.go:358] "Generic (PLEG): container finished" podID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" 
containerID="0e7e285fbcda97566cbdc26f7cae19e93764ab67cee0494baeb9e1ffa2993399" exitCode=0 Apr 24 14:43:54.641433 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:54.641098 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" event={"ID":"3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29","Type":"ContainerDied","Data":"0e7e285fbcda97566cbdc26f7cae19e93764ab67cee0494baeb9e1ffa2993399"} Apr 24 14:43:55.645747 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:55.645710 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" event={"ID":"3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29","Type":"ContainerStarted","Data":"5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11"} Apr 24 14:43:55.646174 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:55.646053 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" Apr 24 14:43:55.647446 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:55.647421 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 14:43:55.661659 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:55.661612 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" podStartSLOduration=6.661597506 podStartE2EDuration="6.661597506s" podCreationTimestamp="2026-04-24 14:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:43:55.660111516 +0000 UTC m=+1163.098628387" watchObservedRunningTime="2026-04-24 14:43:55.661597506 
+0000 UTC m=+1163.100114366" Apr 24 14:43:56.649126 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:56.649083 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 14:43:58.166432 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:58.166387 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 14:43:59.393988 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.393965 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" Apr 24 14:43:59.436715 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.436683 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3ebb39c-aa31-41ed-9dea-5d2234505328-kserve-provision-location\") pod \"a3ebb39c-aa31-41ed-9dea-5d2234505328\" (UID: \"a3ebb39c-aa31-41ed-9dea-5d2234505328\") " Apr 24 14:43:59.437017 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.436993 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3ebb39c-aa31-41ed-9dea-5d2234505328-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a3ebb39c-aa31-41ed-9dea-5d2234505328" (UID: "a3ebb39c-aa31-41ed-9dea-5d2234505328"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:43:59.537265 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.537164 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3ebb39c-aa31-41ed-9dea-5d2234505328-kserve-provision-location\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:43:59.658491 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.658456 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerID="9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74" exitCode=0 Apr 24 14:43:59.658651 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.658519 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" Apr 24 14:43:59.658651 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.658537 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" event={"ID":"a3ebb39c-aa31-41ed-9dea-5d2234505328","Type":"ContainerDied","Data":"9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74"} Apr 24 14:43:59.658651 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.658579 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778" event={"ID":"a3ebb39c-aa31-41ed-9dea-5d2234505328","Type":"ContainerDied","Data":"cb46e6735797035ce687bc3e18e167074feeea9a788e90d1282d0108b91f7221"} Apr 24 14:43:59.658651 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.658595 2574 scope.go:117] "RemoveContainer" containerID="9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74" Apr 24 14:43:59.666574 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.666556 2574 scope.go:117] "RemoveContainer" 
containerID="7eb0efecf1da6e1133c597efb6fe77cd1875bc1b29a1048722927a0c4e56a2c0" Apr 24 14:43:59.673153 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.673137 2574 scope.go:117] "RemoveContainer" containerID="9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74" Apr 24 14:43:59.673410 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:43:59.673391 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74\": container with ID starting with 9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74 not found: ID does not exist" containerID="9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74" Apr 24 14:43:59.673454 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.673420 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74"} err="failed to get container status \"9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74\": rpc error: code = NotFound desc = could not find container \"9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74\": container with ID starting with 9f04dc2429028f1d863c4a09608d9d85ecbbefb1b68daa2e5bc261fba6a7db74 not found: ID does not exist" Apr 24 14:43:59.673454 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.673438 2574 scope.go:117] "RemoveContainer" containerID="7eb0efecf1da6e1133c597efb6fe77cd1875bc1b29a1048722927a0c4e56a2c0" Apr 24 14:43:59.673669 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:43:59.673654 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eb0efecf1da6e1133c597efb6fe77cd1875bc1b29a1048722927a0c4e56a2c0\": container with ID starting with 7eb0efecf1da6e1133c597efb6fe77cd1875bc1b29a1048722927a0c4e56a2c0 not found: ID does not exist" 
containerID="7eb0efecf1da6e1133c597efb6fe77cd1875bc1b29a1048722927a0c4e56a2c0" Apr 24 14:43:59.673706 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.673677 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb0efecf1da6e1133c597efb6fe77cd1875bc1b29a1048722927a0c4e56a2c0"} err="failed to get container status \"7eb0efecf1da6e1133c597efb6fe77cd1875bc1b29a1048722927a0c4e56a2c0\": rpc error: code = NotFound desc = could not find container \"7eb0efecf1da6e1133c597efb6fe77cd1875bc1b29a1048722927a0c4e56a2c0\": container with ID starting with 7eb0efecf1da6e1133c597efb6fe77cd1875bc1b29a1048722927a0c4e56a2c0 not found: ID does not exist" Apr 24 14:43:59.681925 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.681905 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778"] Apr 24 14:43:59.688017 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:43:59.687991 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-2a9b2-predictor-5f6f8cb7c-xn778"] Apr 24 14:44:01.169573 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:44:01.169536 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" path="/var/lib/kubelet/pods/a3ebb39c-aa31-41ed-9dea-5d2234505328/volumes" Apr 24 14:44:06.649495 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:44:06.649447 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 14:44:16.649037 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:44:16.648987 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" 
podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 14:44:26.649865 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:44:26.649814 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 14:44:36.649994 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:44:36.649950 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 14:44:46.650063 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:44:46.650017 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 14:44:56.649483 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:44:56.649436 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 14:45:06.650745 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:06.650712 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" Apr 24 14:45:10.089364 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.089323 2574 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn"] Apr 24 14:45:10.089833 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.089757 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="storage-initializer" Apr 24 14:45:10.089833 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.089775 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="storage-initializer" Apr 24 14:45:10.089833 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.089794 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" Apr 24 14:45:10.089833 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.089802 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" Apr 24 14:45:10.090087 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.089899 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3ebb39c-aa31-41ed-9dea-5d2234505328" containerName="kserve-container" Apr 24 14:45:10.093000 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.092978 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" Apr 24 14:45:10.094775 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.094753 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-c6676b\"" Apr 24 14:45:10.094915 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.094855 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-c6676b-dockercfg-lmt9t\"" Apr 24 14:45:10.094974 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.094957 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 14:45:10.100403 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.100375 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn"] Apr 24 14:45:10.197217 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.197174 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8e3fecf5-c573-4500-8493-843d4230c9b1-cabundle-cert\") pod \"isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn\" (UID: \"8e3fecf5-c573-4500-8493-843d4230c9b1\") " pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" Apr 24 14:45:10.197443 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.197355 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e3fecf5-c573-4500-8493-843d4230c9b1-kserve-provision-location\") pod \"isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn\" (UID: \"8e3fecf5-c573-4500-8493-843d4230c9b1\") " pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" Apr 24 14:45:10.298133 ip-10-0-143-92 kubenswrapper[2574]: I0424 
14:45:10.298096 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8e3fecf5-c573-4500-8493-843d4230c9b1-cabundle-cert\") pod \"isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn\" (UID: \"8e3fecf5-c573-4500-8493-843d4230c9b1\") " pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" Apr 24 14:45:10.298337 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.298154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e3fecf5-c573-4500-8493-843d4230c9b1-kserve-provision-location\") pod \"isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn\" (UID: \"8e3fecf5-c573-4500-8493-843d4230c9b1\") " pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" Apr 24 14:45:10.298500 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.298483 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e3fecf5-c573-4500-8493-843d4230c9b1-kserve-provision-location\") pod \"isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn\" (UID: \"8e3fecf5-c573-4500-8493-843d4230c9b1\") " pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" Apr 24 14:45:10.298745 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.298722 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8e3fecf5-c573-4500-8493-843d4230c9b1-cabundle-cert\") pod \"isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn\" (UID: \"8e3fecf5-c573-4500-8493-843d4230c9b1\") " pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" Apr 24 14:45:10.403824 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.403707 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" Apr 24 14:45:10.520382 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.520348 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn"] Apr 24 14:45:10.523438 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:45:10.523401 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e3fecf5_c573_4500_8493_843d4230c9b1.slice/crio-48317e3e7e5f9fff8dd82f198c49627830a8ffff77b67f03b40be7ac6c0e45d6 WatchSource:0}: Error finding container 48317e3e7e5f9fff8dd82f198c49627830a8ffff77b67f03b40be7ac6c0e45d6: Status 404 returned error can't find the container with id 48317e3e7e5f9fff8dd82f198c49627830a8ffff77b67f03b40be7ac6c0e45d6 Apr 24 14:45:10.866338 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.866297 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" event={"ID":"8e3fecf5-c573-4500-8493-843d4230c9b1","Type":"ContainerStarted","Data":"512e6ab3a364027bb80b14d7b3987bae7f6f39979585697139198af44627eecf"} Apr 24 14:45:10.866338 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:10.866337 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" event={"ID":"8e3fecf5-c573-4500-8493-843d4230c9b1","Type":"ContainerStarted","Data":"48317e3e7e5f9fff8dd82f198c49627830a8ffff77b67f03b40be7ac6c0e45d6"} Apr 24 14:45:12.872841 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:12.872816 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn_8e3fecf5-c573-4500-8493-843d4230c9b1/storage-initializer/0.log" Apr 24 14:45:12.873324 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:12.872853 2574 generic.go:358] "Generic (PLEG): container 
finished" podID="8e3fecf5-c573-4500-8493-843d4230c9b1" containerID="512e6ab3a364027bb80b14d7b3987bae7f6f39979585697139198af44627eecf" exitCode=1 Apr 24 14:45:12.873324 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:12.872939 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" event={"ID":"8e3fecf5-c573-4500-8493-843d4230c9b1","Type":"ContainerDied","Data":"512e6ab3a364027bb80b14d7b3987bae7f6f39979585697139198af44627eecf"} Apr 24 14:45:13.877537 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:13.877509 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn_8e3fecf5-c573-4500-8493-843d4230c9b1/storage-initializer/0.log" Apr 24 14:45:13.877928 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:13.877586 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" event={"ID":"8e3fecf5-c573-4500-8493-843d4230c9b1","Type":"ContainerStarted","Data":"1d3904cfdae7c202fbbbaeea4560865e6e67309738d0018d852838d62cb15958"} Apr 24 14:45:16.888002 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:16.887974 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn_8e3fecf5-c573-4500-8493-843d4230c9b1/storage-initializer/1.log" Apr 24 14:45:16.888399 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:16.888346 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn_8e3fecf5-c573-4500-8493-843d4230c9b1/storage-initializer/0.log" Apr 24 14:45:16.888399 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:16.888381 2574 generic.go:358] "Generic (PLEG): container finished" podID="8e3fecf5-c573-4500-8493-843d4230c9b1" containerID="1d3904cfdae7c202fbbbaeea4560865e6e67309738d0018d852838d62cb15958" exitCode=1 Apr 24 14:45:16.888478 
ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:16.888460 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" event={"ID":"8e3fecf5-c573-4500-8493-843d4230c9b1","Type":"ContainerDied","Data":"1d3904cfdae7c202fbbbaeea4560865e6e67309738d0018d852838d62cb15958"} Apr 24 14:45:16.888512 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:16.888502 2574 scope.go:117] "RemoveContainer" containerID="512e6ab3a364027bb80b14d7b3987bae7f6f39979585697139198af44627eecf" Apr 24 14:45:16.888867 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:16.888845 2574 scope.go:117] "RemoveContainer" containerID="512e6ab3a364027bb80b14d7b3987bae7f6f39979585697139198af44627eecf" Apr 24 14:45:16.901469 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:45:16.901436 2574 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn_kserve-ci-e2e-test_8e3fecf5-c573-4500-8493-843d4230c9b1_0 in pod sandbox 48317e3e7e5f9fff8dd82f198c49627830a8ffff77b67f03b40be7ac6c0e45d6 from index: no such id: '512e6ab3a364027bb80b14d7b3987bae7f6f39979585697139198af44627eecf'" containerID="512e6ab3a364027bb80b14d7b3987bae7f6f39979585697139198af44627eecf" Apr 24 14:45:16.901565 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:45:16.901487 2574 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn_kserve-ci-e2e-test_8e3fecf5-c573-4500-8493-843d4230c9b1_0 in pod sandbox 48317e3e7e5f9fff8dd82f198c49627830a8ffff77b67f03b40be7ac6c0e45d6 from index: no such id: '512e6ab3a364027bb80b14d7b3987bae7f6f39979585697139198af44627eecf'; Skipping pod 
\"isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn_kserve-ci-e2e-test(8e3fecf5-c573-4500-8493-843d4230c9b1)\"" logger="UnhandledError" Apr 24 14:45:16.902823 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:45:16.902801 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn_kserve-ci-e2e-test(8e3fecf5-c573-4500-8493-843d4230c9b1)\"" pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" podUID="8e3fecf5-c573-4500-8493-843d4230c9b1" Apr 24 14:45:17.893461 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:17.893437 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn_8e3fecf5-c573-4500-8493-843d4230c9b1/storage-initializer/1.log" Apr 24 14:45:24.180976 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.180938 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn"] Apr 24 14:45:24.229394 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.229361 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65"] Apr 24 14:45:24.229651 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.229628 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" containerID="cri-o://5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11" gracePeriod=30 Apr 24 14:45:24.311563 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.311536 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k"] Apr 24 14:45:24.316122 
ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.316101 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" Apr 24 14:45:24.318193 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.318163 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-3988b2\"" Apr 24 14:45:24.318295 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.318164 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-3988b2-dockercfg-56v4c\"" Apr 24 14:45:24.319304 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.319289 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn_8e3fecf5-c573-4500-8493-843d4230c9b1/storage-initializer/1.log" Apr 24 14:45:24.319394 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.319338 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" Apr 24 14:45:24.325086 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.325068 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k"] Apr 24 14:45:24.414373 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.414340 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e3fecf5-c573-4500-8493-843d4230c9b1-kserve-provision-location\") pod \"8e3fecf5-c573-4500-8493-843d4230c9b1\" (UID: \"8e3fecf5-c573-4500-8493-843d4230c9b1\") " Apr 24 14:45:24.414575 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.414434 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8e3fecf5-c573-4500-8493-843d4230c9b1-cabundle-cert\") pod \"8e3fecf5-c573-4500-8493-843d4230c9b1\" (UID: \"8e3fecf5-c573-4500-8493-843d4230c9b1\") " Apr 24 14:45:24.414649 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.414589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3da90f3-7d28-4390-8bac-c36d5ff927c3-kserve-provision-location\") pod \"isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k\" (UID: \"a3da90f3-7d28-4390-8bac-c36d5ff927c3\") " pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" Apr 24 14:45:24.414649 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.414599 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e3fecf5-c573-4500-8493-843d4230c9b1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8e3fecf5-c573-4500-8493-843d4230c9b1" (UID: "8e3fecf5-c573-4500-8493-843d4230c9b1"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:45:24.414749 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.414714 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3da90f3-7d28-4390-8bac-c36d5ff927c3-cabundle-cert\") pod \"isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k\" (UID: \"a3da90f3-7d28-4390-8bac-c36d5ff927c3\") " pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" Apr 24 14:45:24.414805 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.414767 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e3fecf5-c573-4500-8493-843d4230c9b1-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "8e3fecf5-c573-4500-8493-843d4230c9b1" (UID: "8e3fecf5-c573-4500-8493-843d4230c9b1"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:45:24.414857 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.414824 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e3fecf5-c573-4500-8493-843d4230c9b1-kserve-provision-location\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:45:24.516176 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.516093 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3da90f3-7d28-4390-8bac-c36d5ff927c3-kserve-provision-location\") pod \"isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k\" (UID: \"a3da90f3-7d28-4390-8bac-c36d5ff927c3\") " pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" Apr 24 14:45:24.516176 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.516152 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3da90f3-7d28-4390-8bac-c36d5ff927c3-cabundle-cert\") pod \"isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k\" (UID: \"a3da90f3-7d28-4390-8bac-c36d5ff927c3\") " pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" Apr 24 14:45:24.516354 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.516182 2574 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8e3fecf5-c573-4500-8493-843d4230c9b1-cabundle-cert\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:45:24.516490 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.516468 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3da90f3-7d28-4390-8bac-c36d5ff927c3-kserve-provision-location\") pod \"isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k\" (UID: \"a3da90f3-7d28-4390-8bac-c36d5ff927c3\") " pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" Apr 24 14:45:24.516734 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.516718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3da90f3-7d28-4390-8bac-c36d5ff927c3-cabundle-cert\") pod \"isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k\" (UID: \"a3da90f3-7d28-4390-8bac-c36d5ff927c3\") " pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" Apr 24 14:45:24.628660 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.628624 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" Apr 24 14:45:24.750066 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.750031 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k"] Apr 24 14:45:24.753288 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:45:24.753257 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3da90f3_7d28_4390_8bac_c36d5ff927c3.slice/crio-efe561a8473a8f9177aabb994d81d092ddc298f892f7dfcb4aeafc0780533da2 WatchSource:0}: Error finding container efe561a8473a8f9177aabb994d81d092ddc298f892f7dfcb4aeafc0780533da2: Status 404 returned error can't find the container with id efe561a8473a8f9177aabb994d81d092ddc298f892f7dfcb4aeafc0780533da2 Apr 24 14:45:24.916173 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.916145 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn_8e3fecf5-c573-4500-8493-843d4230c9b1/storage-initializer/1.log" Apr 24 14:45:24.916368 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.916278 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" event={"ID":"8e3fecf5-c573-4500-8493-843d4230c9b1","Type":"ContainerDied","Data":"48317e3e7e5f9fff8dd82f198c49627830a8ffff77b67f03b40be7ac6c0e45d6"} Apr 24 14:45:24.916368 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.916322 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn" Apr 24 14:45:24.916460 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.916329 2574 scope.go:117] "RemoveContainer" containerID="1d3904cfdae7c202fbbbaeea4560865e6e67309738d0018d852838d62cb15958" Apr 24 14:45:24.917788 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.917755 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" event={"ID":"a3da90f3-7d28-4390-8bac-c36d5ff927c3","Type":"ContainerStarted","Data":"9197beeb41487da8282253ff7d51d288a58663f621696b34c8392748a071164d"} Apr 24 14:45:24.917788 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.917786 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" event={"ID":"a3da90f3-7d28-4390-8bac-c36d5ff927c3","Type":"ContainerStarted","Data":"efe561a8473a8f9177aabb994d81d092ddc298f892f7dfcb4aeafc0780533da2"} Apr 24 14:45:24.960116 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.960083 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn"] Apr 24 14:45:24.963936 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:24.963911 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c6676b-predictor-6df7fcb885-6g8bn"] Apr 24 14:45:25.170755 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:25.170709 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e3fecf5-c573-4500-8493-843d4230c9b1" path="/var/lib/kubelet/pods/8e3fecf5-c573-4500-8493-843d4230c9b1/volumes" Apr 24 14:45:26.649970 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:26.649931 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 14:45:28.882739 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:28.882715 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" Apr 24 14:45:28.931770 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:28.931732 2574 generic.go:358] "Generic (PLEG): container finished" podID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerID="5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11" exitCode=0 Apr 24 14:45:28.931983 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:28.931805 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" Apr 24 14:45:28.931983 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:28.931815 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" event={"ID":"3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29","Type":"ContainerDied","Data":"5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11"} Apr 24 14:45:28.931983 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:28.931856 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65" event={"ID":"3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29","Type":"ContainerDied","Data":"e8157c18c5d5078cacd45349ab2e09df714f62378ab755707ec2ba8d2db8b334"} Apr 24 14:45:28.931983 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:28.931874 2574 scope.go:117] "RemoveContainer" containerID="5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11" Apr 24 14:45:28.941288 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:28.941270 2574 scope.go:117] "RemoveContainer" containerID="0e7e285fbcda97566cbdc26f7cae19e93764ab67cee0494baeb9e1ffa2993399" Apr 24 14:45:28.948096 ip-10-0-143-92 kubenswrapper[2574]: 
I0424 14:45:28.948080 2574 scope.go:117] "RemoveContainer" containerID="5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11" Apr 24 14:45:28.948335 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:45:28.948316 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11\": container with ID starting with 5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11 not found: ID does not exist" containerID="5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11" Apr 24 14:45:28.948378 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:28.948346 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11"} err="failed to get container status \"5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11\": rpc error: code = NotFound desc = could not find container \"5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11\": container with ID starting with 5a7d1a5e44ea9e662c2ab205f7056f2818daa8cfd982c3c06cf3ed87a8fdfc11 not found: ID does not exist" Apr 24 14:45:28.948378 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:28.948370 2574 scope.go:117] "RemoveContainer" containerID="0e7e285fbcda97566cbdc26f7cae19e93764ab67cee0494baeb9e1ffa2993399" Apr 24 14:45:28.948603 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:45:28.948582 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e7e285fbcda97566cbdc26f7cae19e93764ab67cee0494baeb9e1ffa2993399\": container with ID starting with 0e7e285fbcda97566cbdc26f7cae19e93764ab67cee0494baeb9e1ffa2993399 not found: ID does not exist" containerID="0e7e285fbcda97566cbdc26f7cae19e93764ab67cee0494baeb9e1ffa2993399" Apr 24 14:45:28.948649 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:28.948610 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7e285fbcda97566cbdc26f7cae19e93764ab67cee0494baeb9e1ffa2993399"} err="failed to get container status \"0e7e285fbcda97566cbdc26f7cae19e93764ab67cee0494baeb9e1ffa2993399\": rpc error: code = NotFound desc = could not find container \"0e7e285fbcda97566cbdc26f7cae19e93764ab67cee0494baeb9e1ffa2993399\": container with ID starting with 0e7e285fbcda97566cbdc26f7cae19e93764ab67cee0494baeb9e1ffa2993399 not found: ID does not exist" Apr 24 14:45:28.953956 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:28.953941 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29-kserve-provision-location\") pod \"3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29\" (UID: \"3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29\") " Apr 24 14:45:28.954220 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:28.954203 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" (UID: "3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:45:29.055047 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:29.054957 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29-kserve-provision-location\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:45:29.248461 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:29.248428 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65"] Apr 24 14:45:29.253584 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:29.253559 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c6676b-predictor-5dc44dd8b8-7jh65"] Apr 24 14:45:30.939946 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:30.939853 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k_a3da90f3-7d28-4390-8bac-c36d5ff927c3/storage-initializer/0.log" Apr 24 14:45:30.939946 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:30.939911 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3da90f3-7d28-4390-8bac-c36d5ff927c3" containerID="9197beeb41487da8282253ff7d51d288a58663f621696b34c8392748a071164d" exitCode=1 Apr 24 14:45:30.940319 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:30.939995 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" event={"ID":"a3da90f3-7d28-4390-8bac-c36d5ff927c3","Type":"ContainerDied","Data":"9197beeb41487da8282253ff7d51d288a58663f621696b34c8392748a071164d"} Apr 24 14:45:31.169928 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:31.169859 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" path="/var/lib/kubelet/pods/3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29/volumes" Apr 24 
14:45:31.943892 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:31.943863 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k_a3da90f3-7d28-4390-8bac-c36d5ff927c3/storage-initializer/0.log" Apr 24 14:45:31.944267 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:31.943937 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" event={"ID":"a3da90f3-7d28-4390-8bac-c36d5ff927c3","Type":"ContainerStarted","Data":"666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f"} Apr 24 14:45:34.352553 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.352511 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k"] Apr 24 14:45:34.352972 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.352807 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" podUID="a3da90f3-7d28-4390-8bac-c36d5ff927c3" containerName="storage-initializer" containerID="cri-o://666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f" gracePeriod=30 Apr 24 14:45:34.487407 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.487362 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6"] Apr 24 14:45:34.487697 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.487682 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e3fecf5-c573-4500-8493-843d4230c9b1" containerName="storage-initializer" Apr 24 14:45:34.487745 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.487699 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e3fecf5-c573-4500-8493-843d4230c9b1" containerName="storage-initializer" Apr 24 14:45:34.487745 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.487714 
2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" Apr 24 14:45:34.487745 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.487720 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" Apr 24 14:45:34.487745 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.487736 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="storage-initializer" Apr 24 14:45:34.487745 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.487741 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="storage-initializer" Apr 24 14:45:34.487919 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.487790 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e3fecf5-c573-4500-8493-843d4230c9b1" containerName="storage-initializer" Apr 24 14:45:34.487919 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.487802 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ed1b3fd-1a1c-4ab8-a44d-bd6efbf52a29" containerName="kserve-container" Apr 24 14:45:34.487919 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.487862 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e3fecf5-c573-4500-8493-843d4230c9b1" containerName="storage-initializer" Apr 24 14:45:34.487919 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.487868 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e3fecf5-c573-4500-8493-843d4230c9b1" containerName="storage-initializer" Apr 24 14:45:34.488057 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.487935 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e3fecf5-c573-4500-8493-843d4230c9b1" containerName="storage-initializer" Apr 24 14:45:34.492310 ip-10-0-143-92 
kubenswrapper[2574]: I0424 14:45:34.492286 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" Apr 24 14:45:34.494364 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.494338 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qcgw2\"" Apr 24 14:45:34.500142 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.500112 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6"] Apr 24 14:45:34.599001 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.598957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e8090e-8f94-420c-bb45-52d442cf00d0-kserve-provision-location\") pod \"raw-sklearn-de5d9-predictor-5945d675c9-qn4w6\" (UID: \"a5e8090e-8f94-420c-bb45-52d442cf00d0\") " pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" Apr 24 14:45:34.699662 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.699559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e8090e-8f94-420c-bb45-52d442cf00d0-kserve-provision-location\") pod \"raw-sklearn-de5d9-predictor-5945d675c9-qn4w6\" (UID: \"a5e8090e-8f94-420c-bb45-52d442cf00d0\") " pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" Apr 24 14:45:34.699972 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.699953 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e8090e-8f94-420c-bb45-52d442cf00d0-kserve-provision-location\") pod \"raw-sklearn-de5d9-predictor-5945d675c9-qn4w6\" (UID: \"a5e8090e-8f94-420c-bb45-52d442cf00d0\") " 
pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" Apr 24 14:45:34.803845 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.803790 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" Apr 24 14:45:34.919777 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.919610 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6"] Apr 24 14:45:34.922328 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:45:34.922298 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5e8090e_8f94_420c_bb45_52d442cf00d0.slice/crio-ada18fcf4f490dddebfbef3d6d356673ce4a35edf541c2d3d9722547de9f0df5 WatchSource:0}: Error finding container ada18fcf4f490dddebfbef3d6d356673ce4a35edf541c2d3d9722547de9f0df5: Status 404 returned error can't find the container with id ada18fcf4f490dddebfbef3d6d356673ce4a35edf541c2d3d9722547de9f0df5 Apr 24 14:45:34.953033 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:34.952974 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" event={"ID":"a5e8090e-8f94-420c-bb45-52d442cf00d0","Type":"ContainerStarted","Data":"ada18fcf4f490dddebfbef3d6d356673ce4a35edf541c2d3d9722547de9f0df5"} Apr 24 14:45:35.957919 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:35.957869 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" event={"ID":"a5e8090e-8f94-420c-bb45-52d442cf00d0","Type":"ContainerStarted","Data":"8e622b29c1c5f4c31e5b25a9a4355a188a4c75fc2dab73dbdd2aa1466c4af846"} Apr 24 14:45:36.194999 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.194975 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k_a3da90f3-7d28-4390-8bac-c36d5ff927c3/storage-initializer/1.log" Apr 24 14:45:36.195385 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.195368 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k_a3da90f3-7d28-4390-8bac-c36d5ff927c3/storage-initializer/0.log" Apr 24 14:45:36.195442 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.195429 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" Apr 24 14:45:36.314221 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.314191 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3da90f3-7d28-4390-8bac-c36d5ff927c3-kserve-provision-location\") pod \"a3da90f3-7d28-4390-8bac-c36d5ff927c3\" (UID: \"a3da90f3-7d28-4390-8bac-c36d5ff927c3\") " Apr 24 14:45:36.314406 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.314245 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3da90f3-7d28-4390-8bac-c36d5ff927c3-cabundle-cert\") pod \"a3da90f3-7d28-4390-8bac-c36d5ff927c3\" (UID: \"a3da90f3-7d28-4390-8bac-c36d5ff927c3\") " Apr 24 14:45:36.314551 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.314520 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3da90f3-7d28-4390-8bac-c36d5ff927c3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a3da90f3-7d28-4390-8bac-c36d5ff927c3" (UID: "a3da90f3-7d28-4390-8bac-c36d5ff927c3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:45:36.314675 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.314649 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3da90f3-7d28-4390-8bac-c36d5ff927c3-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "a3da90f3-7d28-4390-8bac-c36d5ff927c3" (UID: "a3da90f3-7d28-4390-8bac-c36d5ff927c3"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:45:36.314826 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.314693 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3da90f3-7d28-4390-8bac-c36d5ff927c3-kserve-provision-location\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:45:36.415634 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.415597 2574 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3da90f3-7d28-4390-8bac-c36d5ff927c3-cabundle-cert\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\"" Apr 24 14:45:36.962428 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.962399 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k_a3da90f3-7d28-4390-8bac-c36d5ff927c3/storage-initializer/1.log" Apr 24 14:45:36.962841 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.962736 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k_a3da90f3-7d28-4390-8bac-c36d5ff927c3/storage-initializer/0.log" Apr 24 14:45:36.962841 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.962768 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3da90f3-7d28-4390-8bac-c36d5ff927c3" containerID="666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f" exitCode=1 Apr 24 
14:45:36.962977 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.962841 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" Apr 24 14:45:36.962977 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.962870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" event={"ID":"a3da90f3-7d28-4390-8bac-c36d5ff927c3","Type":"ContainerDied","Data":"666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f"} Apr 24 14:45:36.962977 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.962933 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k" event={"ID":"a3da90f3-7d28-4390-8bac-c36d5ff927c3","Type":"ContainerDied","Data":"efe561a8473a8f9177aabb994d81d092ddc298f892f7dfcb4aeafc0780533da2"} Apr 24 14:45:36.962977 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.962960 2574 scope.go:117] "RemoveContainer" containerID="666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f" Apr 24 14:45:36.971205 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.971184 2574 scope.go:117] "RemoveContainer" containerID="9197beeb41487da8282253ff7d51d288a58663f621696b34c8392748a071164d" Apr 24 14:45:36.980697 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.980674 2574 scope.go:117] "RemoveContainer" containerID="666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f" Apr 24 14:45:36.981007 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:45:36.980980 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f\": container with ID starting with 666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f not found: ID does not exist" 
containerID="666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f" Apr 24 14:45:36.981092 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.981027 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f"} err="failed to get container status \"666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f\": rpc error: code = NotFound desc = could not find container \"666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f\": container with ID starting with 666f2606f01f2fbc8772566b6de174c91652c26029541c47b79eb19c1209443f not found: ID does not exist" Apr 24 14:45:36.981092 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.981053 2574 scope.go:117] "RemoveContainer" containerID="9197beeb41487da8282253ff7d51d288a58663f621696b34c8392748a071164d" Apr 24 14:45:36.981504 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:45:36.981476 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9197beeb41487da8282253ff7d51d288a58663f621696b34c8392748a071164d\": container with ID starting with 9197beeb41487da8282253ff7d51d288a58663f621696b34c8392748a071164d not found: ID does not exist" containerID="9197beeb41487da8282253ff7d51d288a58663f621696b34c8392748a071164d" Apr 24 14:45:36.981593 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:36.981546 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9197beeb41487da8282253ff7d51d288a58663f621696b34c8392748a071164d"} err="failed to get container status \"9197beeb41487da8282253ff7d51d288a58663f621696b34c8392748a071164d\": rpc error: code = NotFound desc = could not find container \"9197beeb41487da8282253ff7d51d288a58663f621696b34c8392748a071164d\": container with ID starting with 9197beeb41487da8282253ff7d51d288a58663f621696b34c8392748a071164d not found: ID does not exist" Apr 24 
14:45:37.001069 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:37.001033 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k"]
Apr 24 14:45:37.005563 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:37.005537 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-3988b2-predictor-6ccd758dd9-gdp2k"]
Apr 24 14:45:37.169184 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:37.169149 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3da90f3-7d28-4390-8bac-c36d5ff927c3" path="/var/lib/kubelet/pods/a3da90f3-7d28-4390-8bac-c36d5ff927c3/volumes"
Apr 24 14:45:38.970735 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:38.970649 2574 generic.go:358] "Generic (PLEG): container finished" podID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerID="8e622b29c1c5f4c31e5b25a9a4355a188a4c75fc2dab73dbdd2aa1466c4af846" exitCode=0
Apr 24 14:45:38.970735 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:38.970699 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" event={"ID":"a5e8090e-8f94-420c-bb45-52d442cf00d0","Type":"ContainerDied","Data":"8e622b29c1c5f4c31e5b25a9a4355a188a4c75fc2dab73dbdd2aa1466c4af846"}
Apr 24 14:45:39.975384 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:39.975353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" event={"ID":"a5e8090e-8f94-420c-bb45-52d442cf00d0","Type":"ContainerStarted","Data":"d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b"}
Apr 24 14:45:39.975768 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:39.975635 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6"
Apr 24 14:45:39.977028 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:39.976999 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 14:45:39.992384 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:39.992328 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" podStartSLOduration=5.992312229 podStartE2EDuration="5.992312229s" podCreationTimestamp="2026-04-24 14:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:45:39.990105109 +0000 UTC m=+1267.428621990" watchObservedRunningTime="2026-04-24 14:45:39.992312229 +0000 UTC m=+1267.430829090"
Apr 24 14:45:40.979047 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:40.979007 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 14:45:50.979551 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:45:50.979503 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 14:46:00.979141 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:00.979092 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 14:46:10.979346 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:10.979299 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 14:46:20.979769 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:20.979722 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 14:46:30.979292 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:30.979242 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 14:46:40.979550 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:40.979448 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 14:46:45.170231 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:45.170199 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6"
Apr 24 14:46:54.604621 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.604570 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6"]
Apr 24 14:46:54.605113 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.604916 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container" containerID="cri-o://d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b" gracePeriod=30
Apr 24 14:46:54.673840 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.673796 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"]
Apr 24 14:46:54.674131 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.674117 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3da90f3-7d28-4390-8bac-c36d5ff927c3" containerName="storage-initializer"
Apr 24 14:46:54.674177 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.674133 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3da90f3-7d28-4390-8bac-c36d5ff927c3" containerName="storage-initializer"
Apr 24 14:46:54.674177 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.674144 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3da90f3-7d28-4390-8bac-c36d5ff927c3" containerName="storage-initializer"
Apr 24 14:46:54.674177 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.674150 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3da90f3-7d28-4390-8bac-c36d5ff927c3" containerName="storage-initializer"
Apr 24 14:46:54.674266 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.674196 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3da90f3-7d28-4390-8bac-c36d5ff927c3" containerName="storage-initializer"
Apr 24 14:46:54.674266 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.674205 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3da90f3-7d28-4390-8bac-c36d5ff927c3" containerName="storage-initializer"
Apr 24 14:46:54.677099 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.677076 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"
Apr 24 14:46:54.685142 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.685121 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"]
Apr 24 14:46:54.822043 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.822004 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d665915-0b4d-4cf2-b448-dba8cdcbd553-kserve-provision-location\") pod \"raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658\" (UID: \"1d665915-0b4d-4cf2-b448-dba8cdcbd553\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"
Apr 24 14:46:54.922815 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.922724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d665915-0b4d-4cf2-b448-dba8cdcbd553-kserve-provision-location\") pod \"raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658\" (UID: \"1d665915-0b4d-4cf2-b448-dba8cdcbd553\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"
Apr 24 14:46:54.923143 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.923121 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d665915-0b4d-4cf2-b448-dba8cdcbd553-kserve-provision-location\") pod \"raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658\" (UID: \"1d665915-0b4d-4cf2-b448-dba8cdcbd553\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"
Apr 24 14:46:54.987930 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:54.987855 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"
Apr 24 14:46:55.108350 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:55.108316 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"]
Apr 24 14:46:55.111905 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:46:55.111858 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d665915_0b4d_4cf2_b448_dba8cdcbd553.slice/crio-455cf9646012fa6008472bd3644577041636fa440bc7c078c78f7084a0ae7383 WatchSource:0}: Error finding container 455cf9646012fa6008472bd3644577041636fa440bc7c078c78f7084a0ae7383: Status 404 returned error can't find the container with id 455cf9646012fa6008472bd3644577041636fa440bc7c078c78f7084a0ae7383
Apr 24 14:46:55.166210 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:55.166144 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 14:46:55.191469 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:55.191381 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" event={"ID":"1d665915-0b4d-4cf2-b448-dba8cdcbd553","Type":"ContainerStarted","Data":"c907f3527fad5a4e34152b8be26e253a96e4cbe85c00e7c67bd22a31f4861903"}
Apr 24 14:46:55.191469 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:55.191420 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" event={"ID":"1d665915-0b4d-4cf2-b448-dba8cdcbd553","Type":"ContainerStarted","Data":"455cf9646012fa6008472bd3644577041636fa440bc7c078c78f7084a0ae7383"}
Apr 24 14:46:59.203638 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:59.203602 2574 generic.go:358] "Generic (PLEG): container finished" podID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerID="c907f3527fad5a4e34152b8be26e253a96e4cbe85c00e7c67bd22a31f4861903" exitCode=0
Apr 24 14:46:59.204068 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:59.203656 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" event={"ID":"1d665915-0b4d-4cf2-b448-dba8cdcbd553","Type":"ContainerDied","Data":"c907f3527fad5a4e34152b8be26e253a96e4cbe85c00e7c67bd22a31f4861903"}
Apr 24 14:46:59.350317 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:59.350293 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6"
Apr 24 14:46:59.458850 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:59.458751 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e8090e-8f94-420c-bb45-52d442cf00d0-kserve-provision-location\") pod \"a5e8090e-8f94-420c-bb45-52d442cf00d0\" (UID: \"a5e8090e-8f94-420c-bb45-52d442cf00d0\") "
Apr 24 14:46:59.459092 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:59.459070 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e8090e-8f94-420c-bb45-52d442cf00d0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a5e8090e-8f94-420c-bb45-52d442cf00d0" (UID: "a5e8090e-8f94-420c-bb45-52d442cf00d0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:46:59.559712 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:46:59.559666 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e8090e-8f94-420c-bb45-52d442cf00d0-kserve-provision-location\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\""
Apr 24 14:47:00.208216 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.208181 2574 generic.go:358] "Generic (PLEG): container finished" podID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerID="d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b" exitCode=0
Apr 24 14:47:00.208793 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.208256 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6"
Apr 24 14:47:00.208793 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.208265 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" event={"ID":"a5e8090e-8f94-420c-bb45-52d442cf00d0","Type":"ContainerDied","Data":"d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b"}
Apr 24 14:47:00.208793 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.208302 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6" event={"ID":"a5e8090e-8f94-420c-bb45-52d442cf00d0","Type":"ContainerDied","Data":"ada18fcf4f490dddebfbef3d6d356673ce4a35edf541c2d3d9722547de9f0df5"}
Apr 24 14:47:00.208793 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.208318 2574 scope.go:117] "RemoveContainer" containerID="d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b"
Apr 24 14:47:00.209916 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.209898 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" event={"ID":"1d665915-0b4d-4cf2-b448-dba8cdcbd553","Type":"ContainerStarted","Data":"0e720082c9e791da1c522857c75d2757d58b1a89d8916a015bac19f78fb0e29e"}
Apr 24 14:47:00.210177 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.210162 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"
Apr 24 14:47:00.211595 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.211560 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 14:47:00.215958 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.215935 2574 scope.go:117] "RemoveContainer" containerID="8e622b29c1c5f4c31e5b25a9a4355a188a4c75fc2dab73dbdd2aa1466c4af846"
Apr 24 14:47:00.222860 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.222842 2574 scope.go:117] "RemoveContainer" containerID="d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b"
Apr 24 14:47:00.223195 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:47:00.223175 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b\": container with ID starting with d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b not found: ID does not exist" containerID="d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b"
Apr 24 14:47:00.223244 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.223204 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b"} err="failed to get container status \"d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b\": rpc error: code = NotFound desc = could not find container \"d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b\": container with ID starting with d76050d3158f8a17ee7cfb9931486e233ee9ef36befc2a631aea801a0e0eed0b not found: ID does not exist"
Apr 24 14:47:00.223244 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.223223 2574 scope.go:117] "RemoveContainer" containerID="8e622b29c1c5f4c31e5b25a9a4355a188a4c75fc2dab73dbdd2aa1466c4af846"
Apr 24 14:47:00.223448 ip-10-0-143-92 kubenswrapper[2574]: E0424 14:47:00.223431 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e622b29c1c5f4c31e5b25a9a4355a188a4c75fc2dab73dbdd2aa1466c4af846\": container with ID starting with 8e622b29c1c5f4c31e5b25a9a4355a188a4c75fc2dab73dbdd2aa1466c4af846 not found: ID does not exist" containerID="8e622b29c1c5f4c31e5b25a9a4355a188a4c75fc2dab73dbdd2aa1466c4af846"
Apr 24 14:47:00.223489 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.223454 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e622b29c1c5f4c31e5b25a9a4355a188a4c75fc2dab73dbdd2aa1466c4af846"} err="failed to get container status \"8e622b29c1c5f4c31e5b25a9a4355a188a4c75fc2dab73dbdd2aa1466c4af846\": rpc error: code = NotFound desc = could not find container \"8e622b29c1c5f4c31e5b25a9a4355a188a4c75fc2dab73dbdd2aa1466c4af846\": container with ID starting with 8e622b29c1c5f4c31e5b25a9a4355a188a4c75fc2dab73dbdd2aa1466c4af846 not found: ID does not exist"
Apr 24 14:47:00.234237 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.234194 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" podStartSLOduration=6.23418342 podStartE2EDuration="6.23418342s" podCreationTimestamp="2026-04-24 14:46:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:47:00.232257855 +0000 UTC m=+1347.670774714" watchObservedRunningTime="2026-04-24 14:47:00.23418342 +0000 UTC m=+1347.672700279"
Apr 24 14:47:00.246776 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.246743 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6"]
Apr 24 14:47:00.251945 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:00.251920 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-de5d9-predictor-5945d675c9-qn4w6"]
Apr 24 14:47:01.169952 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:01.169910 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" path="/var/lib/kubelet/pods/a5e8090e-8f94-420c-bb45-52d442cf00d0/volumes"
Apr 24 14:47:01.214823 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:01.214781 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 14:47:11.215596 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:11.215547 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 14:47:21.215639 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:21.215595 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 14:47:31.214810 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:31.214757 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 14:47:41.215368 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:41.215318 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 14:47:51.215542 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:47:51.215500 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 14:48:01.215174 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:01.215129 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 14:48:11.216839 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:11.216757 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"
Apr 24 14:48:14.799602 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:14.799555 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"]
Apr 24 14:48:14.800076 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:14.799934 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="kserve-container" containerID="cri-o://0e720082c9e791da1c522857c75d2757d58b1a89d8916a015bac19f78fb0e29e" gracePeriod=30
Apr 24 14:48:19.438575 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:19.438545 2574 generic.go:358] "Generic (PLEG): container finished" podID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerID="0e720082c9e791da1c522857c75d2757d58b1a89d8916a015bac19f78fb0e29e" exitCode=0
Apr 24 14:48:19.438951 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:19.438607 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" event={"ID":"1d665915-0b4d-4cf2-b448-dba8cdcbd553","Type":"ContainerDied","Data":"0e720082c9e791da1c522857c75d2757d58b1a89d8916a015bac19f78fb0e29e"}
Apr 24 14:48:19.438951 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:19.438640 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658" event={"ID":"1d665915-0b4d-4cf2-b448-dba8cdcbd553","Type":"ContainerDied","Data":"455cf9646012fa6008472bd3644577041636fa440bc7c078c78f7084a0ae7383"}
Apr 24 14:48:19.438951 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:19.438662 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="455cf9646012fa6008472bd3644577041636fa440bc7c078c78f7084a0ae7383"
Apr 24 14:48:19.439212 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:19.439197 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"
Apr 24 14:48:19.603789 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:19.603753 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d665915-0b4d-4cf2-b448-dba8cdcbd553-kserve-provision-location\") pod \"1d665915-0b4d-4cf2-b448-dba8cdcbd553\" (UID: \"1d665915-0b4d-4cf2-b448-dba8cdcbd553\") "
Apr 24 14:48:19.604124 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:19.604096 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d665915-0b4d-4cf2-b448-dba8cdcbd553-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1d665915-0b4d-4cf2-b448-dba8cdcbd553" (UID: "1d665915-0b4d-4cf2-b448-dba8cdcbd553"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:48:19.704646 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:19.704612 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d665915-0b4d-4cf2-b448-dba8cdcbd553-kserve-provision-location\") on node \"ip-10-0-143-92.ec2.internal\" DevicePath \"\""
Apr 24 14:48:20.441526 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:20.441492 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"
Apr 24 14:48:20.462746 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:20.462717 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"]
Apr 24 14:48:20.467310 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:20.467286 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-a818b-predictor-5fcf798447-sj658"]
Apr 24 14:48:21.170453 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:21.170414 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" path="/var/lib/kubelet/pods/1d665915-0b4d-4cf2-b448-dba8cdcbd553/volumes"
Apr 24 14:48:43.479383 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:43.479357 2574 ???:1] "http: TLS handshake error from 10.0.137.95:40328: EOF"
Apr 24 14:48:43.482741 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:43.482717 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hpfkc_e644e633-8a12-412b-b1a2-812f9fe784ed/global-pull-secret-syncer/0.log"
Apr 24 14:48:43.659086 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:43.659055 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-hpqs6_cfa67663-2a88-4a6b-8d07-8d08e626c4f4/konnectivity-agent/0.log"
Apr 24 14:48:43.812381 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:43.812347 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-92.ec2.internal_d4713dfd221608fbf94fe833ef06f7d0/haproxy/0.log"
Apr 24 14:48:47.465778 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:47.465744 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7fcdb4cbc7-jhkk7_0a95f466-fd08-40e3-9338-b67203aa3373/metrics-server/0.log"
Apr 24 14:48:47.531207 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:47.531177 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p8d6x_3dcd29a6-adb7-427f-be07-62c84201acc0/node-exporter/0.log"
Apr 24 14:48:47.556487 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:47.556463 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p8d6x_3dcd29a6-adb7-427f-be07-62c84201acc0/kube-rbac-proxy/0.log"
Apr 24 14:48:47.585662 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:47.585632 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p8d6x_3dcd29a6-adb7-427f-be07-62c84201acc0/init-textfile/0.log"
Apr 24 14:48:48.168136 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:48.168104 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-vbs42_be9905d6-7fac-4542-8b90-4622fea9cffa/prometheus-operator-admission-webhook/0.log"
Apr 24 14:48:48.351045 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:48.351016 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c496764cd-nnbcv_cd322932-bc7a-4baa-b093-8310b1da5e93/thanos-query/0.log"
Apr 24 14:48:48.382392 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:48.382367 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c496764cd-nnbcv_cd322932-bc7a-4baa-b093-8310b1da5e93/kube-rbac-proxy-web/0.log"
Apr 24 14:48:48.413388 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:48.413360 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c496764cd-nnbcv_cd322932-bc7a-4baa-b093-8310b1da5e93/kube-rbac-proxy/0.log"
Apr 24 14:48:48.444735 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:48.444666 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c496764cd-nnbcv_cd322932-bc7a-4baa-b093-8310b1da5e93/prom-label-proxy/0.log"
Apr 24 14:48:48.474182 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:48.474151 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c496764cd-nnbcv_cd322932-bc7a-4baa-b093-8310b1da5e93/kube-rbac-proxy-rules/0.log"
Apr 24 14:48:48.510720 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:48.510690 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c496764cd-nnbcv_cd322932-bc7a-4baa-b093-8310b1da5e93/kube-rbac-proxy-metrics/0.log"
Apr 24 14:48:50.357128 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.357098 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dbd598d65-vqcxc_4747dcd2-3b22-48a3-8276-3b1b093a007e/console/0.log"
Apr 24 14:48:50.571575 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.571538 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl"]
Apr 24 14:48:50.571854 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.571841 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="storage-initializer"
Apr 24 14:48:50.571937 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.571855 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="storage-initializer"
Apr 24 14:48:50.571937 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.571871 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="storage-initializer"
Apr 24 14:48:50.571937 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.571891 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="storage-initializer"
Apr 24 14:48:50.571937 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.571898 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="kserve-container"
Apr 24 14:48:50.571937 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.571904 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="kserve-container"
Apr 24 14:48:50.571937 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.571910 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container"
Apr 24 14:48:50.571937 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.571915 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container"
Apr 24 14:48:50.572171 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.571966 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5e8090e-8f94-420c-bb45-52d442cf00d0" containerName="kserve-container"
Apr 24 14:48:50.572171 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.571976 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d665915-0b4d-4cf2-b448-dba8cdcbd553" containerName="kserve-container"
Apr 24 14:48:50.574895 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.574866 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl"
Apr 24 14:48:50.576802 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.576783 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7d7dd\"/\"kube-root-ca.crt\""
Apr 24 14:48:50.576802 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.576796 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7d7dd\"/\"openshift-service-ca.crt\""
Apr 24 14:48:50.577286 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.577272 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7d7dd\"/\"default-dockercfg-64fdx\""
Apr 24 14:48:50.584183 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.584157 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl"]
Apr 24 14:48:50.631001 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.630904 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-sys\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl"
Apr 24 14:48:50.631001 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.630954 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgk9g\" (UniqueName: \"kubernetes.io/projected/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-kube-api-access-wgk9g\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl"
Apr 24 14:48:50.631001 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.631002 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-podres\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl"
Apr 24 14:48:50.631240 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.631075 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-proc\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl"
Apr 24 14:48:50.631240 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.631090 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-lib-modules\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl"
Apr 24 14:48:50.732355 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.732314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-podres\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl"
Apr 24 14:48:50.732551 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.732395 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-proc\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") "
pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" Apr 24 14:48:50.732551 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.732422 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-lib-modules\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" Apr 24 14:48:50.732551 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.732458 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-sys\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" Apr 24 14:48:50.732551 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.732485 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgk9g\" (UniqueName: \"kubernetes.io/projected/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-kube-api-access-wgk9g\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" Apr 24 14:48:50.732551 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.732493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-podres\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" Apr 24 14:48:50.732551 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.732520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-proc\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" Apr 24 14:48:50.732551 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.732540 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-sys\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" Apr 24 14:48:50.732970 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.732567 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-lib-modules\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" Apr 24 14:48:50.743308 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.743282 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgk9g\" (UniqueName: \"kubernetes.io/projected/4ca6c27d-388e-47b8-856e-bdea7b26a1a1-kube-api-access-wgk9g\") pod \"perf-node-gather-daemonset-5ztsl\" (UID: \"4ca6c27d-388e-47b8-856e-bdea7b26a1a1\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" Apr 24 14:48:50.885512 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:50.885409 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" Apr 24 14:48:51.013124 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:51.013085 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl"] Apr 24 14:48:51.016441 ip-10-0-143-92 kubenswrapper[2574]: W0424 14:48:51.016412 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4ca6c27d_388e_47b8_856e_bdea7b26a1a1.slice/crio-b62b93907d71cd23f7cf6d0863916e860bf9fee5013ea1ed357274dd3b00147a WatchSource:0}: Error finding container b62b93907d71cd23f7cf6d0863916e860bf9fee5013ea1ed357274dd3b00147a: Status 404 returned error can't find the container with id b62b93907d71cd23f7cf6d0863916e860bf9fee5013ea1ed357274dd3b00147a Apr 24 14:48:51.018164 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:51.018146 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:48:51.526678 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:51.526640 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" event={"ID":"4ca6c27d-388e-47b8-856e-bdea7b26a1a1","Type":"ContainerStarted","Data":"bbfa155b076da547e38ecb267869d58a885db494824116debd8bbcda06243714"} Apr 24 14:48:51.526678 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:51.526679 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" event={"ID":"4ca6c27d-388e-47b8-856e-bdea7b26a1a1","Type":"ContainerStarted","Data":"b62b93907d71cd23f7cf6d0863916e860bf9fee5013ea1ed357274dd3b00147a"} Apr 24 14:48:51.527142 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:51.526703 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" Apr 24 14:48:51.542042 ip-10-0-143-92 kubenswrapper[2574]: I0424 
14:48:51.541996 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" podStartSLOduration=1.541981082 podStartE2EDuration="1.541981082s" podCreationTimestamp="2026-04-24 14:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:48:51.540563506 +0000 UTC m=+1458.979080370" watchObservedRunningTime="2026-04-24 14:48:51.541981082 +0000 UTC m=+1458.980497942" Apr 24 14:48:51.744416 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:51.744387 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wvb5j_5625c3db-794f-4f6c-993d-1477fc0a38b8/dns/0.log" Apr 24 14:48:51.769287 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:51.769252 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wvb5j_5625c3db-794f-4f6c-993d-1477fc0a38b8/kube-rbac-proxy/0.log" Apr 24 14:48:51.825721 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:51.825690 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dkjpq_267d3672-b74b-4241-80b9-1467f130ddd8/dns-node-resolver/0.log" Apr 24 14:48:52.319357 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:52.319327 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6xfh9_96a778f3-6185-45b9-803d-403e973b65b9/node-ca/0.log" Apr 24 14:48:53.577976 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:53.577944 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-t5b97_3b39242d-971a-47cb-9943-42846bc6d8b6/serve-healthcheck-canary/0.log" Apr 24 14:48:54.118244 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:54.118212 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-q2c42_0a903af0-61c5-40c5-b41e-988f215668f9/kube-rbac-proxy/0.log" Apr 24 14:48:54.143717 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:54.143691 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q2c42_0a903af0-61c5-40c5-b41e-988f215668f9/exporter/0.log" Apr 24 14:48:54.169602 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:54.169568 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q2c42_0a903af0-61c5-40c5-b41e-988f215668f9/extractor/0.log" Apr 24 14:48:56.185319 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:56.185286 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-b7dc77d59-qp9rk_4c49537d-7a5d-4934-9cc6-7f1853e0a401/manager/0.log" Apr 24 14:48:56.510009 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:56.509931 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-t7lp5_a5ec2e7d-4ab3-4b53-95f2-95e438354d70/seaweedfs/0.log" Apr 24 14:48:57.539571 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:48:57.539540 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-5ztsl" Apr 24 14:49:02.552030 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:02.551998 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqgmq_dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef/kube-multus-additional-cni-plugins/0.log" Apr 24 14:49:02.579409 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:02.579376 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqgmq_dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef/egress-router-binary-copy/0.log" Apr 24 14:49:02.604814 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:02.604789 2574 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqgmq_dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef/cni-plugins/0.log" Apr 24 14:49:02.630962 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:02.630908 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqgmq_dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef/bond-cni-plugin/0.log" Apr 24 14:49:02.656782 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:02.656755 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqgmq_dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef/routeoverride-cni/0.log" Apr 24 14:49:02.682189 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:02.682161 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqgmq_dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef/whereabouts-cni-bincopy/0.log" Apr 24 14:49:02.708364 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:02.708336 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqgmq_dbfdd18f-56c4-4ff4-9933-f7b18e2d71ef/whereabouts-cni/0.log" Apr 24 14:49:02.811760 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:02.811734 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zzdsp_afe77287-0ba9-4aaf-865a-6dc077e51a54/kube-multus/0.log" Apr 24 14:49:02.958389 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:02.958355 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ncpbf_bee39eb9-c473-4f55-a88c-427f97349f6c/network-metrics-daemon/0.log" Apr 24 14:49:02.982518 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:02.982486 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ncpbf_bee39eb9-c473-4f55-a88c-427f97349f6c/kube-rbac-proxy/0.log" Apr 24 14:49:04.198602 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:04.198567 
2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb7mf_01aa8393-c538-4d6b-a611-6016be7c4a85/ovn-controller/0.log" Apr 24 14:49:04.233771 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:04.233741 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb7mf_01aa8393-c538-4d6b-a611-6016be7c4a85/ovn-acl-logging/0.log" Apr 24 14:49:04.260930 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:04.260875 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb7mf_01aa8393-c538-4d6b-a611-6016be7c4a85/kube-rbac-proxy-node/0.log" Apr 24 14:49:04.286998 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:04.286951 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb7mf_01aa8393-c538-4d6b-a611-6016be7c4a85/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 14:49:04.308068 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:04.308036 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb7mf_01aa8393-c538-4d6b-a611-6016be7c4a85/northd/0.log" Apr 24 14:49:04.334512 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:04.334469 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb7mf_01aa8393-c538-4d6b-a611-6016be7c4a85/nbdb/0.log" Apr 24 14:49:04.362352 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:04.362318 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb7mf_01aa8393-c538-4d6b-a611-6016be7c4a85/sbdb/0.log" Apr 24 14:49:04.530500 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:04.530410 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hb7mf_01aa8393-c538-4d6b-a611-6016be7c4a85/ovnkube-controller/0.log" Apr 24 14:49:06.084276 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:06.084243 2574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-c5dxc_27ab69b2-98a6-4382-8921-0b4c9485c514/network-check-target-container/0.log" Apr 24 14:49:07.171080 ip-10-0-143-92 kubenswrapper[2574]: I0424 14:49:07.171050 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-h8gj8_7b326316-4b11-4b19-9e26-40d9a7795d9b/iptables-alerter/0.log"