Apr 23 13:29:48.572105 ip-10-0-136-158 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 13:29:48.572122 ip-10-0-136-158 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 13:29:48.572132 ip-10-0-136-158 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 13:29:48.572469 ip-10-0-136-158 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 13:29:58.647779 ip-10-0-136-158 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 13:29:58.647793 ip-10-0-136-158 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 16f0944d001342bdb769ce6f069c3c8f --
Apr 23 13:32:25.723404 ip-10-0-136-158 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 13:32:26.158731 ip-10-0-136-158 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:32:26.158731 ip-10-0-136-158 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 13:32:26.158731 ip-10-0-136-158 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:32:26.158731 ip-10-0-136-158 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 13:32:26.158731 ip-10-0-136-158 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
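
Every deprecation warning above points at the same fix: move the flag into the file named by --config. Below is a minimal sketch of the equivalent KubeletConfiguration stanza, assuming the field names documented at the linked kubelet-config-file page (containerRuntimeEndpoint, volumePluginDir, systemReserved, evictionHard) and an illustrative scratch path; on OpenShift the live /etc/kubernetes/kubelet.conf is rendered by the machine-config operator and should not be hand-edited.

# Sketch only: config-file counterparts of the deprecated flags above.
cat <<'EOF' > /tmp/kubelet-config-sketch.yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock     # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec # replaces --volume-plugin-dir
systemReserved:                                              # replaces --system-reserved
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
evictionHard:                # --minimum-container-ttl-duration is superseded by eviction settings
  memory.available: 100Mi
EOF
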
Apr 23 13:32:26.161989 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.161888 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 13:32:26.167233 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167206 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:26.167233 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167230 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:32:26.167233 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167234 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:26.167233 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167237 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:32:26.167233 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167240 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:32:26.167233 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167243 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167246 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167249 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167252 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167254 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167257 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167260 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167263 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167265 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167268 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167271 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167274 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167277 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167279 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167281 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167284 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167287 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167289 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167292 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167295 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:26.167468 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167298 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167301 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167304 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167306 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167309 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167311 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167313 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167316 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167318 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167321 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167323 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167326 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167328 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167331 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167334 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167338 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167340 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167343 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167346 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:26.167938 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167350 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167355 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167358 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167361 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167364 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167367 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167370 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167372 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167376 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167379 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167381 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167384 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167387 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167390 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167392 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167395 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167398 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167401 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167404 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167407 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:32:26.168458 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167410 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167412 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167415 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167418 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167423 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167427 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167430 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167433 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167436 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167439 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167441 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167444 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167447 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167449 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167453 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167455 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167458 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167461 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167463 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:32:26.168933 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167466 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167468 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167471 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167869 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167874 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167877 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167879 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167882 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167885 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167888 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167891 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167893 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167896 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167899 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167901 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167904 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167906 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167909 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167911 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167914 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:32:26.169414 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167919 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167921 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167924 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167927 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167929 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167932 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167934 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167937 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167940 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167943 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167945 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167948 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167951 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167953 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167956 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167958 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167961 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167963 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167966 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:32:26.169890 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167968 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167972 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167975 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167979 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167982 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167985 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167987 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167990 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167992 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167995 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.167997 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168000 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168004 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168007 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168010 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168012 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168015 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168017 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168020 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:32:26.170456 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168022 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168026 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
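
The "unrecognized feature gate" warnings are noisy but benign: the names are OpenShift-level gates handed to the kubelet along with the rest of the node configuration, and the upstream gate parser warns on every name it does not know while acting only on the two it does recognize here (the deprecated KMSv1 and the already-GA ServiceAccountTokenNodeBinding). To collapse the repetition into a unique list of gate names, assuming GNU grep is available on the node:

journalctl -u kubelet --no-pager | grep -oP 'unrecognized feature gate: \K\S+' | sort -u
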
Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168030 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168033 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168036 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168038 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168041 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168043 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168046 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168049 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168051 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168073 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168078 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168081 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168084 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168086 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168089 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168092 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168094 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:32:26.170924 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168097 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168099 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168102 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168104 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168107 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168109 2569 
feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168112 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168115 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168118 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168121 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168123 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.168126 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168915 2569 flags.go:64] FLAG: --address="0.0.0.0" Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168925 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168932 2569 flags.go:64] FLAG: --anonymous-auth="true" Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168937 2569 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168942 2569 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168946 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168951 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168956 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168959 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 23 13:32:26.171409 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168962 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168966 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168969 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168973 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168976 2569 flags.go:64] FLAG: --cgroup-root="" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168978 2569 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168981 2569 flags.go:64] FLAG: --client-ca-file="" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168985 2569 flags.go:64] FLAG: --cloud-config="" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168988 2569 flags.go:64] FLAG: --cloud-provider="external" Apr 23 13:32:26.171921 
ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.168990 2569 flags.go:64] FLAG: --cluster-dns="[]" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169011 2569 flags.go:64] FLAG: --cluster-domain="" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169015 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169019 2569 flags.go:64] FLAG: --config-dir="" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169022 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169026 2569 flags.go:64] FLAG: --container-log-max-files="5" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169030 2569 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169033 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169036 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169040 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169043 2569 flags.go:64] FLAG: --contention-profiling="false" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169046 2569 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169049 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169052 2569 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169073 2569 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169079 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 23 13:32:26.171921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169082 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169085 2569 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169088 2569 flags.go:64] FLAG: --enable-load-reader="false" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169092 2569 flags.go:64] FLAG: --enable-server="true" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169095 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169099 2569 flags.go:64] FLAG: --event-burst="100" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169102 2569 flags.go:64] FLAG: --event-qps="50" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169105 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169108 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169111 2569 flags.go:64] FLAG: 
--eviction-hard="" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169115 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169118 2569 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169121 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169124 2569 flags.go:64] FLAG: --eviction-soft="" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169127 2569 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169130 2569 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169132 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169138 2569 flags.go:64] FLAG: --experimental-mounter-path="" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169142 2569 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169145 2569 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169148 2569 flags.go:64] FLAG: --feature-gates="" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169152 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169155 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169158 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169161 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169165 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 23 13:32:26.173260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169168 2569 flags.go:64] FLAG: --help="false" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169170 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-136-158.ec2.internal" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169173 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169176 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169180 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169183 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169187 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169190 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169193 2569 flags.go:64] FLAG: 
--image-service-endpoint="" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169196 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169199 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169203 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169206 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169209 2569 flags.go:64] FLAG: --kube-reserved="" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169212 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169214 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169217 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169220 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169223 2569 flags.go:64] FLAG: --lock-file="" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169226 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169229 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169232 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169237 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 13:32:26.173882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169242 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169245 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169248 2569 flags.go:64] FLAG: --logging-format="text" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169251 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169254 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169257 2569 flags.go:64] FLAG: --manifest-url="" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169260 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169264 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169267 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169274 2569 flags.go:64] FLAG: --max-pods="110" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169277 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169280 2569 flags.go:64] FLAG: 
--maximum-dead-containers-per-container="1" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169283 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169286 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169289 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169292 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169295 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169304 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169307 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169310 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169315 2569 flags.go:64] FLAG: --pod-cidr="" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169318 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169324 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169327 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 13:32:26.174488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169330 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169333 2569 flags.go:64] FLAG: --port="10250" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169336 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169338 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-031cf345c7cd49e78" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169342 2569 flags.go:64] FLAG: --qos-reserved="" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169344 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169347 2569 flags.go:64] FLAG: --register-node="true" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169350 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169354 2569 flags.go:64] FLAG: --register-with-taints="" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169361 2569 flags.go:64] FLAG: --registry-burst="10" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169364 2569 flags.go:64] FLAG: --registry-qps="5" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169367 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169370 2569 flags.go:64] FLAG: --reserved-memory="" Apr 23 13:32:26.175079 
ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169374 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169377 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169380 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169383 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169386 2569 flags.go:64] FLAG: --runonce="false" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169389 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169392 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169395 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169398 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169400 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169403 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169407 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169410 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 13:32:26.175079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169412 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169415 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169418 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169422 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169425 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169429 2569 flags.go:64] FLAG: --system-cgroups="" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169432 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169437 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169440 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169443 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169448 2569 flags.go:64] FLAG: --tls-min-version="" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169450 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169453 2569 flags.go:64] FLAG: 
--topology-manager-policy="none" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169456 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169460 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169463 2569 flags.go:64] FLAG: --v="2" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169468 2569 flags.go:64] FLAG: --version="false" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169472 2569 flags.go:64] FLAG: --vmodule="" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169476 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.169479 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169579 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169583 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169586 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169589 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:32:26.175698 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169592 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169594 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169597 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169599 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169602 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169604 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169607 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169609 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169612 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169615 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169617 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169619 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 
13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169622 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169625 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169627 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169630 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169632 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169634 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169637 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169639 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:32:26.176292 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169642 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169645 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169647 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169650 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169653 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169655 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169658 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169661 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169663 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169666 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169668 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169671 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169674 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169676 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169679 2569 feature_gate.go:328] unrecognized feature gate: 
NetworkLiveMigration Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169681 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169684 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169687 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169691 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169694 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:32:26.176786 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169696 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169699 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169701 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169704 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169706 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169709 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169712 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169714 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169717 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169721 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
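The `flags.go:64] FLAG:` entries above are the kubelet echoing every command-line flag at verbosity `--v=2`, which makes the journal a reliable record of the effective invocation. A minimal sketch for recovering them as a dictionary, assuming the journal has been saved to a local file (`kubelet.log` is a hypothetical name, e.g. from `journalctl -u kubelet -b` run on the node):

```python
import re

# Hypothetical capture: journalctl -u kubelet -b > kubelet.log, run on the node.
flags = {}
with open("kubelet.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        # Each flag is logged as: flags.go:64] FLAG: --name="value"
        for m in re.finditer(r'FLAG: --([\w.-]+)="([^"]*)"', line):
            flags[m.group(1)] = m.group(2)

print(flags.get("v"), flags.get("topology-manager-policy"))
```

Given the entries above, this would print `2 none`.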
Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169724 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169727 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169730 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169733 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169735 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169738 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169741 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169743 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169746 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:32:26.177298 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169749 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169752 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169754 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169757 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169759 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169762 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169764 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169767 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169769 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169772 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169774 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169777 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169779 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 
13:32:26.169782 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169784 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169787 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169789 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169791 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169794 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169797 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:32:26.177801 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169800 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169802 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.169805 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.170710 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.177400 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.177420 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177470 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177476 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177479 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177483 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177486 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177490 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177493 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall 
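The long runs of `unrecognized feature gate` warnings are the upstream kubelet rejecting OpenShift-level gate names (ManagedBootImages, GatewayAPI, and so on) that appear in the rendered kubelet configuration; only Kubernetes-level gates reach the effective set, which the kubelet prints as `feature_gate.go:384] feature gates: {map[...]}`. The configuration is evaluated more than once during startup, so the same warnings and the same map repeat within the same second. A sketch that extracts each printed map and checks that the evaluations agree, under the same hypothetical `kubelet.log` capture as above:

```python
import re
from collections import Counter

text = open("kubelet.log", encoding="utf-8", errors="replace").read()

# Effective gate maps, logged as: feature gates: {map[Name:bool Name:bool ...]}
maps = [
    dict(pair.split(":") for pair in m.group(1).split())
    for m in re.finditer(r"feature gates: \{map\[([^\]]*)\]\}", text)
]

# OpenShift-level names the upstream kubelet does not recognize.
unknown = Counter(re.findall(r"unrecognized feature gate: (\S+)", text))

print(f"{len(maps)} gate evaluations; identical: {all(m == maps[0] for m in maps)}")
print(f"{len(unknown)} distinct unrecognized gates, e.g. {unknown.most_common(3)}")
```

On this boot the evaluations should come back identical, which confirms the repetition is benign rather than a configuration drift.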
Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177495 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177498 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:32:26.178308 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177501 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177505 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177510 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177514 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177517 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177519 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177522 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177525 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177527 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177530 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177534 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177536 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177539 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177542 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177544 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177547 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177557 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177560 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177563 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:32:26.178687 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177567 2569 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177571 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177574 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177577 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177580 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177582 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177585 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177587 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177590 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177593 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177595 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177598 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177600 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177603 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177606 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177608 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177611 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177614 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177616 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177619 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:32:26.179186 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177621 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177624 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177626 2569 feature_gate.go:328] unrecognized feature gate: 
EtcdBackendQuota Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177628 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177631 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177633 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177636 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177638 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177641 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177645 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177648 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177651 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177654 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177656 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177659 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177661 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177664 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177666 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177669 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177671 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:32:26.179684 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177673 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177676 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177679 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177681 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177684 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 
13:32:26.177686 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177688 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177691 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177693 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177696 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177698 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177701 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177703 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177706 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177708 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177711 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177714 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:32:26.180226 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177716 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.177722 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177816 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177820 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177823 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177826 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177830 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177833 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 
13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177836 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177838 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177841 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177844 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177846 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177849 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177851 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:32:26.180645 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177854 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177857 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177859 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177862 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177864 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177867 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177869 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177871 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177874 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177877 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177879 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177883 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
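Two warning classes in this stream deserve attention beyond the unrecognized-gate noise: `feature_gate.go:349` (a deprecated gate such as KMSv1 still set to true) and `feature_gate.go:351` (a GA gate such as ServiceAccountTokenNodeBinding still set explicitly). Both settings keep working today but will fail once the gates are removed. A small filter for just those entries, with the same `kubelet.log` input assumption:

```python
import re

# Same hypothetical kubelet.log as in the sketches above.
seen = set()
with open("kubelet.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        # :349 = deprecated gate still set, :351 = GA gate set explicitly.
        for m in re.finditer(r"feature_gate\.go:(?:349|351)\] (Setting [^.]*\.)", line):
            seen.add(m.group(1))

for warning in sorted(seen):
    print(warning)
```

For this boot the deduplicated output is two lines: the KMSv1 deprecation warning and the ServiceAccountTokenNodeBinding GA warning.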
Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177886 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177889 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177892 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177894 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177897 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177899 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177903 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:32:26.181102 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177906 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177909 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177911 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177915 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177918 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177922 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177924 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177927 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177930 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177932 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177935 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177937 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177940 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177942 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177945 2569 feature_gate.go:328] unrecognized feature gate: 
PinnedImages Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177947 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177950 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177952 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177955 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177957 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:32:26.181562 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177959 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177962 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177964 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177967 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177970 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177972 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177974 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177977 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177979 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177981 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177984 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177986 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177988 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177991 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177993 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177996 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.177999 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178002 2569 feature_gate.go:328] unrecognized 
feature gate: NewOLMOwnSingleNamespace Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178005 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178016 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:32:26.182084 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178019 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178022 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178025 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178027 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178030 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178032 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178035 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178037 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178040 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178042 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178045 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178047 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178050 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:26.178053 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.178077 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:32:26.182569 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.178802 2569 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 13:32:26.182963 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.180958 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 13:32:26.182963 
ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.181911 2569 server.go:1019] "Starting client certificate rotation" Apr 23 13:32:26.182963 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.182020 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 13:32:26.182963 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.182083 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 13:32:26.209599 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.209575 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 13:32:26.212956 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.212938 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 13:32:26.227514 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.227487 2569 log.go:25] "Validated CRI v1 runtime API" Apr 23 13:32:26.233313 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.233289 2569 log.go:25] "Validated CRI v1 image API" Apr 23 13:32:26.235981 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.235960 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 13:32:26.238329 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.238312 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 13:32:26.238979 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.238956 2569 fs.go:135] Filesystem UUIDs: map[1e770ddb-2976-4873-a5e6-5363d83c07be:/dev/nvme0n1p4 58ba0e42-9bf1-4a0f-a69a-8155b2dfddfc:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 23 13:32:26.239017 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.238981 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 13:32:26.244477 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.244352 2569 manager.go:217] Machine: {Timestamp:2026-04-23 13:32:26.243151137 +0000 UTC m=+0.398073596 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098805 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec252d926d62f66a4cdcdaeb0e45cfd1 SystemUUID:ec252d92-6d62-f66a-4cdc-daeb0e45cfd1 BootID:16f0944d-0013-42bd-b769-ce6f069c3c8f Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 
Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:65:1f:1d:a9:8b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:65:1f:1d:a9:8b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:d6:2f:75:1e:44:3e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 13:32:26.244477 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.244468 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 23 13:32:26.244597 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.244557 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 13:32:26.247068 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.247030 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 13:32:26.247230 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.247073 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-136-158.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 13:32:26.247756 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.247745 2569 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 13:32:26.247786 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.247759 2569 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 13:32:26.247786 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.247772 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 13:32:26.248607 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.248596 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 13:32:26.249446 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.249434 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 23 13:32:26.249570 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.249561 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 13:32:26.252309 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.252294 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 23 13:32:26.252382 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.252321 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 13:32:26.252382 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.252338 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 13:32:26.252382 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.252352 2569 kubelet.go:397] "Adding apiserver pod source" Apr 23 13:32:26.252382 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.252361 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 13:32:26.253514 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.253499 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 
13:32:26.253560 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.253520 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 13:32:26.256708 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.256690 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 13:32:26.258261 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.258245 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 13:32:26.260111 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.260095 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 13:32:26.260171 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.260139 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 13:32:26.260171 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.260150 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 13:32:26.260171 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.260167 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 13:32:26.260246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.260177 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 13:32:26.260246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.260187 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 13:32:26.260246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.260196 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 13:32:26.260246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.260204 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 13:32:26.260246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.260215 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 13:32:26.260246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.260223 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 13:32:26.260517 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.260505 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 13:32:26.260551 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.260527 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 13:32:26.261313 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.261290 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pp8vg" Apr 23 13:32:26.261393 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.261373 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 13:32:26.261393 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.261383 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 13:32:26.265725 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.265705 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 13:32:26.265855 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.265751 2569 server.go:1295] "Started kubelet" Apr 23 13:32:26.265923 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.265825 2569 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 23 13:32:26.265958 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.265915 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 13:32:26.266016 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.266001 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 13:32:26.267253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.267224 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-158.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 13:32:26.267464 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.267440 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 13:32:26.267554 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.267538 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-158.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 13:32:26.267849 ip-10-0-136-158 systemd[1]: Started Kubernetes Kubelet. Apr 23 13:32:26.269591 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.268344 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 13:32:26.269866 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.269796 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 23 13:32:26.271452 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.271431 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pp8vg" Apr 23 13:32:26.274860 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.274840 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 13:32:26.275563 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.275546 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 13:32:26.276263 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.274888 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-158.ec2.internal.18a8ffa720643a3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-158.ec2.internal,UID:ip-10-0-136-158.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-158.ec2.internal,},FirstTimestamp:2026-04-23 13:32:26.265721404 +0000 UTC m=+0.420643863,LastTimestamp:2026-04-23 13:32:26.265721404 +0000 UTC m=+0.420643863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-158.ec2.internal,}" Apr 23 13:32:26.276362 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.276352 2569 
desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 13:32:26.276413 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.276354 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 13:32:26.276413 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.276383 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 13:32:26.276506 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.276458 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 23 13:32:26.276506 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.276467 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 23 13:32:26.277944 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.277923 2569 factory.go:55] Registering systemd factory Apr 23 13:32:26.278049 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.277953 2569 factory.go:223] Registration of the systemd container factory successfully Apr 23 13:32:26.278287 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.278272 2569 factory.go:153] Registering CRI-O factory Apr 23 13:32:26.278287 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.278288 2569 factory.go:223] Registration of the crio container factory successfully Apr 23 13:32:26.278388 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.278311 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found" Apr 23 13:32:26.278388 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.278334 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 13:32:26.278388 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.278354 2569 factory.go:103] Registering Raw factory Apr 23 13:32:26.278388 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.278365 2569 manager.go:1196] Started watching for new ooms in manager Apr 23 13:32:26.278728 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.278713 2569 manager.go:319] Starting recovery of all containers Apr 23 13:32:26.278786 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.278753 2569 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 13:32:26.287807 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.287412 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:26.290077 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.290043 2569 manager.go:324] Recovery completed
Apr 23 13:32:26.290705 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.290682 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-136-158.ec2.internal\" not found" node="ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.294811 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.294798 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:26.297270 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.297247 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:26.297361 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.297283 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:26.297361 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.297296 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:26.298304 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.298289 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 13:32:26.298304 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.298303 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 13:32:26.298428 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.298323 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:32:26.300577 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.300559 2569 policy_none.go:49] "None policy: Start"
Apr 23 13:32:26.300577 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.300577 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 13:32:26.300731 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.300588 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 13:32:26.342910 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.342887 2569 manager.go:341] "Starting Device Plugin manager"
Apr 23 13:32:26.353243 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.342931 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 13:32:26.353243 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.342945 2569 server.go:85] "Starting device plugin registration server"
Apr 23 13:32:26.353243 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.343269 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 13:32:26.353243 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.343282 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 13:32:26.353243 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.343355 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 13:32:26.353243 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.343553 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 13:32:26.353243 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.343566 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 13:32:26.353243 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.343978 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 13:32:26.353243 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.344006 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 23 13:32:26.414192 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.414095 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 13:32:26.415488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.415473 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 13:32:26.415579 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.415500 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 13:32:26.415579 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.415522 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 13:32:26.415579 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.415528 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 13:32:26.415579 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.415568 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 13:32:26.418993 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.418973 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:26.444113 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.444076 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:26.445279 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.445261 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:26.445390 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.445294 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:26.445390 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.445308 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:26.445390 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.445340 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.453869 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.453848 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.453951 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.453878 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-158.ec2.internal\": node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 23 13:32:26.474033 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.474006 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 23 13:32:26.515802 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.515769 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"]
Apr 23 13:32:26.515875 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.515869 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:26.516867 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.516845 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:26.516988 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.516885 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:26.516988 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.516897 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:26.518195 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.518183 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:26.518357 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.518342 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.518396 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.518374 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:26.518908 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.518891 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:26.518991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.518914 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:26.518991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.518927 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:26.518991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.518967 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:26.518991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.518987 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:26.519135 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.519001 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:26.520038 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.520019 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.520136 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.520045 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:26.520670 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.520657 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:26.520730 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.520681 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:26.520730 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.520690 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:26.547116 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.547093 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-158.ec2.internal\" not found" node="ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.551625 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.551602 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-158.ec2.internal\" not found" node="ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.574359 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.574323 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 23 13:32:26.577618 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.577599 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2eebab4ba0cac1e68c6bccde729de79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal\" (UID: \"c2eebab4ba0cac1e68c6bccde729de79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.577693 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.577627 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9b4440db6557536c217fdb95da13736d-config\") pod \"kube-apiserver-proxy-ip-10-0-136-158.ec2.internal\" (UID: \"9b4440db6557536c217fdb95da13736d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.577693 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.577648 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c2eebab4ba0cac1e68c6bccde729de79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal\" (UID: \"c2eebab4ba0cac1e68c6bccde729de79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.674841 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.674771 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 23 13:32:26.678116 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.678093 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c2eebab4ba0cac1e68c6bccde729de79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal\" (UID: \"c2eebab4ba0cac1e68c6bccde729de79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.678195 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.678124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2eebab4ba0cac1e68c6bccde729de79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal\" (UID: \"c2eebab4ba0cac1e68c6bccde729de79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.678195 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.678143 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9b4440db6557536c217fdb95da13736d-config\") pod \"kube-apiserver-proxy-ip-10-0-136-158.ec2.internal\" (UID: \"9b4440db6557536c217fdb95da13736d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.678291 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.678209 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9b4440db6557536c217fdb95da13736d-config\") pod \"kube-apiserver-proxy-ip-10-0-136-158.ec2.internal\" (UID: \"9b4440db6557536c217fdb95da13736d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.678291 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.678210 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c2eebab4ba0cac1e68c6bccde729de79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal\" (UID: \"c2eebab4ba0cac1e68c6bccde729de79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.678291 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.678227 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2eebab4ba0cac1e68c6bccde729de79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal\" (UID: \"c2eebab4ba0cac1e68c6bccde729de79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.775567 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.775536 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 23 13:32:26.848968 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.848936 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.854652 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:26.854626 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:26.876324 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.876291 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 23 13:32:26.976921 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:26.976818 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 23 13:32:27.077320 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:27.077286 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 23 13:32:27.177777 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:27.177740 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 23 13:32:27.181988 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.181958 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 13:32:27.182146 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.182125 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:32:27.182196 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.182137 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:32:27.273514 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.273476 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 13:27:26 +0000 UTC" deadline="2028-01-24 20:10:20.37387818 +0000 UTC"
Apr 23 13:32:27.273514 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.273508 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15390h37m53.100373687s"
Apr 23 13:32:27.275640 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.275618 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 13:32:27.278285 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:27.278260 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 23 13:32:27.286363 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.286333 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:32:27.309832 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.309804 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8v99j"
Apr 23 13:32:27.318200 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.318171 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8v99j"
Apr 23 13:32:27.360331 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:27.360295 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2eebab4ba0cac1e68c6bccde729de79.slice/crio-ea18e52f42e0e7f8de3dd5c10ddc8f32bbbe80dbc9828f656fba18e641af56a0 WatchSource:0}: Error finding container ea18e52f42e0e7f8de3dd5c10ddc8f32bbbe80dbc9828f656fba18e641af56a0: Status 404 returned error can't find the container with id ea18e52f42e0e7f8de3dd5c10ddc8f32bbbe80dbc9828f656fba18e641af56a0
Apr 23 13:32:27.360563 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:27.360542 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b4440db6557536c217fdb95da13736d.slice/crio-401bdcc80ebe383513013c4418e1ac56979410bcc07ad168981e9e2b82317a63 WatchSource:0}: Error finding container 401bdcc80ebe383513013c4418e1ac56979410bcc07ad168981e9e2b82317a63: Status 404 returned error can't find the container with id 401bdcc80ebe383513013c4418e1ac56979410bcc07ad168981e9e2b82317a63
Apr 23 13:32:27.365146 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.365120 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:27.366492 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.366476 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:32:27.376567 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.376544 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:27.388642 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.388617 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 13:32:27.390619 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.390601 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"
Apr 23 13:32:27.397222 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.397205 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 13:32:27.419301 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.419237 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal" event={"ID":"c2eebab4ba0cac1e68c6bccde729de79","Type":"ContainerStarted","Data":"ea18e52f42e0e7f8de3dd5c10ddc8f32bbbe80dbc9828f656fba18e641af56a0"}
Apr 23 13:32:27.420217 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.420196 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal" event={"ID":"9b4440db6557536c217fdb95da13736d","Type":"ContainerStarted","Data":"401bdcc80ebe383513013c4418e1ac56979410bcc07ad168981e9e2b82317a63"}
Apr 23 13:32:27.537465 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:27.537385 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:28.254236 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.254203 2569 apiserver.go:52] "Watching apiserver"
Apr 23 13:32:28.264908 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.264877 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 13:32:28.267027 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.266987 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-cj68p","openshift-multus/network-metrics-daemon-dqcwj","openshift-cluster-node-tuning-operator/tuned-tttgj","openshift-dns/node-resolver-nzrks","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal","openshift-network-diagnostics/network-check-target-zw7vm","openshift-network-operator/iptables-alerter-9fc98","openshift-ovn-kubernetes/ovnkube-node-f6dwj","kube-system/konnectivity-agent-zjfsg","kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg","openshift-image-registry/node-ca-xwbp9","openshift-multus/multus-additional-cni-plugins-6lkhk"]
Apr 23 13:32:28.269187 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.269156 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9fc98"
Apr 23 13:32:28.272591 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.271941 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 13:32:28.272591 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.272044 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:32:28.272591 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.272150 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.272591 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.272256 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 13:32:28.272591 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.272589 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-58r5z\""
Apr 23 13:32:28.273853 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.273537 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:28.273853 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:28.273631 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a"
Apr 23 13:32:28.274414 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.274392 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:32:28.274631 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.274609 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 13:32:28.274704 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.274634 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-sr2g2\""
Apr 23 13:32:28.275157 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.274965 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nzrks"
Apr 23 13:32:28.276282 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.276252 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:28.276370 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.276297 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.276370 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:28.276310 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0"
Apr 23 13:32:28.276990 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.276970 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 13:32:28.277561 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.277272 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vk77b\""
Apr 23 13:32:28.277561 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.277334 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 13:32:28.278436 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.278413 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 13:32:28.278857 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.278840 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 13:32:28.278943 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.278919 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 13:32:28.278943 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.278840 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 13:32:28.279101 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.279081 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zjfsg"
Apr 23 13:32:28.279319 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.279301 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8v757\""
Apr 23 13:32:28.280687 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.280583 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.280970 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.280952 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.282260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.282241 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 13:32:28.282260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.282258 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 13:32:28.282502 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.282486 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 13:32:28.282578 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.282516 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dswd5\""
Apr 23 13:32:28.282974 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.282943 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 13:32:28.283132 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.283024 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 13:32:28.283233 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.283215 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.283233 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.283230 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xwbp9"
Apr 23 13:32:28.283341 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.283313 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 13:32:28.283949 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.283933 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 13:32:28.284107 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.284087 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 13:32:28.284309 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.284294 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 13:32:28.284388 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.284362 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pcrnb\""
Apr 23 13:32:28.284564 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.284546 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8cjws\""
Apr 23 13:32:28.284675 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.284635 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 13:32:28.284675 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.284634 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 13:32:28.285222 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.285203 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 13:32:28.285310 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.285298 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 13:32:28.285516 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.285500 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 13:32:28.285678 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.285657 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 13:32:28.285779 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.285767 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zvfbj\""
Apr 23 13:32:28.285989 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.285962 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jcpkp\""
Apr 23 13:32:28.286334 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.286315 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 13:32:28.286936 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.286914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe336864-d139-416a-b1cb-afe14a9db883-cni-binary-copy\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.287025 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.286955 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-systemd-units\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.287025 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.286984 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-node-log\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.287025 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.287004 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-systemd\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.287206 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.287099 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-var-lib-kubelet\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.287206 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.287131 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-multus-conf-dir\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.288045 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288019 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46hzd\" (UniqueName: \"kubernetes.io/projected/fe336864-d139-416a-b1cb-afe14a9db883-kube-api-access-46hzd\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.288146 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288124 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmpjf\" (UniqueName: \"kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf\") pod \"network-check-target-zw7vm\" (UID: \"32951250-c04f-4a66-a62c-e1372b1c84d0\") " pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:28.288225 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288203 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-sys\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.288304 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288280 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-var-lib-cni-bin\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.288359 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288324 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-sysconfig\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.288359 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288352 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-kubernetes\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.288547 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288530 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-cni-netd\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.288724 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288707 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61372c64-9070-4751-b720-a4016030cf02-tmp-dir\") pod \"node-resolver-nzrks\" (UID: \"61372c64-9070-4751-b720-a4016030cf02\") " pod="openshift-dns/node-resolver-nzrks"
Apr 23 13:32:28.288772 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288746 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-kubelet\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.288819 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288783 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-run-openvswitch\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.288869 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288824 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-multus-socket-dir-parent\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.288869 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288853 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-var-lib-cni-multus\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.288963 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288885 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:28.288963 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288915 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szxh8\" (UniqueName: \"kubernetes.io/projected/dc7a9b0c-42a9-4562-a03a-27dca913446a-kube-api-access-szxh8\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:28.288963 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288946 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1237c950-1db9-42f8-be43-fc6424f2ae2c-env-overrides\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.289121 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.288977 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-run-netns\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.289121 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289024 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fe336864-d139-416a-b1cb-afe14a9db883-multus-daemon-config\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.289121 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289077 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-run-netns\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.289251 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289112 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/baa916b1-56d7-46e4-9ccb-a3794c262e34-iptables-alerter-script\") pod \"iptables-alerter-9fc98\" (UID: \"baa916b1-56d7-46e4-9ccb-a3794c262e34\") " pod="openshift-network-operator/iptables-alerter-9fc98"
Apr 23 13:32:28.289251 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289157 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5675\" (UniqueName: \"kubernetes.io/projected/baa916b1-56d7-46e4-9ccb-a3794c262e34-kube-api-access-r5675\") pod \"iptables-alerter-9fc98\" (UID: \"baa916b1-56d7-46e4-9ccb-a3794c262e34\") " pod="openshift-network-operator/iptables-alerter-9fc98"
Apr 23 13:32:28.289251 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289186 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-run-multus-certs\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.289251 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289213 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.289251 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289243 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1237c950-1db9-42f8-be43-fc6424f2ae2c-ovnkube-config\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.289463 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289272 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-slash\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.289463 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289300 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-etc-openvswitch\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.289463 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289330 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/baa916b1-56d7-46e4-9ccb-a3794c262e34-host-slash\") pod \"iptables-alerter-9fc98\" (UID: \"baa916b1-56d7-46e4-9ccb-a3794c262e34\") " pod="openshift-network-operator/iptables-alerter-9fc98"
Apr 23 13:32:28.289463 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289354 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-system-cni-dir\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.289463 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289390 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-run-k8s-cni-cncf-io\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.289463 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289420 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1237c950-1db9-42f8-be43-fc6424f2ae2c-ovnkube-script-lib\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.289463 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289451 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-multus-cni-dir\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.289742 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289480 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-run\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.289742 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289510 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-sysctl-d\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.289742 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289539 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-tuned\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.289742 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289564 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8797a06c-9f6d-4c9f-b8e1-36e99724079b-tmp\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.289742 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289593 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-cnibin\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.289742 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289623 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-etc-kubernetes\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.289742 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289657 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-run-systemd\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.289742 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289692 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-cni-bin\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.290147 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289755 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/61372c64-9070-4751-b720-a4016030cf02-hosts-file\") pod \"node-resolver-nzrks\" (UID: \"61372c64-9070-4751-b720-a4016030cf02\") " pod="openshift-dns/node-resolver-nzrks"
Apr 23 13:32:28.290147 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-run-ovn\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.290147 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.289830 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1237c950-1db9-42f8-be43-fc6424f2ae2c-ovn-node-metrics-cert\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.290147 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290016 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfqrw\" (UniqueName: \"kubernetes.io/projected/1237c950-1db9-42f8-be43-fc6424f2ae2c-kube-api-access-gfqrw\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.290313 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290113 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-os-release\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.290364 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290317 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-sysctl-conf\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.290410 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290362 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-var-lib-kubelet\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.290451 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290413 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp7xz\" (UniqueName: \"kubernetes.io/projected/61372c64-9070-4751-b720-a4016030cf02-kube-api-access-bp7xz\") pod \"node-resolver-nzrks\" (UID: \"61372c64-9070-4751-b720-a4016030cf02\") " pod="openshift-dns/node-resolver-nzrks"
Apr 23 13:32:28.290498 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290445 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-modprobe-d\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.290498 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290476 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-lib-modules\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.290591 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290520 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-host\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.290591 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290564 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jh5s\" (UniqueName: \"kubernetes.io/projected/8797a06c-9f6d-4c9f-b8e1-36e99724079b-kube-api-access-7jh5s\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.290678 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290622 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-hostroot\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.290678 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290654 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-var-lib-openvswitch\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.290761 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290695 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-log-socket\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.290761 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.290728 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.318830 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.318796 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:27:27 +0000 UTC" deadline="2028-01-30 08:02:39.127758066 +0000 UTC"
Apr 23 13:32:28.318830 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.318828 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15522h30m10.808933095s"
Apr 23 13:32:28.377871 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.377836 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 13:32:28.391323 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391281 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/baa916b1-56d7-46e4-9ccb-a3794c262e34-iptables-alerter-script\") pod \"iptables-alerter-9fc98\" (UID: \"baa916b1-56d7-46e4-9ccb-a3794c262e34\") " pod="openshift-network-operator/iptables-alerter-9fc98"
Apr 23 13:32:28.391323 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391332 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5675\" (UniqueName: \"kubernetes.io/projected/baa916b1-56d7-46e4-9ccb-a3794c262e34-kube-api-access-r5675\") pod \"iptables-alerter-9fc98\" (UID: \"baa916b1-56d7-46e4-9ccb-a3794c262e34\") " pod="openshift-network-operator/iptables-alerter-9fc98"
Apr 23 13:32:28.391585 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391361 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-run-multus-certs\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.391585 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391414 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.391585 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391443 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1237c950-1db9-42f8-be43-fc6424f2ae2c-ovnkube-config\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.391585 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391472 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-etc-selinux\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.391585 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391469 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-run-multus-certs\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.391585 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391501 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e29be9aa-ef19-4770-b277-bce09909acde-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.391585 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391542 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391584 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-slash\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391646 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-etc-openvswitch\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391693 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-etc-openvswitch\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391703 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-slash\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391745 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/baa916b1-56d7-46e4-9ccb-a3794c262e34-host-slash\") pod \"iptables-alerter-9fc98\" (UID: \"baa916b1-56d7-46e4-9ccb-a3794c262e34\") " pod="openshift-network-operator/iptables-alerter-9fc98"
Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391777 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-system-cni-dir\") pod \"multus-cj68p\" (UID: 
\"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391803 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-run-k8s-cni-cncf-io\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391814 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/baa916b1-56d7-46e4-9ccb-a3794c262e34-host-slash\") pod \"iptables-alerter-9fc98\" (UID: \"baa916b1-56d7-46e4-9ccb-a3794c262e34\") " pod="openshift-network-operator/iptables-alerter-9fc98" Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391829 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1237c950-1db9-42f8-be43-fc6424f2ae2c-ovnkube-script-lib\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391856 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqp96\" (UniqueName: \"kubernetes.io/projected/29bbf816-c174-4330-b3f2-ded908db0f6a-kube-api-access-zqp96\") pod \"node-ca-xwbp9\" (UID: \"29bbf816-c174-4330-b3f2-ded908db0f6a\") " pod="openshift-image-registry/node-ca-xwbp9" Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391874 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-system-cni-dir\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391887 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e29be9aa-ef19-4770-b277-bce09909acde-system-cni-dir\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk" Apr 23 13:32:28.391909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391903 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-run-k8s-cni-cncf-io\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391938 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/baa916b1-56d7-46e4-9ccb-a3794c262e34-iptables-alerter-script\") pod \"iptables-alerter-9fc98\" (UID: \"baa916b1-56d7-46e4-9ccb-a3794c262e34\") " pod="openshift-network-operator/iptables-alerter-9fc98" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391913 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-multus-cni-dir\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391961 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-multus-cni-dir\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.391985 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29bbf816-c174-4330-b3f2-ded908db0f6a-serviceca\") pod \"node-ca-xwbp9\" (UID: \"29bbf816-c174-4330-b3f2-ded908db0f6a\") " pod="openshift-image-registry/node-ca-xwbp9" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-run\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392041 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-sysctl-d\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392085 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-tuned\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392109 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8797a06c-9f6d-4c9f-b8e1-36e99724079b-tmp\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392136 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-cnibin\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392087 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-run\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392161 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-etc-kubernetes\") pod 
\"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392188 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-run-systemd\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392192 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-cnibin\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392214 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-cni-bin\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392215 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-sysctl-d\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392254 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/61372c64-9070-4751-b720-a4016030cf02-hosts-file\") pod \"node-resolver-nzrks\" (UID: \"61372c64-9070-4751-b720-a4016030cf02\") " pod="openshift-dns/node-resolver-nzrks" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392282 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-run-ovn\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.392493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392288 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-etc-kubernetes\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392258 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-cni-bin\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392309 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1237c950-1db9-42f8-be43-fc6424f2ae2c-ovn-node-metrics-cert\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392324 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392338 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfqrw\" (UniqueName: \"kubernetes.io/projected/1237c950-1db9-42f8-be43-fc6424f2ae2c-kube-api-access-gfqrw\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392355 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-run-ovn\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392340 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/61372c64-9070-4751-b720-a4016030cf02-hosts-file\") pod \"node-resolver-nzrks\" (UID: \"61372c64-9070-4751-b720-a4016030cf02\") " pod="openshift-dns/node-resolver-nzrks" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392369 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q2dl\" (UniqueName: \"kubernetes.io/projected/741b498a-5409-4b33-8b00-977a99dc68e9-kube-api-access-4q2dl\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-run-systemd\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392132 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1237c950-1db9-42f8-be43-fc6424f2ae2c-ovnkube-config\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392399 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-os-release\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392425 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-sysctl-conf\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 
13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392448 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-var-lib-kubelet\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392453 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-os-release\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392484 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp7xz\" (UniqueName: \"kubernetes.io/projected/61372c64-9070-4751-b720-a4016030cf02-kube-api-access-bp7xz\") pod \"node-resolver-nzrks\" (UID: \"61372c64-9070-4751-b720-a4016030cf02\") " pod="openshift-dns/node-resolver-nzrks" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392497 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-var-lib-kubelet\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392388 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1237c950-1db9-42f8-be43-fc6424f2ae2c-ovnkube-script-lib\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392515 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac-agent-certs\") pod \"konnectivity-agent-zjfsg\" (UID: \"5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac\") " pod="kube-system/konnectivity-agent-zjfsg" Apr 23 13:32:28.393295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e29be9aa-ef19-4770-b277-bce09909acde-os-release\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392587 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e29be9aa-ef19-4770-b277-bce09909acde-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392612 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-sysctl-conf\") pod 
\"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392614 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-modprobe-d\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392653 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-lib-modules\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392692 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-modprobe-d\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392721 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-host\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392767 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-lib-modules\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392808 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jh5s\" (UniqueName: \"kubernetes.io/projected/8797a06c-9f6d-4c9f-b8e1-36e99724079b-kube-api-access-7jh5s\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392795 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-host\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392857 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-hostroot\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392881 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-var-lib-openvswitch\") pod 
\"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392906 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-log-socket\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392934 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392958 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-var-lib-openvswitch\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392965 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.392978 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-log-socket\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.394346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393000 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-socket-dir\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393020 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-hostroot\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393025 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29bbf816-c174-4330-b3f2-ded908db0f6a-host\") pod \"node-ca-xwbp9\" (UID: \"29bbf816-c174-4330-b3f2-ded908db0f6a\") " pod="openshift-image-registry/node-ca-xwbp9" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393047 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393070 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e29be9aa-ef19-4770-b277-bce09909acde-cni-binary-copy\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393113 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe336864-d139-416a-b1cb-afe14a9db883-cni-binary-copy\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393130 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-systemd-units\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393146 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e29be9aa-ef19-4770-b277-bce09909acde-cnibin\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393166 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-node-log\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393185 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-systemd\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-var-lib-kubelet\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393228 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-multus-conf-dir\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393254 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46hzd\" (UniqueName: \"kubernetes.io/projected/fe336864-d139-416a-b1cb-afe14a9db883-kube-api-access-46hzd\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393281 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpjf\" (UniqueName: \"kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf\") pod \"network-check-target-zw7vm\" (UID: \"32951250-c04f-4a66-a62c-e1372b1c84d0\") " pod="openshift-network-diagnostics/network-check-target-zw7vm" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393304 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-sys\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393319 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-var-lib-cni-bin\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-sysconfig\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.395109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393358 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-kubernetes\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393383 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-cni-netd\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393405 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-var-lib-kubelet\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393406 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61372c64-9070-4751-b720-a4016030cf02-tmp-dir\") pod \"node-resolver-nzrks\" (UID: \"61372c64-9070-4751-b720-a4016030cf02\") " pod="openshift-dns/node-resolver-nzrks" Apr 23 13:32:28.395852 ip-10-0-136-158 
kubenswrapper[2569]: I0423 13:32:28.393447 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-kubelet\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-run-openvswitch\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393503 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-registration-dir\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393512 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-systemd-units\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393543 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-multus-socket-dir-parent\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393570 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-var-lib-cni-multus\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393614 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-multus-conf-dir\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393616 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393650 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szxh8\" (UniqueName: \"kubernetes.io/projected/dc7a9b0c-42a9-4562-a03a-27dca913446a-kube-api-access-szxh8\") pod \"network-metrics-daemon-dqcwj\" (UID: 
\"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393667 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1237c950-1db9-42f8-be43-fc6424f2ae2c-env-overrides\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393668 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61372c64-9070-4751-b720-a4016030cf02-tmp-dir\") pod \"node-resolver-nzrks\" (UID: \"61372c64-9070-4751-b720-a4016030cf02\") " pod="openshift-dns/node-resolver-nzrks" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393685 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-device-dir\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:28.393701 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393722 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-run-netns\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.395852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393747 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-var-lib-cni-bin\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:28.393788 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs podName:dc7a9b0c-42a9-4562-a03a-27dca913446a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:28.893755135 +0000 UTC m=+3.048677600 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs") pod "network-metrics-daemon-dqcwj" (UID: "dc7a9b0c-42a9-4562-a03a-27dca913446a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393806 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-sysconfig\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393827 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-node-log\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393853 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-kubernetes\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393875 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-systemd\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393893 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-cni-netd\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393912 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-kubelet\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.394186 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-var-lib-cni-multus\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.394223 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1237c950-1db9-42f8-be43-fc6424f2ae2c-env-overrides\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.394233 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-run-openvswitch\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393704 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-host-run-netns\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.394270 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fe336864-d139-416a-b1cb-afe14a9db883-multus-daemon-config\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.394286 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fe336864-d139-416a-b1cb-afe14a9db883-multus-socket-dir-parent\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.393572 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8797a06c-9f6d-4c9f-b8e1-36e99724079b-sys\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.394287 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac-konnectivity-ca\") pod \"konnectivity-agent-zjfsg\" (UID: \"5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac\") " pod="kube-system/konnectivity-agent-zjfsg" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.394329 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9g6k\" (UniqueName: \"kubernetes.io/projected/e29be9aa-ef19-4770-b277-bce09909acde-kube-api-access-l9g6k\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk" Apr 23 13:32:28.396647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.394358 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-run-netns\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:28.397266 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.394387 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-sys-fs\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg" Apr 
23 13:32:28.397266 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.394411 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e29be9aa-ef19-4770-b277-bce09909acde-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.397266 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.394495 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1237c950-1db9-42f8-be43-fc6424f2ae2c-host-run-netns\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.397266 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.394544 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe336864-d139-416a-b1cb-afe14a9db883-cni-binary-copy\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.397266 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.395119 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fe336864-d139-416a-b1cb-afe14a9db883-multus-daemon-config\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.397266 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.396298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1237c950-1db9-42f8-be43-fc6424f2ae2c-ovn-node-metrics-cert\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.402563 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.402531 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8797a06c-9f6d-4c9f-b8e1-36e99724079b-tmp\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.402563 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:28.402554 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:28.402940 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:28.402575 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:28.402940 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:28.402589 2569 projected.go:194] Error preparing data for projected volume kube-api-access-pmpjf for pod openshift-network-diagnostics/network-check-target-zw7vm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:28.402940 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.402578 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5675\" (UniqueName: \"kubernetes.io/projected/baa916b1-56d7-46e4-9ccb-a3794c262e34-kube-api-access-r5675\") pod \"iptables-alerter-9fc98\" (UID: \"baa916b1-56d7-46e4-9ccb-a3794c262e34\") " pod="openshift-network-operator/iptables-alerter-9fc98"
Apr 23 13:32:28.402940 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:28.402658 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf podName:32951250-c04f-4a66-a62c-e1372b1c84d0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:28.9026423 +0000 UTC m=+3.057564759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-pmpjf" (UniqueName: "kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf") pod "network-check-target-zw7vm" (UID: "32951250-c04f-4a66-a62c-e1372b1c84d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:28.402940 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.402718 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8797a06c-9f6d-4c9f-b8e1-36e99724079b-etc-tuned\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.403361 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.403329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfqrw\" (UniqueName: \"kubernetes.io/projected/1237c950-1db9-42f8-be43-fc6424f2ae2c-kube-api-access-gfqrw\") pod \"ovnkube-node-f6dwj\" (UID: \"1237c950-1db9-42f8-be43-fc6424f2ae2c\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.404713 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.404694 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46hzd\" (UniqueName: \"kubernetes.io/projected/fe336864-d139-416a-b1cb-afe14a9db883-kube-api-access-46hzd\") pod \"multus-cj68p\" (UID: \"fe336864-d139-416a-b1cb-afe14a9db883\") " pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.405175 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.405153 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp7xz\" (UniqueName: \"kubernetes.io/projected/61372c64-9070-4751-b720-a4016030cf02-kube-api-access-bp7xz\") pod \"node-resolver-nzrks\" (UID: \"61372c64-9070-4751-b720-a4016030cf02\") " pod="openshift-dns/node-resolver-nzrks"
Apr 23 13:32:28.405353 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.405333 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jh5s\" (UniqueName: \"kubernetes.io/projected/8797a06c-9f6d-4c9f-b8e1-36e99724079b-kube-api-access-7jh5s\") pod \"tuned-tttgj\" (UID: \"8797a06c-9f6d-4c9f-b8e1-36e99724079b\") " pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.406123 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.406102 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szxh8\" (UniqueName: \"kubernetes.io/projected/dc7a9b0c-42a9-4562-a03a-27dca913446a-kube-api-access-szxh8\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:28.495687 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.495633 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29bbf816-c174-4330-b3f2-ded908db0f6a-host\") pod \"node-ca-xwbp9\" (UID: \"29bbf816-c174-4330-b3f2-ded908db0f6a\") " pod="openshift-image-registry/node-ca-xwbp9"
Apr 23 13:32:28.495687 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.495692 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e29be9aa-ef19-4770-b277-bce09909acde-cni-binary-copy\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.495937 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.495725 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e29be9aa-ef19-4770-b277-bce09909acde-cnibin\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.495937 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.495745 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29bbf816-c174-4330-b3f2-ded908db0f6a-host\") pod \"node-ca-xwbp9\" (UID: \"29bbf816-c174-4330-b3f2-ded908db0f6a\") " pod="openshift-image-registry/node-ca-xwbp9"
Apr 23 13:32:28.495937 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.495773 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-registration-dir\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.495937 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.495848 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-registration-dir\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.495937 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.495848 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e29be9aa-ef19-4770-b277-bce09909acde-cnibin\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.495937 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.495919 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-device-dir\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.496253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.495951 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac-konnectivity-ca\") pod \"konnectivity-agent-zjfsg\" (UID: \"5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac\") " pod="kube-system/konnectivity-agent-zjfsg"
Apr 23 13:32:28.496253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.495979 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9g6k\" (UniqueName: \"kubernetes.io/projected/e29be9aa-ef19-4770-b277-bce09909acde-kube-api-access-l9g6k\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.496253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496008 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-sys-fs\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.496253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496013 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-device-dir\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.496253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496034 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e29be9aa-ef19-4770-b277-bce09909acde-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.496253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496085 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-etc-selinux\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.496253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496098 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-sys-fs\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.496253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496112 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e29be9aa-ef19-4770-b277-bce09909acde-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.496253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqp96\" (UniqueName: \"kubernetes.io/projected/29bbf816-c174-4330-b3f2-ded908db0f6a-kube-api-access-zqp96\") pod \"node-ca-xwbp9\" (UID: \"29bbf816-c174-4330-b3f2-ded908db0f6a\") " pod="openshift-image-registry/node-ca-xwbp9"
Apr 23 13:32:28.496253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496175 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e29be9aa-ef19-4770-b277-bce09909acde-system-cni-dir\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.496253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496203 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29bbf816-c174-4330-b3f2-ded908db0f6a-serviceca\") pod \"node-ca-xwbp9\" (UID: \"29bbf816-c174-4330-b3f2-ded908db0f6a\") " pod="openshift-image-registry/node-ca-xwbp9"
Apr 23 13:32:28.496253 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496241 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4q2dl\" (UniqueName: \"kubernetes.io/projected/741b498a-5409-4b33-8b00-977a99dc68e9-kube-api-access-4q2dl\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496268 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac-agent-certs\") pod \"konnectivity-agent-zjfsg\" (UID: \"5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac\") " pod="kube-system/konnectivity-agent-zjfsg"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496293 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e29be9aa-ef19-4770-b277-bce09909acde-os-release\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496310 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e29be9aa-ef19-4770-b277-bce09909acde-cni-binary-copy\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496318 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e29be9aa-ef19-4770-b277-bce09909acde-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496355 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496373 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-socket-dir\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496463 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-socket-dir\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496495 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496493 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e29be9aa-ef19-4770-b277-bce09909acde-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496525 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e29be9aa-ef19-4770-b277-bce09909acde-system-cni-dir\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496582 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/741b498a-5409-4b33-8b00-977a99dc68e9-etc-selinux\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496615 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e29be9aa-ef19-4770-b277-bce09909acde-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.496767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496703 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e29be9aa-ef19-4770-b277-bce09909acde-os-release\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.497350 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.496940 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac-konnectivity-ca\") pod \"konnectivity-agent-zjfsg\" (UID: \"5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac\") " pod="kube-system/konnectivity-agent-zjfsg"
Apr 23 13:32:28.497350 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.497043 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29bbf816-c174-4330-b3f2-ded908db0f6a-serviceca\") pod \"node-ca-xwbp9\" (UID: \"29bbf816-c174-4330-b3f2-ded908db0f6a\") " pod="openshift-image-registry/node-ca-xwbp9"
Apr 23 13:32:28.497350 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.497218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e29be9aa-ef19-4770-b277-bce09909acde-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.499141 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.499121 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac-agent-certs\") pod \"konnectivity-agent-zjfsg\" (UID: \"5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac\") " pod="kube-system/konnectivity-agent-zjfsg"
Apr 23 13:32:28.505109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.505024 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9g6k\" (UniqueName: \"kubernetes.io/projected/e29be9aa-ef19-4770-b277-bce09909acde-kube-api-access-l9g6k\") pod \"multus-additional-cni-plugins-6lkhk\" (UID: \"e29be9aa-ef19-4770-b277-bce09909acde\") " pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.505239 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.505162 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q2dl\" (UniqueName: \"kubernetes.io/projected/741b498a-5409-4b33-8b00-977a99dc68e9-kube-api-access-4q2dl\") pod \"aws-ebs-csi-driver-node-45cbg\" (UID: \"741b498a-5409-4b33-8b00-977a99dc68e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.505418 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.505402 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqp96\" (UniqueName: \"kubernetes.io/projected/29bbf816-c174-4330-b3f2-ded908db0f6a-kube-api-access-zqp96\") pod \"node-ca-xwbp9\" (UID: \"29bbf816-c174-4330-b3f2-ded908db0f6a\") " pod="openshift-image-registry/node-ca-xwbp9"
Apr 23 13:32:28.514425 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.514392 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:28.582721 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.582684 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9fc98"
Apr 23 13:32:28.593604 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.593575 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tttgj"
Apr 23 13:32:28.605548 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.605512 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nzrks"
Apr 23 13:32:28.610294 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.610271 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cj68p"
Apr 23 13:32:28.618026 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.617997 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zjfsg"
Apr 23 13:32:28.627903 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.627874 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:32:28.632860 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.632830 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:28.634685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.634660 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg"
Apr 23 13:32:28.643508 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.643478 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6lkhk"
Apr 23 13:32:28.651284 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.651254 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xwbp9"
Apr 23 13:32:28.899688 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:28.899637 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:28.899858 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:28.899770 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:28.899858 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:28.899842 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs podName:dc7a9b0c-42a9-4562-a03a-27dca913446a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:29.899826912 +0000 UTC m=+4.054749357 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs") pod "network-metrics-daemon-dqcwj" (UID: "dc7a9b0c-42a9-4562-a03a-27dca913446a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:29.000954 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.000905 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpjf\" (UniqueName: \"kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf\") pod \"network-check-target-zw7vm\" (UID: \"32951250-c04f-4a66-a62c-e1372b1c84d0\") " pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:29.001181 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:29.001121 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:29.001181 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:29.001147 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:29.001181 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:29.001160 2569 projected.go:194] Error preparing data for projected volume kube-api-access-pmpjf for pod openshift-network-diagnostics/network-check-target-zw7vm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:29.001328 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:29.001232 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf podName:32951250-c04f-4a66-a62c-e1372b1c84d0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:30.001211237 +0000 UTC m=+4.156133684 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pmpjf" (UniqueName: "kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf") pod "network-check-target-zw7vm" (UID: "32951250-c04f-4a66-a62c-e1372b1c84d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:29.097572 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:29.097525 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29bbf816_c174_4330_b3f2_ded908db0f6a.slice/crio-f3978e99af248147b530f47c9b0c96a2fbe21a8c56dd3c183ab56bb289ad4b1c WatchSource:0}: Error finding container f3978e99af248147b530f47c9b0c96a2fbe21a8c56dd3c183ab56bb289ad4b1c: Status 404 returned error can't find the container with id f3978e99af248147b530f47c9b0c96a2fbe21a8c56dd3c183ab56bb289ad4b1c
Apr 23 13:32:29.101637 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:29.101609 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe336864_d139_416a_b1cb_afe14a9db883.slice/crio-dd65a423f5fe405a6b6b937f306ea6d3423a8a2d34eea44045da575c532daf83 WatchSource:0}: Error finding container dd65a423f5fe405a6b6b937f306ea6d3423a8a2d34eea44045da575c532daf83: Status 404 returned error can't find the container with id dd65a423f5fe405a6b6b937f306ea6d3423a8a2d34eea44045da575c532daf83
Apr 23 13:32:29.102103 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:29.102083 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1237c950_1db9_42f8_be43_fc6424f2ae2c.slice/crio-10b9ee6b8ed71a7f844caa3428d5e1a704c2a4cdf7b595efffa4d92d741fa7e3 WatchSource:0}: Error finding container 10b9ee6b8ed71a7f844caa3428d5e1a704c2a4cdf7b595efffa4d92d741fa7e3: Status 404 returned error can't find the container with id 10b9ee6b8ed71a7f844caa3428d5e1a704c2a4cdf7b595efffa4d92d741fa7e3
Apr 23 13:32:29.103074 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:29.103036 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaa916b1_56d7_46e4_9ccb_a3794c262e34.slice/crio-517a7c7b169e8930de43b6d1b3f5b329d564f5b4e57173404f06c3566347f3e9 WatchSource:0}: Error finding container 517a7c7b169e8930de43b6d1b3f5b329d564f5b4e57173404f06c3566347f3e9: Status 404 returned error can't find the container with id 517a7c7b169e8930de43b6d1b3f5b329d564f5b4e57173404f06c3566347f3e9
Apr 23 13:32:29.104403 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:29.104353 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61372c64_9070_4751_b720_a4016030cf02.slice/crio-131cb079afbd7fab58b1b88b9bdc947d5a70dc0bff09939ba28b3480fa2ad859 WatchSource:0}: Error finding container 131cb079afbd7fab58b1b88b9bdc947d5a70dc0bff09939ba28b3480fa2ad859: Status 404 returned error can't find the container with id 131cb079afbd7fab58b1b88b9bdc947d5a70dc0bff09939ba28b3480fa2ad859
Apr 23 13:32:29.106450 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:29.106406 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod741b498a_5409_4b33_8b00_977a99dc68e9.slice/crio-2d0c4b01ce21bd5f471b12c0cace9f5cf32e3eced0553dad6abd7534257e926a WatchSource:0}: Error finding container 2d0c4b01ce21bd5f471b12c0cace9f5cf32e3eced0553dad6abd7534257e926a: Status 404 returned error can't find the container with id 2d0c4b01ce21bd5f471b12c0cace9f5cf32e3eced0553dad6abd7534257e926a
Apr 23 13:32:29.106721 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:29.106674 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5af36bbd_1993_4d2e_ac6c_1f12cb3f5fac.slice/crio-0ee27d4ba26a9b75e0b957cbb748b5ed6a880ccc87a6ad6ce9c0f84f073ab581 WatchSource:0}: Error finding container 0ee27d4ba26a9b75e0b957cbb748b5ed6a880ccc87a6ad6ce9c0f84f073ab581: Status 404 returned error can't find the container with id 0ee27d4ba26a9b75e0b957cbb748b5ed6a880ccc87a6ad6ce9c0f84f073ab581
Apr 23 13:32:29.109293 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:32:29.109266 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8797a06c_9f6d_4c9f_b8e1_36e99724079b.slice/crio-536aa3913b2a750e49d12c62d2091a816370a95f00df1aa89bfd60ed9952fa1f WatchSource:0}: Error finding container 536aa3913b2a750e49d12c62d2091a816370a95f00df1aa89bfd60ed9952fa1f: Status 404 returned error can't find the container with id 536aa3913b2a750e49d12c62d2091a816370a95f00df1aa89bfd60ed9952fa1f
Apr 23 13:32:29.319638 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.319425 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:27:27 +0000 UTC" deadline="2027-10-31 08:37:53.997416493 +0000 UTC"
Apr 23 13:32:29.319638 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.319634 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13339h5m24.677788286s"
Apr 23 13:32:29.416215 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.416104 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:29.416372 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:29.416245 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a"
Apr 23 13:32:29.424231 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.424187 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg" event={"ID":"741b498a-5409-4b33-8b00-977a99dc68e9","Type":"ContainerStarted","Data":"2d0c4b01ce21bd5f471b12c0cace9f5cf32e3eced0553dad6abd7534257e926a"}
Apr 23 13:32:29.425827 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.425800 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9fc98" event={"ID":"baa916b1-56d7-46e4-9ccb-a3794c262e34","Type":"ContainerStarted","Data":"517a7c7b169e8930de43b6d1b3f5b329d564f5b4e57173404f06c3566347f3e9"}
Apr 23 13:32:29.427244 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.427217 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nzrks" event={"ID":"61372c64-9070-4751-b720-a4016030cf02","Type":"ContainerStarted","Data":"131cb079afbd7fab58b1b88b9bdc947d5a70dc0bff09939ba28b3480fa2ad859"}
Apr 23 13:32:29.429026 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.429001 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lkhk" event={"ID":"e29be9aa-ef19-4770-b277-bce09909acde","Type":"ContainerStarted","Data":"2c4990d9b075317d36ec0128326b9af0a12ed83a3bef6594e9740bfacdceb14f"}
Apr 23 13:32:29.430207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.430178 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zjfsg" event={"ID":"5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac","Type":"ContainerStarted","Data":"0ee27d4ba26a9b75e0b957cbb748b5ed6a880ccc87a6ad6ce9c0f84f073ab581"}
Apr 23 13:32:29.431567 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.431541 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" event={"ID":"1237c950-1db9-42f8-be43-fc6424f2ae2c","Type":"ContainerStarted","Data":"10b9ee6b8ed71a7f844caa3428d5e1a704c2a4cdf7b595efffa4d92d741fa7e3"}
Apr 23 13:32:29.432843 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.432816 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cj68p" event={"ID":"fe336864-d139-416a-b1cb-afe14a9db883","Type":"ContainerStarted","Data":"dd65a423f5fe405a6b6b937f306ea6d3423a8a2d34eea44045da575c532daf83"}
Apr 23 13:32:29.436935 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.435993 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xwbp9" event={"ID":"29bbf816-c174-4330-b3f2-ded908db0f6a","Type":"ContainerStarted","Data":"f3978e99af248147b530f47c9b0c96a2fbe21a8c56dd3c183ab56bb289ad4b1c"}
Apr 23 13:32:29.439968 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.439933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal" event={"ID":"9b4440db6557536c217fdb95da13736d","Type":"ContainerStarted","Data":"46a0ac22c8e125938e481ca870f3b440947810cf9f4b29aadc0fb862acd263dd"}
Apr 23 13:32:29.445198 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.444966 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tttgj" event={"ID":"8797a06c-9f6d-4c9f-b8e1-36e99724079b","Type":"ContainerStarted","Data":"536aa3913b2a750e49d12c62d2091a816370a95f00df1aa89bfd60ed9952fa1f"}
Apr 23 13:32:29.453808 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.453741 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal" podStartSLOduration=2.453718286 podStartE2EDuration="2.453718286s" podCreationTimestamp="2026-04-23 13:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:29.453127693 +0000 UTC m=+3.608050159" watchObservedRunningTime="2026-04-23 13:32:29.453718286 +0000 UTC m=+3.608640753"
Apr 23 13:32:29.907774 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:29.907683 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:29.907934 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:29.907848 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:29.907934 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:29.907916 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs podName:dc7a9b0c-42a9-4562-a03a-27dca913446a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:31.907895127 +0000 UTC m=+6.062817579 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs") pod "network-metrics-daemon-dqcwj" (UID: "dc7a9b0c-42a9-4562-a03a-27dca913446a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:30.009140 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:30.009029 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpjf\" (UniqueName: \"kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf\") pod \"network-check-target-zw7vm\" (UID: \"32951250-c04f-4a66-a62c-e1372b1c84d0\") " pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:30.009364 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:30.009216 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:30.009364 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:30.009236 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:30.009364 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:30.009248 2569 projected.go:194] Error preparing data for projected volume kube-api-access-pmpjf for pod openshift-network-diagnostics/network-check-target-zw7vm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:30.009364 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:30.009311 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf podName:32951250-c04f-4a66-a62c-e1372b1c84d0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:32.009293128 +0000 UTC m=+6.164215591 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pmpjf" (UniqueName: "kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf") pod "network-check-target-zw7vm" (UID: "32951250-c04f-4a66-a62c-e1372b1c84d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:30.417248 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:30.417217 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:30.417747 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:30.417353 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0"
Apr 23 13:32:30.458176 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:30.457158 2569 generic.go:358] "Generic (PLEG): container finished" podID="c2eebab4ba0cac1e68c6bccde729de79" containerID="d67f8e8a48a0e6645f25a2b597475bef98b1ddecee35ed63de1f48d1198fa9f1" exitCode=0
Apr 23 13:32:30.458176 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:30.458124 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal" event={"ID":"c2eebab4ba0cac1e68c6bccde729de79","Type":"ContainerDied","Data":"d67f8e8a48a0e6645f25a2b597475bef98b1ddecee35ed63de1f48d1198fa9f1"}
Apr 23 13:32:31.416526 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:31.416492 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:31.416719 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:31.416641 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a"
Apr 23 13:32:31.474049 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:31.473312 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal" event={"ID":"c2eebab4ba0cac1e68c6bccde729de79","Type":"ContainerStarted","Data":"bd53f8376457d0129c32f866430b0c6036029e91dc0bbeb713229f028aa36d68"}
Apr 23 13:32:31.925466 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:31.925422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:31.925641 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:31.925574 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:31.925641 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:31.925637 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs podName:dc7a9b0c-42a9-4562-a03a-27dca913446a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:35.925618188 +0000 UTC m=+10.080540640 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs") pod "network-metrics-daemon-dqcwj" (UID: "dc7a9b0c-42a9-4562-a03a-27dca913446a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:32.026664 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:32.026623 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpjf\" (UniqueName: \"kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf\") pod \"network-check-target-zw7vm\" (UID: \"32951250-c04f-4a66-a62c-e1372b1c84d0\") " pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:32.026852 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:32.026800 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:32.026852 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:32.026816 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:32.026852 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:32.026825 2569 projected.go:194] Error preparing data for projected volume kube-api-access-pmpjf for pod openshift-network-diagnostics/network-check-target-zw7vm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:32.026975 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:32.026879 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf podName:32951250-c04f-4a66-a62c-e1372b1c84d0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:36.026862538 +0000 UTC m=+10.181784984 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-pmpjf" (UniqueName: "kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf") pod "network-check-target-zw7vm" (UID: "32951250-c04f-4a66-a62c-e1372b1c84d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:32.416542 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:32.416469 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:32.416743 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:32.416572 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0"
Apr 23 13:32:33.416679 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:33.416539 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:33.417158 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:33.416704 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a"
Apr 23 13:32:34.419562 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:34.419525 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:34.420017 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:34.419638 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0"
Apr 23 13:32:35.416304 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:35.416263 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:35.416493 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:35.416401 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a"
Apr 23 13:32:35.959393 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:35.959356 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:35.959814 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:35.959513 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:35.959814 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:35.959588 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs podName:dc7a9b0c-42a9-4562-a03a-27dca913446a nodeName:}" failed. No retries permitted until 2026-04-23 13:32:43.959567722 +0000 UTC m=+18.114490168 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs") pod "network-metrics-daemon-dqcwj" (UID: "dc7a9b0c-42a9-4562-a03a-27dca913446a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:36.060978 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:36.060345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpjf\" (UniqueName: \"kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf\") pod \"network-check-target-zw7vm\" (UID: \"32951250-c04f-4a66-a62c-e1372b1c84d0\") " pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:36.060978 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:36.060536 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:36.060978 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:36.060555 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:36.060978 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:36.060567 2569 projected.go:194] Error preparing data for projected volume kube-api-access-pmpjf for pod openshift-network-diagnostics/network-check-target-zw7vm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:36.060978 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:36.060631 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf podName:32951250-c04f-4a66-a62c-e1372b1c84d0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:44.06061146 +0000 UTC m=+18.215533909 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-pmpjf" (UniqueName: "kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf") pod "network-check-target-zw7vm" (UID: "32951250-c04f-4a66-a62c-e1372b1c84d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:36.420507 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:36.420472 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:36.420674 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:36.420594 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0"
Apr 23 13:32:37.416617 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:37.416579 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:37.417079 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:37.416721 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a"
Apr 23 13:32:38.419389 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:38.419363 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:38.419848 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:38.419458 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0"
Apr 23 13:32:39.416219 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:39.416176 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:39.416403 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:39.416314 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a"
Apr 23 13:32:40.416528 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:40.416491 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:40.416975 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:40.416621 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0"
Apr 23 13:32:41.416302 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:41.416268 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:41.416478 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:41.416411 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a"
Apr 23 13:32:42.415961 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:42.415922 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:42.416522 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:42.416036 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0"
Apr 23 13:32:43.416477 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:43.416437 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:43.416953 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:43.416589 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a"
Apr 23 13:32:44.016868 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:44.016827 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:44.017096 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:44.016999 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:44.017169 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:44.017102 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs podName:dc7a9b0c-42a9-4562-a03a-27dca913446a nodeName:}" failed. No retries permitted until 2026-04-23 13:33:00.017079048 +0000 UTC m=+34.172001505 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs") pod "network-metrics-daemon-dqcwj" (UID: "dc7a9b0c-42a9-4562-a03a-27dca913446a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:44.118237 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:44.118190 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpjf\" (UniqueName: \"kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf\") pod \"network-check-target-zw7vm\" (UID: \"32951250-c04f-4a66-a62c-e1372b1c84d0\") " pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:44.118398 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:44.118330 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:44.118398 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:44.118345 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:44.118398 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:44.118357 2569 projected.go:194] Error preparing data for projected volume kube-api-access-pmpjf for pod openshift-network-diagnostics/network-check-target-zw7vm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:44.118528 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:44.118428 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf podName:32951250-c04f-4a66-a62c-e1372b1c84d0 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:00.118408638 +0000 UTC m=+34.273331107 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-pmpjf" (UniqueName: "kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf") pod "network-check-target-zw7vm" (UID: "32951250-c04f-4a66-a62c-e1372b1c84d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:44.418287 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:44.418196 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:44.418716 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:44.418335 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0"
Apr 23 13:32:45.415948 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:45.415911 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:45.416199 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:45.416050 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a"
Apr 23 13:32:46.418273 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:46.418246 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:32:46.418653 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:46.418340 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0"
Apr 23 13:32:47.417351 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.416829 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:32:47.417505 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:47.417481 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a"
Apr 23 13:32:47.508935 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.508907 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log"
Apr 23 13:32:47.509710 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.509185 2569 generic.go:358] "Generic (PLEG): container finished" podID="1237c950-1db9-42f8-be43-fc6424f2ae2c" containerID="6541d86fad02cb91bd8a64c9cf60cbbdf10f656f78855601e275a794aaea90f3" exitCode=1
Apr 23 13:32:47.509710 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.509245 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" event={"ID":"1237c950-1db9-42f8-be43-fc6424f2ae2c","Type":"ContainerStarted","Data":"95267c7fd0330557ddfa7cfbf1222f34b3520ab6eeaa277ce6cc63dddeb650a3"}
Apr 23 13:32:47.509710 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.509272 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" event={"ID":"1237c950-1db9-42f8-be43-fc6424f2ae2c","Type":"ContainerStarted","Data":"2e6885c8c0253dd08f7ab53d6023d8652bd8c54391cc6c6409483e46a9ad8e34"}
Apr 23 13:32:47.509710 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.509281 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" event={"ID":"1237c950-1db9-42f8-be43-fc6424f2ae2c","Type":"ContainerStarted","Data":"d02746a08e8d46951837692de37f54190f49e94bec20c3e531f487c54636452f"}
Apr 23 13:32:47.509710 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.509292 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" event={"ID":"1237c950-1db9-42f8-be43-fc6424f2ae2c","Type":"ContainerStarted","Data":"f1d77bb61c49c43c3e03f411b533eb7825ea029d83109af7cf031a0f41e53e51"}
Apr 23 13:32:47.509710 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.509301 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" event={"ID":"1237c950-1db9-42f8-be43-fc6424f2ae2c","Type":"ContainerDied","Data":"6541d86fad02cb91bd8a64c9cf60cbbdf10f656f78855601e275a794aaea90f3"}
Apr 23 13:32:47.509710 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.509310 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" event={"ID":"1237c950-1db9-42f8-be43-fc6424f2ae2c","Type":"ContainerStarted","Data":"a0f99dfda88c1030916d01eb012a1163468d96f94087359ff6e01495fde9c923"}
Apr 23 13:32:47.510508 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.510486 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cj68p" event={"ID":"fe336864-d139-416a-b1cb-afe14a9db883","Type":"ContainerStarted","Data":"2f5588daf2caa9b2fc8f8ff859bea891aaf28fe40070a12916bdb8cfb14d5ced"}
Apr 23 13:32:47.511680 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.511654 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xwbp9" event={"ID":"29bbf816-c174-4330-b3f2-ded908db0f6a","Type":"ContainerStarted","Data":"733eac60b62d27e6ae2f947305fc182cab14b40b2e496ffc20f7e804995383a0"}
Apr 23 13:32:47.512695 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.512677 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tttgj" event={"ID":"8797a06c-9f6d-4c9f-b8e1-36e99724079b","Type":"ContainerStarted","Data":"9636a785b73e1f1522b8b4fdc98fb14e80b4e7536c899599c152ee8cb6a16b32"}
Apr 23 13:32:47.513815 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.513786 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg" event={"ID":"741b498a-5409-4b33-8b00-977a99dc68e9","Type":"ContainerStarted","Data":"91772d435ee7f455d89aedb2df89a438e9c73b9ca0edc4dc9069aee27c3aef7f"}
Apr 23 13:32:47.515101 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.515077 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nzrks" event={"ID":"61372c64-9070-4751-b720-a4016030cf02","Type":"ContainerStarted","Data":"7033b43e3ec48dcb90bc2eb4a049c7cb10797997b713a2e0a2c73710f9d76728"}
Apr 23 13:32:47.516271 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.516248 2569 generic.go:358] "Generic (PLEG): container finished" podID="e29be9aa-ef19-4770-b277-bce09909acde" containerID="ed5ab11bf3890078562aa09308277d272ffe1420783f238df2736d53deca85ff" exitCode=0
Apr 23 13:32:47.516361 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.516313 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lkhk" event={"ID":"e29be9aa-ef19-4770-b277-bce09909acde","Type":"ContainerDied","Data":"ed5ab11bf3890078562aa09308277d272ffe1420783f238df2736d53deca85ff"}
Apr 23 13:32:47.517467 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.517447 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zjfsg" event={"ID":"5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac","Type":"ContainerStarted","Data":"dccf88bb52d8a7446d117ed06050352ce6e1583bf01c32f827627e7b4719b9bf"}
Apr 23 13:32:47.527244 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.527200 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cj68p" podStartSLOduration=4.003793393 podStartE2EDuration="21.527172211s" podCreationTimestamp="2026-04-23 13:32:26 +0000 UTC" firstStartedPulling="2026-04-23 13:32:29.103280698 +0000 UTC m=+3.258203156" lastFinishedPulling="2026-04-23 13:32:46.626659521 +0000 UTC m=+20.781581974" observedRunningTime="2026-04-23 13:32:47.526845743 +0000 UTC m=+21.681768211" watchObservedRunningTime="2026-04-23 13:32:47.527172211 +0000 UTC m=+21.682094673"
Apr 23 13:32:47.527488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.527467 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal" podStartSLOduration=20.527460776 podStartE2EDuration="20.527460776s" podCreationTimestamp="2026-04-23 13:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:31.493172346 +0000 UTC m=+5.648094817" watchObservedRunningTime="2026-04-23 13:32:47.527460776 +0000 UTC m=+21.682383244"
Apr 23 13:32:47.540813 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.540764 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nzrks" podStartSLOduration=4.05842503 podStartE2EDuration="21.540747587s" podCreationTimestamp="2026-04-23 13:32:26 +0000 UTC" firstStartedPulling="2026-04-23 13:32:29.106947718 +0000 UTC m=+3.261870167" lastFinishedPulling="2026-04-23 13:32:46.589270271 +0000 UTC m=+20.744192724" observedRunningTime="2026-04-23
13:32:47.540155602 +0000 UTC m=+21.695078070" watchObservedRunningTime="2026-04-23 13:32:47.540747587 +0000 UTC m=+21.695670054" Apr 23 13:32:47.554598 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.554550 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zjfsg" podStartSLOduration=4.408024244 podStartE2EDuration="21.554533767s" podCreationTimestamp="2026-04-23 13:32:26 +0000 UTC" firstStartedPulling="2026-04-23 13:32:29.108857906 +0000 UTC m=+3.263780365" lastFinishedPulling="2026-04-23 13:32:46.255367439 +0000 UTC m=+20.410289888" observedRunningTime="2026-04-23 13:32:47.554016182 +0000 UTC m=+21.708938649" watchObservedRunningTime="2026-04-23 13:32:47.554533767 +0000 UTC m=+21.709456234" Apr 23 13:32:47.590895 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.590838 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xwbp9" podStartSLOduration=8.868357733 podStartE2EDuration="21.590822688s" podCreationTimestamp="2026-04-23 13:32:26 +0000 UTC" firstStartedPulling="2026-04-23 13:32:29.099306849 +0000 UTC m=+3.254229297" lastFinishedPulling="2026-04-23 13:32:41.821771795 +0000 UTC m=+15.976694252" observedRunningTime="2026-04-23 13:32:47.590756989 +0000 UTC m=+21.745679458" watchObservedRunningTime="2026-04-23 13:32:47.590822688 +0000 UTC m=+21.745745156" Apr 23 13:32:47.612086 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:47.611962 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tttgj" podStartSLOduration=4.134281907 podStartE2EDuration="21.611940465s" podCreationTimestamp="2026-04-23 13:32:26 +0000 UTC" firstStartedPulling="2026-04-23 13:32:29.11160791 +0000 UTC m=+3.266530355" lastFinishedPulling="2026-04-23 13:32:46.589266458 +0000 UTC m=+20.744188913" observedRunningTime="2026-04-23 13:32:47.611644396 +0000 UTC m=+21.766566865" watchObservedRunningTime="2026-04-23 13:32:47.611940465 +0000 UTC m=+21.766862934" Apr 23 13:32:48.122140 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:48.122114 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 13:32:48.355552 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:48.355443 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T13:32:48.122135603Z","UUID":"9e4140b8-58f3-40bd-9c1b-e0670a2a9131","Handler":null,"Name":"","Endpoint":""} Apr 23 13:32:48.357388 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:48.357366 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 13:32:48.357524 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:48.357396 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 13:32:48.416259 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:48.416202 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm" Apr 23 13:32:48.416441 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:48.416343 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0" Apr 23 13:32:48.522447 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:48.522276 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg" event={"ID":"741b498a-5409-4b33-8b00-977a99dc68e9","Type":"ContainerStarted","Data":"8580b0687d665b9e9e13f8a3bfbcb0421bf80ed396d781ef8f988053f3be0146"} Apr 23 13:32:48.524375 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:48.524341 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9fc98" event={"ID":"baa916b1-56d7-46e4-9ccb-a3794c262e34","Type":"ContainerStarted","Data":"8f14b155ebb95d78dc26ee1381041645945d37083a77724e63f41ed62acac8dc"} Apr 23 13:32:48.544916 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:48.544865 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9fc98" podStartSLOduration=5.395405238 podStartE2EDuration="22.544850755s" podCreationTimestamp="2026-04-23 13:32:26 +0000 UTC" firstStartedPulling="2026-04-23 13:32:29.105818861 +0000 UTC m=+3.260741314" lastFinishedPulling="2026-04-23 13:32:46.255264385 +0000 UTC m=+20.410186831" observedRunningTime="2026-04-23 13:32:48.54478884 +0000 UTC m=+22.699711309" watchObservedRunningTime="2026-04-23 13:32:48.544850755 +0000 UTC m=+22.699773223" Apr 23 13:32:48.566000 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:48.565946 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zjfsg" Apr 23 13:32:49.415827 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:49.415790 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:32:49.416035 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:49.415942 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a" Apr 23 13:32:49.528222 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:49.528190 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg" event={"ID":"741b498a-5409-4b33-8b00-977a99dc68e9","Type":"ContainerStarted","Data":"7bc832dde81734495205ff38217c3d992b5aae9b53ca68b5b79787bd56c9f570"} Apr 23 13:32:49.547378 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:49.547314 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-45cbg" podStartSLOduration=3.456856264 podStartE2EDuration="23.547295802s" podCreationTimestamp="2026-04-23 13:32:26 +0000 UTC" firstStartedPulling="2026-04-23 13:32:29.108417705 +0000 UTC m=+3.263340163" lastFinishedPulling="2026-04-23 13:32:49.198857246 +0000 UTC m=+23.353779701" observedRunningTime="2026-04-23 13:32:49.547146395 +0000 UTC m=+23.702068863" watchObservedRunningTime="2026-04-23 13:32:49.547295802 +0000 UTC m=+23.702218269" Apr 23 13:32:50.416224 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:50.415973 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm" Apr 23 13:32:50.416412 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:50.416326 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0" Apr 23 13:32:50.533758 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:50.533730 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log" Apr 23 13:32:50.534276 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:50.534157 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" event={"ID":"1237c950-1db9-42f8-be43-fc6424f2ae2c","Type":"ContainerStarted","Data":"c2b527981b1220fa47b9b8278aa9ff7d3b994a46ea1b2643e2cce832dd59177c"} Apr 23 13:32:51.415950 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:51.415914 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:32:51.416154 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:51.416076 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a" Apr 23 13:32:52.200903 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:52.200707 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zjfsg" Apr 23 13:32:52.201617 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:52.201351 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zjfsg" Apr 23 13:32:52.416153 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:52.416113 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm" Apr 23 13:32:52.416328 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:52.416246 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0" Apr 23 13:32:52.540917 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:52.540889 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log" Apr 23 13:32:52.541271 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:52.541249 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" event={"ID":"1237c950-1db9-42f8-be43-fc6424f2ae2c","Type":"ContainerStarted","Data":"8632b1a42c645acc005c1ea72cc0bb62c78e5145c59e4182b6556ae170a55f55"} Apr 23 13:32:52.541636 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:52.541600 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:52.541882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:52.541862 2569 scope.go:117] "RemoveContainer" containerID="6541d86fad02cb91bd8a64c9cf60cbbdf10f656f78855601e275a794aaea90f3" Apr 23 13:32:52.545023 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:52.544990 2569 generic.go:358] "Generic (PLEG): container finished" podID="e29be9aa-ef19-4770-b277-bce09909acde" containerID="fa24b7fef72af406661b86126f97244a41be687172e0c1d01d8d891098b545e4" exitCode=0 Apr 23 13:32:52.545149 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:52.545086 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lkhk" event={"ID":"e29be9aa-ef19-4770-b277-bce09909acde","Type":"ContainerDied","Data":"fa24b7fef72af406661b86126f97244a41be687172e0c1d01d8d891098b545e4"} Apr 23 13:32:52.546023 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:52.546006 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zjfsg" Apr 23 13:32:52.559901 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:52.559882 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:53.416311 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.416274 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:32:53.416671 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:53.416395 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a" Apr 23 13:32:53.548893 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.548860 2569 generic.go:358] "Generic (PLEG): container finished" podID="e29be9aa-ef19-4770-b277-bce09909acde" containerID="55d5f528a4c0af6df1c9bb5c7b437566b6f8c822e26f5db529f18335066eab33" exitCode=0 Apr 23 13:32:53.549094 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.548938 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lkhk" event={"ID":"e29be9aa-ef19-4770-b277-bce09909acde","Type":"ContainerDied","Data":"55d5f528a4c0af6df1c9bb5c7b437566b6f8c822e26f5db529f18335066eab33"} Apr 23 13:32:53.552358 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.552335 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log" Apr 23 13:32:53.552766 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.552736 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" event={"ID":"1237c950-1db9-42f8-be43-fc6424f2ae2c","Type":"ContainerStarted","Data":"91758a5632afcef487eff34250723cdc04e68eafda7f49b5733a5d4c85f5ec85"} Apr 23 13:32:53.553099 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.553080 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:53.553194 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.553108 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:53.569644 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.569609 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" Apr 23 13:32:53.602448 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.602392 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj" podStartSLOduration=10.034696269 podStartE2EDuration="27.60237604s" podCreationTimestamp="2026-04-23 13:32:26 +0000 UTC" firstStartedPulling="2026-04-23 13:32:29.104569333 +0000 UTC m=+3.259491794" lastFinishedPulling="2026-04-23 13:32:46.672249105 +0000 UTC m=+20.827171565" observedRunningTime="2026-04-23 13:32:53.601120225 +0000 UTC m=+27.756042693" watchObservedRunningTime="2026-04-23 13:32:53.60237604 +0000 UTC m=+27.757298508" Apr 23 13:32:53.673206 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.673174 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dqcwj"] Apr 23 13:32:53.673379 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.673301 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:32:53.673453 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:53.673421 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a" Apr 23 13:32:53.675680 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.675650 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zw7vm"] Apr 23 13:32:53.675803 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:53.675779 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm" Apr 23 13:32:53.675896 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:53.675877 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0" Apr 23 13:32:54.556685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:54.556650 2569 generic.go:358] "Generic (PLEG): container finished" podID="e29be9aa-ef19-4770-b277-bce09909acde" containerID="44be00e8cdeb00246e6d57fd8ae295afd29dda835b218cdd461b80b0a4d6fc06" exitCode=0 Apr 23 13:32:54.557045 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:54.556741 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lkhk" event={"ID":"e29be9aa-ef19-4770-b277-bce09909acde","Type":"ContainerDied","Data":"44be00e8cdeb00246e6d57fd8ae295afd29dda835b218cdd461b80b0a4d6fc06"} Apr 23 13:32:55.415771 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:55.415737 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm" Apr 23 13:32:55.415973 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:55.415851 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0" Apr 23 13:32:55.415973 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:55.415904 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:32:55.416105 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:55.416006 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a" Apr 23 13:32:57.416273 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:57.416235 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm" Apr 23 13:32:57.416778 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:57.416247 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:32:57.416778 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:57.416371 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0" Apr 23 13:32:57.416778 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:57.416453 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a" Apr 23 13:32:59.416240 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.416200 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm" Apr 23 13:32:59.416863 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.416200 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:32:59.416863 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:59.416368 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zw7vm" podUID="32951250-c04f-4a66-a62c-e1372b1c84d0" Apr 23 13:32:59.416863 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:59.416400 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dqcwj" podUID="dc7a9b0c-42a9-4562-a03a-27dca913446a" Apr 23 13:32:59.688919 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.688838 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeReady" Apr 23 13:32:59.689109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.688985 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 13:32:59.737719 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.737686 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-h8fxx"] Apr 23 13:32:59.770342 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.770310 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ggnhj"] Apr 23 13:32:59.770508 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.770432 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h8fxx" Apr 23 13:32:59.772927 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.772899 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 13:32:59.773105 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.772949 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 13:32:59.773291 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.773275 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vtx8h\"" Apr 23 13:32:59.789557 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.789530 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h8fxx"] Apr 23 13:32:59.789557 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.789560 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ggnhj"] Apr 23 13:32:59.789723 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.789677 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ggnhj" Apr 23 13:32:59.792454 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.792428 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-8bjdp\"" Apr 23 13:32:59.792601 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.792472 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 13:32:59.792601 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.792493 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 13:32:59.792601 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.792548 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 13:32:59.835465 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.835428 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/33d8f26a-427d-4263-9b87-13337ac3a834-config-volume\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:32:59.835657 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.835512 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:32:59.835657 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.835533 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk278\" (UniqueName: \"kubernetes.io/projected/33d8f26a-427d-4263-9b87-13337ac3a834-kube-api-access-jk278\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:32:59.835657 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.835569 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33d8f26a-427d-4263-9b87-13337ac3a834-tmp-dir\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:32:59.936712 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.936668 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/33d8f26a-427d-4263-9b87-13337ac3a834-config-volume\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:32:59.936881 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.936725 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpdmd\" (UniqueName: \"kubernetes.io/projected/bb994adb-00a6-4601-83e9-80e43ab53049-kube-api-access-wpdmd\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj" Apr 23 13:32:59.936881 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.936749 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj" Apr 23 13:32:59.936881 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.936790 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:32:59.936881 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.936815 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jk278\" (UniqueName: \"kubernetes.io/projected/33d8f26a-427d-4263-9b87-13337ac3a834-kube-api-access-jk278\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:32:59.936881 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.936852 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33d8f26a-427d-4263-9b87-13337ac3a834-tmp-dir\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:32:59.937241 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:59.937219 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:59.937321 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:32:59.937302 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls podName:33d8f26a-427d-4263-9b87-13337ac3a834 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:00.437278541 +0000 UTC m=+34.592200988 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls") pod "dns-default-h8fxx" (UID: "33d8f26a-427d-4263-9b87-13337ac3a834") : secret "dns-default-metrics-tls" not found Apr 23 13:32:59.937638 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.937608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/33d8f26a-427d-4263-9b87-13337ac3a834-config-volume\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:32:59.948089 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.948002 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk278\" (UniqueName: \"kubernetes.io/projected/33d8f26a-427d-4263-9b87-13337ac3a834-kube-api-access-jk278\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:32:59.953578 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:32:59.953542 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33d8f26a-427d-4263-9b87-13337ac3a834-tmp-dir\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:33:00.038145 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:00.038105 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:33:00.038334 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:00.038212 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpdmd\" (UniqueName: \"kubernetes.io/projected/bb994adb-00a6-4601-83e9-80e43ab53049-kube-api-access-wpdmd\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj" Apr 23 13:33:00.038334 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:00.038239 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj" Apr 23 13:33:00.038334 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:00.038287 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:33:00.038500 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:00.038343 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:33:00.038500 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:00.038367 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs podName:dc7a9b0c-42a9-4562-a03a-27dca913446a nodeName:}" failed. No retries permitted until 2026-04-23 13:33:32.038346997 +0000 UTC m=+66.193269447 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs") pod "network-metrics-daemon-dqcwj" (UID: "dc7a9b0c-42a9-4562-a03a-27dca913446a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:33:00.038500 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:00.038393 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert podName:bb994adb-00a6-4601-83e9-80e43ab53049 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:00.538375746 +0000 UTC m=+34.693298195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert") pod "ingress-canary-ggnhj" (UID: "bb994adb-00a6-4601-83e9-80e43ab53049") : secret "canary-serving-cert" not found Apr 23 13:33:00.052444 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:00.052412 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpdmd\" (UniqueName: \"kubernetes.io/projected/bb994adb-00a6-4601-83e9-80e43ab53049-kube-api-access-wpdmd\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj" Apr 23 13:33:00.139130 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:00.139084 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpjf\" (UniqueName: \"kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf\") pod \"network-check-target-zw7vm\" (UID: \"32951250-c04f-4a66-a62c-e1372b1c84d0\") " pod="openshift-network-diagnostics/network-check-target-zw7vm" Apr 23 13:33:00.139274 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:00.139247 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:33:00.139274 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:00.139270 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:33:00.139358 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:00.139281 2569 projected.go:194] Error preparing data for projected volume kube-api-access-pmpjf for pod openshift-network-diagnostics/network-check-target-zw7vm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:33:00.139358 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:00.139335 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf podName:32951250-c04f-4a66-a62c-e1372b1c84d0 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:32.139321515 +0000 UTC m=+66.294243961 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pmpjf" (UniqueName: "kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf") pod "network-check-target-zw7vm" (UID: "32951250-c04f-4a66-a62c-e1372b1c84d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:33:00.441039 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:00.440996 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:33:00.441589 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:00.441137 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:33:00.441589 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:00.441195 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls podName:33d8f26a-427d-4263-9b87-13337ac3a834 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:01.441177241 +0000 UTC m=+35.596099689 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls") pod "dns-default-h8fxx" (UID: "33d8f26a-427d-4263-9b87-13337ac3a834") : secret "dns-default-metrics-tls" not found Apr 23 13:33:00.541985 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:00.541945 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj" Apr 23 13:33:00.542207 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:00.542118 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:33:00.542207 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:00.542202 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert podName:bb994adb-00a6-4601-83e9-80e43ab53049 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:01.542181175 +0000 UTC m=+35.697103632 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert") pod "ingress-canary-ggnhj" (UID: "bb994adb-00a6-4601-83e9-80e43ab53049") : secret "canary-serving-cert" not found Apr 23 13:33:01.416477 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:01.416440 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:33:01.416669 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:01.416448 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm" Apr 23 13:33:01.419993 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:01.419965 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 13:33:01.420146 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:01.420008 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 13:33:01.420146 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:01.419965 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-b4pgl\"" Apr 23 13:33:01.420146 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:01.420030 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 13:33:01.420146 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:01.420030 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rl8cq\"" Apr 23 13:33:01.448235 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:01.448199 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:33:01.448625 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:01.448310 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:33:01.448625 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:01.448363 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls podName:33d8f26a-427d-4263-9b87-13337ac3a834 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:03.448347985 +0000 UTC m=+37.603270431 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls") pod "dns-default-h8fxx" (UID: "33d8f26a-427d-4263-9b87-13337ac3a834") : secret "dns-default-metrics-tls" not found Apr 23 13:33:01.549404 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:01.549367 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj" Apr 23 13:33:01.549572 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:01.549517 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:33:01.549618 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:01.549591 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert podName:bb994adb-00a6-4601-83e9-80e43ab53049 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:03.54957515 +0000 UTC m=+37.704497595 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert") pod "ingress-canary-ggnhj" (UID: "bb994adb-00a6-4601-83e9-80e43ab53049") : secret "canary-serving-cert" not found Apr 23 13:33:01.573163 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:01.573128 2569 generic.go:358] "Generic (PLEG): container finished" podID="e29be9aa-ef19-4770-b277-bce09909acde" containerID="bc59d9292b037ae583cd1abeab796cf12d31fbc2e7c475cd36160042c8a6f2b8" exitCode=0 Apr 23 13:33:01.573318 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:01.573191 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lkhk" event={"ID":"e29be9aa-ef19-4770-b277-bce09909acde","Type":"ContainerDied","Data":"bc59d9292b037ae583cd1abeab796cf12d31fbc2e7c475cd36160042c8a6f2b8"} Apr 23 13:33:02.577594 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:02.577559 2569 generic.go:358] "Generic (PLEG): container finished" podID="e29be9aa-ef19-4770-b277-bce09909acde" containerID="cb1c5cd5b5d595e56d4ee69451c116f67ed483e3d96f0537ef76a86d1d2c4ec9" exitCode=0 Apr 23 13:33:02.577983 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:02.577634 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lkhk" event={"ID":"e29be9aa-ef19-4770-b277-bce09909acde","Type":"ContainerDied","Data":"cb1c5cd5b5d595e56d4ee69451c116f67ed483e3d96f0537ef76a86d1d2c4ec9"} Apr 23 13:33:03.462019 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:03.461820 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:33:03.462217 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:03.461976 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:33:03.462217 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:03.462110 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls podName:33d8f26a-427d-4263-9b87-13337ac3a834 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:07.462094593 +0000 UTC m=+41.617017039 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls") pod "dns-default-h8fxx" (UID: "33d8f26a-427d-4263-9b87-13337ac3a834") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:03.562586 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:03.562541 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj"
Apr 23 13:33:03.562721 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:03.562685 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:03.562774 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:03.562747 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert podName:bb994adb-00a6-4601-83e9-80e43ab53049 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:07.562732426 +0000 UTC m=+41.717654876 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert") pod "ingress-canary-ggnhj" (UID: "bb994adb-00a6-4601-83e9-80e43ab53049") : secret "canary-serving-cert" not found
Apr 23 13:33:03.582278 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:03.582242 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6lkhk" event={"ID":"e29be9aa-ef19-4770-b277-bce09909acde","Type":"ContainerStarted","Data":"032f88effc2ae5f6f2b3de80739237c7655c37a1e0d374d5fcba6affc1de6be7"}
Apr 23 13:33:03.606257 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:03.606203 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6lkhk" podStartSLOduration=6.235374871 podStartE2EDuration="37.606188281s" podCreationTimestamp="2026-04-23 13:32:26 +0000 UTC" firstStartedPulling="2026-04-23 13:32:29.099513684 +0000 UTC m=+3.254436145" lastFinishedPulling="2026-04-23 13:33:00.470327109 +0000 UTC m=+34.625249555" observedRunningTime="2026-04-23 13:33:03.605804639 +0000 UTC m=+37.760727108" watchObservedRunningTime="2026-04-23 13:33:03.606188281 +0000 UTC m=+37.761110760"
Apr 23 13:33:07.489190 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:07.489153 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx"
Apr 23 13:33:07.489590 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:07.489277 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:07.489590 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:07.489328 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls podName:33d8f26a-427d-4263-9b87-13337ac3a834 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:15.489315128 +0000 UTC m=+49.644237574 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls") pod "dns-default-h8fxx" (UID: "33d8f26a-427d-4263-9b87-13337ac3a834") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:07.590203 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:07.590171 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj"
Apr 23 13:33:07.590352 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:07.590290 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:07.590352 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:07.590338 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert podName:bb994adb-00a6-4601-83e9-80e43ab53049 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:15.590324747 +0000 UTC m=+49.745247194 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert") pod "ingress-canary-ggnhj" (UID: "bb994adb-00a6-4601-83e9-80e43ab53049") : secret "canary-serving-cert" not found
Apr 23 13:33:15.546338 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:15.546294 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx"
Apr 23 13:33:15.546856 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:15.546421 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:15.546856 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:15.546474 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls podName:33d8f26a-427d-4263-9b87-13337ac3a834 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:31.546460078 +0000 UTC m=+65.701382525 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls") pod "dns-default-h8fxx" (UID: "33d8f26a-427d-4263-9b87-13337ac3a834") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:15.646986 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:15.646947 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj"
Apr 23 13:33:15.647150 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:15.647133 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:15.647212 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:15.647201 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert podName:bb994adb-00a6-4601-83e9-80e43ab53049 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:31.647182207 +0000 UTC m=+65.802104654 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert") pod "ingress-canary-ggnhj" (UID: "bb994adb-00a6-4601-83e9-80e43ab53049") : secret "canary-serving-cert" not found
Apr 23 13:33:25.569309 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:25.569275 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6dwj"
Apr 23 13:33:31.557160 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:31.557112 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx"
Apr 23 13:33:31.557540 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:31.557275 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:31.557540 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:31.557339 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls podName:33d8f26a-427d-4263-9b87-13337ac3a834 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:03.557322766 +0000 UTC m=+97.712245212 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls") pod "dns-default-h8fxx" (UID: "33d8f26a-427d-4263-9b87-13337ac3a834") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:31.657505 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:31.657468 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj"
Apr 23 13:33:31.657668 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:31.657586 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:31.657668 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:31.657641 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert podName:bb994adb-00a6-4601-83e9-80e43ab53049 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:03.657626062 +0000 UTC m=+97.812548523 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert") pod "ingress-canary-ggnhj" (UID: "bb994adb-00a6-4601-83e9-80e43ab53049") : secret "canary-serving-cert" not found
Apr 23 13:33:32.060838 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:32.060795 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj"
Apr 23 13:33:32.063656 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:32.063632 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 13:33:32.071671 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:32.071645 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 13:33:32.071728 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:32.071708 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs podName:dc7a9b0c-42a9-4562-a03a-27dca913446a nodeName:}" failed. No retries permitted until 2026-04-23 13:34:36.071691738 +0000 UTC m=+130.226614184 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs") pod "network-metrics-daemon-dqcwj" (UID: "dc7a9b0c-42a9-4562-a03a-27dca913446a") : secret "metrics-daemon-secret" not found
Apr 23 13:33:32.161542 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:32.161501 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpjf\" (UniqueName: \"kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf\") pod \"network-check-target-zw7vm\" (UID: \"32951250-c04f-4a66-a62c-e1372b1c84d0\") " pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:33:32.164662 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:32.164644 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 13:33:32.174884 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:32.174862 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 13:33:32.186473 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:32.186439 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmpjf\" (UniqueName: \"kubernetes.io/projected/32951250-c04f-4a66-a62c-e1372b1c84d0-kube-api-access-pmpjf\") pod \"network-check-target-zw7vm\" (UID: \"32951250-c04f-4a66-a62c-e1372b1c84d0\") " pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:33:32.332614 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:32.332532 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-b4pgl\""
Apr 23 13:33:32.340503 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:32.340479 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm"
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zw7vm" Apr 23 13:33:32.500117 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:32.500086 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zw7vm"] Apr 23 13:33:32.503502 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:33:32.503472 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32951250_c04f_4a66_a62c_e1372b1c84d0.slice/crio-4a90a07cb8a9c95821f78b9a485dc3e83f96aa3e49af6dddc3d8e9e4580e85bf WatchSource:0}: Error finding container 4a90a07cb8a9c95821f78b9a485dc3e83f96aa3e49af6dddc3d8e9e4580e85bf: Status 404 returned error can't find the container with id 4a90a07cb8a9c95821f78b9a485dc3e83f96aa3e49af6dddc3d8e9e4580e85bf Apr 23 13:33:32.647112 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:32.647008 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zw7vm" event={"ID":"32951250-c04f-4a66-a62c-e1372b1c84d0","Type":"ContainerStarted","Data":"4a90a07cb8a9c95821f78b9a485dc3e83f96aa3e49af6dddc3d8e9e4580e85bf"} Apr 23 13:33:35.654375 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:35.654338 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zw7vm" event={"ID":"32951250-c04f-4a66-a62c-e1372b1c84d0","Type":"ContainerStarted","Data":"c6aad701431c704eecb1d97e0e420307512c6cc26c5abb7f5f031b97a90ba508"} Apr 23 13:33:35.654770 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:35.654459 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zw7vm" Apr 23 13:33:35.668725 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:35.668675 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zw7vm" podStartSLOduration=66.989368305 podStartE2EDuration="1m9.668659735s" podCreationTimestamp="2026-04-23 13:32:26 +0000 UTC" firstStartedPulling="2026-04-23 13:33:32.505421808 +0000 UTC m=+66.660344256" lastFinishedPulling="2026-04-23 13:33:35.184713239 +0000 UTC m=+69.339635686" observedRunningTime="2026-04-23 13:33:35.668644575 +0000 UTC m=+69.823567040" watchObservedRunningTime="2026-04-23 13:33:35.668659735 +0000 UTC m=+69.823582181" Apr 23 13:33:53.257028 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.256861 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dgfsn"] Apr 23 13:33:53.261165 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.261139 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4666z"] Apr 23 13:33:53.261318 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.261271 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dgfsn" Apr 23 13:33:53.263741 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.263716 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 23 13:33:53.263860 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.263832 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" Apr 23 13:33:53.264536 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.264516 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:33:53.264638 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.264517 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-47lbz\"" Apr 23 13:33:53.266014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.265995 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:33:53.266153 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.266090 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 23 13:33:53.266153 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.266101 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 23 13:33:53.266153 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.266117 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-rgws4\"" Apr 23 13:33:53.266282 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.266211 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 23 13:33:53.268414 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.268387 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dgfsn"] Apr 23 13:33:53.271599 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.271572 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4666z"] Apr 23 13:33:53.272391 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.272371 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 23 13:33:53.355873 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.355843 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"] Apr 23 13:33:53.358700 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.358684 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv" Apr 23 13:33:53.361211 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.361182 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 13:33:53.361211 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.361204 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 13:33:53.361373 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.361210 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 23 13:33:53.362383 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.362368 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-m9r5r\"" Apr 23 13:33:53.362383 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.362377 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 23 13:33:53.365039 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.365015 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5kw89"] Apr 23 13:33:53.370922 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.370891 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc"] Apr 23 13:33:53.371105 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.370992 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-5kw89" Apr 23 13:33:53.373170 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.373147 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 23 13:33:53.373444 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.373425 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 13:33:53.373528 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.373425 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 13:33:53.373528 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.373482 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-6kdmx\"" Apr 23 13:33:53.373634 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.373426 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 23 13:33:53.373967 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.373950 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5484bb4458-jcmv9"] Apr 23 13:33:53.374087 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.374045 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc" Apr 23 13:33:53.376236 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.376217 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 23 13:33:53.376419 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.376400 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 23 13:33:53.376556 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.376537 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 23 13:33:53.376909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.376893 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"] Apr 23 13:33:53.377496 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.377479 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:33:53.377737 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.377565 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:53.378094 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.378048 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-8fmhc\"" Apr 23 13:33:53.380160 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.380137 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 23 13:33:53.380261 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.380214 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nw8fz\"" Apr 23 13:33:53.380261 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.380249 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc"] Apr 23 13:33:53.380716 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.380700 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 13:33:53.380899 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.380885 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 13:33:53.381432 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.381415 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5kw89"] Apr 23 13:33:53.381579 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.381563 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 13:33:53.382485 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.382469 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5484bb4458-jcmv9"] Apr 23 13:33:53.387924 
Apr 23 13:33:53.412520 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.412488 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmkvf\" (UniqueName: \"kubernetes.io/projected/55eb7ea0-9b60-41aa-9e7e-2ccf55ef5388-kube-api-access-tmkvf\") pod \"volume-data-source-validator-7c6cbb6c87-dgfsn\" (UID: \"55eb7ea0-9b60-41aa-9e7e-2ccf55ef5388\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dgfsn"
Apr 23 13:33:53.412717 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.412540 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af8933e-b7d0-4a15-a43e-c2a76d750555-serving-cert\") pod \"console-operator-9d4b6777b-4666z\" (UID: \"2af8933e-b7d0-4a15-a43e-c2a76d750555\") " pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.412717 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.412571 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af8933e-b7d0-4a15-a43e-c2a76d750555-config\") pod \"console-operator-9d4b6777b-4666z\" (UID: \"2af8933e-b7d0-4a15-a43e-c2a76d750555\") " pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.412717 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.412594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4v6m\" (UniqueName: \"kubernetes.io/projected/2af8933e-b7d0-4a15-a43e-c2a76d750555-kube-api-access-j4v6m\") pod \"console-operator-9d4b6777b-4666z\" (UID: \"2af8933e-b7d0-4a15-a43e-c2a76d750555\") " pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.412904 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.412751 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2af8933e-b7d0-4a15-a43e-c2a76d750555-trusted-ca\") pod \"console-operator-9d4b6777b-4666z\" (UID: \"2af8933e-b7d0-4a15-a43e-c2a76d750555\") " pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.462602 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.462561 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq"]
Apr 23 13:33:53.465515 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.465489 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x"]
Apr 23 13:33:53.465679 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.465640 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq"
Apr 23 13:33:53.468300 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.468276 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5c4d58d76d-p9647"]
Apr 23 13:33:53.468441 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.468427 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x"
Apr 23 13:33:53.469220 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.469199 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 13:33:53.469333 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.469196 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 13:33:53.469333 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.469204 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:33:53.469333 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.469271 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 13:33:53.469547 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.469529 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-fbstr\""
Apr 23 13:33:53.470659 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.470637 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 23 13:33:53.470798 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.470693 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:33:53.470798 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.470740 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 23 13:33:53.470918 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.470846 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-lzs9m\""
Apr 23 13:33:53.471144 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.471123 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:33:53.474234 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.474216 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 23 13:33:53.475344 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.474320 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 23 13:33:53.475344 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.474359 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 23 13:33:53.475344 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.474404 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 23 13:33:53.475344 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.474454 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-n5rp9\""
Apr 23 13:33:53.475344 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.474455 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 23 13:33:53.475344 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.474456 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 23 13:33:53.476170 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.476151 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x"]
Apr 23 13:33:53.477436 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.477417 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq"]
Apr 23 13:33:53.495519 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.495488 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5c4d58d76d-p9647"]
Apr 23 13:33:53.513519 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513411 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4f65a5c-dbc0-4b33-825f-41c16ff92077-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89"
Apr 23 13:33:53.513692 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513515 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqq9t\" (UniqueName: \"kubernetes.io/projected/c4f65a5c-dbc0-4b33-825f-41c16ff92077-kube-api-access-kqq9t\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89"
Apr 23 13:33:53.513692 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513557 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/71703531-7aaa-494c-9ad6-e9b94573df76-image-registry-private-configuration\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.513692 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513597 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svbhs\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-kube-api-access-svbhs\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.513692 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513634 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmkvf\" (UniqueName: \"kubernetes.io/projected/55eb7ea0-9b60-41aa-9e7e-2ccf55ef5388-kube-api-access-tmkvf\") pod \"volume-data-source-validator-7c6cbb6c87-dgfsn\" (UID: \"55eb7ea0-9b60-41aa-9e7e-2ccf55ef5388\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dgfsn"
Apr 23 13:33:53.513692 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513676 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dh4t\" (UniqueName: \"kubernetes.io/projected/83f8f286-0a1a-4047-8e3d-83c4b68f2209-kube-api-access-2dh4t\") pod \"kube-storage-version-migrator-operator-6769c5d45-xq8kc\" (UID: \"83f8f286-0a1a-4047-8e3d-83c4b68f2209\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc"
Apr 23 13:33:53.513939 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513705 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/deda38e5-8a52-4797-a3fa-938eb8704a37-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"
Apr 23 13:33:53.513939 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513745 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/71703531-7aaa-494c-9ad6-e9b94573df76-ca-trust-extracted\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.513939 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513791 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83f8f286-0a1a-4047-8e3d-83c4b68f2209-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xq8kc\" (UID: \"83f8f286-0a1a-4047-8e3d-83c4b68f2209\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc"
Apr 23 13:33:53.513939 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513829 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f8f286-0a1a-4047-8e3d-83c4b68f2209-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xq8kc\" (UID: \"83f8f286-0a1a-4047-8e3d-83c4b68f2209\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc"
Apr 23 13:33:53.513939 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513885 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af8933e-b7d0-4a15-a43e-c2a76d750555-config\") pod \"console-operator-9d4b6777b-4666z\" (UID: \"2af8933e-b7d0-4a15-a43e-c2a76d750555\") " pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.513939 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513912 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"
Apr 23 13:33:53.514262 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513940 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4cpt\" (UniqueName: \"kubernetes.io/projected/deda38e5-8a52-4797-a3fa-938eb8704a37-kube-api-access-k4cpt\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"
Apr 23 13:33:53.514262 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.513980 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.514262 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.514008 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4f65a5c-dbc0-4b33-825f-41c16ff92077-tmp\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89"
Apr 23 13:33:53.514262 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.514050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2af8933e-b7d0-4a15-a43e-c2a76d750555-trusted-ca\") pod \"console-operator-9d4b6777b-4666z\" (UID: \"2af8933e-b7d0-4a15-a43e-c2a76d750555\") " pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.514262 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.514111 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4f65a5c-dbc0-4b33-825f-41c16ff92077-service-ca-bundle\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89"
Apr 23 13:33:53.514262 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.514172 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/71703531-7aaa-494c-9ad6-e9b94573df76-registry-certificates\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.514262 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.514200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af8933e-b7d0-4a15-a43e-c2a76d750555-serving-cert\") pod \"console-operator-9d4b6777b-4666z\" (UID: \"2af8933e-b7d0-4a15-a43e-c2a76d750555\") " pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.514262 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.514250 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/71703531-7aaa-494c-9ad6-e9b94573df76-installation-pull-secrets\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.514623 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.514284 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4v6m\" (UniqueName: \"kubernetes.io/projected/2af8933e-b7d0-4a15-a43e-c2a76d750555-kube-api-access-j4v6m\") pod \"console-operator-9d4b6777b-4666z\" (UID: \"2af8933e-b7d0-4a15-a43e-c2a76d750555\") " pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.514623 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.514370 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71703531-7aaa-494c-9ad6-e9b94573df76-trusted-ca\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.514623 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.514413 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c4f65a5c-dbc0-4b33-825f-41c16ff92077-snapshots\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89"
Apr 23 13:33:53.514623 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.514440 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f65a5c-dbc0-4b33-825f-41c16ff92077-serving-cert\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89"
Apr 23 13:33:53.514623 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.514476 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-bound-sa-token\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.515705 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.515679 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af8933e-b7d0-4a15-a43e-c2a76d750555-config\") pod \"console-operator-9d4b6777b-4666z\" (UID: \"2af8933e-b7d0-4a15-a43e-c2a76d750555\") " pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.516663 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.516634 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2af8933e-b7d0-4a15-a43e-c2a76d750555-trusted-ca\") pod \"console-operator-9d4b6777b-4666z\" (UID: \"2af8933e-b7d0-4a15-a43e-c2a76d750555\") " pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.519712 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.519687 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af8933e-b7d0-4a15-a43e-c2a76d750555-serving-cert\") pod \"console-operator-9d4b6777b-4666z\" (UID: \"2af8933e-b7d0-4a15-a43e-c2a76d750555\") " pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.526715 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.526689 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4v6m\" (UniqueName: \"kubernetes.io/projected/2af8933e-b7d0-4a15-a43e-c2a76d750555-kube-api-access-j4v6m\") pod \"console-operator-9d4b6777b-4666z\" (UID: \"2af8933e-b7d0-4a15-a43e-c2a76d750555\") " pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.527407 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.527387 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmkvf\" (UniqueName: \"kubernetes.io/projected/55eb7ea0-9b60-41aa-9e7e-2ccf55ef5388-kube-api-access-tmkvf\") pod \"volume-data-source-validator-7c6cbb6c87-dgfsn\" (UID: \"55eb7ea0-9b60-41aa-9e7e-2ccf55ef5388\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dgfsn"
Apr 23 13:33:53.572697 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.572656 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dgfsn"
Apr 23 13:33:53.578498 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.578468 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:33:53.615339 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615288 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:33:53.615339 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71703531-7aaa-494c-9ad6-e9b94573df76-trusted-ca\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.615600 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615363 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q76m6\" (UniqueName: \"kubernetes.io/projected/5cb70f7a-9760-40d9-b08b-b8115fb6bdf2-kube-api-access-q76m6\") pod \"service-ca-operator-d6fc45fc5-d2zwq\" (UID: \"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq"
Apr 23 13:33:53.615600 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615395 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c4f65a5c-dbc0-4b33-825f-41c16ff92077-snapshots\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89"
Apr 23 13:33:53.615600 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615415 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-stats-auth\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:33:53.615600 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615433 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f65a5c-dbc0-4b33-825f-41c16ff92077-serving-cert\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89"
Apr 23 13:33:53.615600 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615448 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-bound-sa-token\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.615600 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615481 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4f65a5c-dbc0-4b33-825f-41c16ff92077-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89"
Apr 23 13:33:53.615600 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615510 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqq9t\" (UniqueName: \"kubernetes.io/projected/c4f65a5c-dbc0-4b33-825f-41c16ff92077-kube-api-access-kqq9t\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89"
Apr 23 13:33:53.615600 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/71703531-7aaa-494c-9ad6-e9b94573df76-image-registry-private-configuration\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.615600 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615563 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svbhs\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-kube-api-access-svbhs\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.615600 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615593 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dh4t\" (UniqueName: \"kubernetes.io/projected/83f8f286-0a1a-4047-8e3d-83c4b68f2209-kube-api-access-2dh4t\") pod \"kube-storage-version-migrator-operator-6769c5d45-xq8kc\" (UID: \"83f8f286-0a1a-4047-8e3d-83c4b68f2209\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc"
Apr 23 13:33:53.616001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615622 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/deda38e5-8a52-4797-a3fa-938eb8704a37-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"
Apr 23 13:33:53.616001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615644 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/71703531-7aaa-494c-9ad6-e9b94573df76-ca-trust-extracted\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.616001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615686 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb70f7a-9760-40d9-b08b-b8115fb6bdf2-serving-cert\") pod \"service-ca-operator-d6fc45fc5-d2zwq\" (UID: \"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq"
Apr 23 13:33:53.616001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615723 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83f8f286-0a1a-4047-8e3d-83c4b68f2209-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xq8kc\" (UID: \"83f8f286-0a1a-4047-8e3d-83c4b68f2209\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc"
Apr 23 13:33:53.616001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615746 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f8f286-0a1a-4047-8e3d-83c4b68f2209-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xq8kc\" (UID: \"83f8f286-0a1a-4047-8e3d-83c4b68f2209\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc"
Apr 23 13:33:53.616001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615769 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-default-certificate\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:33:53.616001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615818 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:33:53.616001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615872 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"
Apr 23 13:33:53.616001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615901 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4cpt\" (UniqueName: \"kubernetes.io/projected/deda38e5-8a52-4797-a3fa-938eb8704a37-kube-api-access-k4cpt\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"
Apr 23 13:33:53.616001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.615959 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:33:53.616001 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:53.616002 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:33:53.616505 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.616012 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c228x\" (UID: \"dd1d91d8-09b0-43ef-9971-5c19edba64a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x"
(UID: \"dd1d91d8-09b0-43ef-9971-5c19edba64a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x" Apr 23 13:33:53.616505 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.616044 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4f65a5c-dbc0-4b33-825f-41c16ff92077-tmp\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89" Apr 23 13:33:53.616505 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:53.616079 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls podName:deda38e5-8a52-4797-a3fa-938eb8704a37 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:54.116040509 +0000 UTC m=+88.270962959 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fbhzv" (UID: "deda38e5-8a52-4797-a3fa-938eb8704a37") : secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:53.616505 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.616117 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4f65a5c-dbc0-4b33-825f-41c16ff92077-service-ca-bundle\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89" Apr 23 13:33:53.616505 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.616149 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb70f7a-9760-40d9-b08b-b8115fb6bdf2-config\") pod \"service-ca-operator-d6fc45fc5-d2zwq\" (UID: \"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" Apr 23 13:33:53.616505 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.616166 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c4f65a5c-dbc0-4b33-825f-41c16ff92077-snapshots\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89" Apr 23 13:33:53.616505 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.616207 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/71703531-7aaa-494c-9ad6-e9b94573df76-registry-certificates\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:53.616505 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.616238 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzbht\" (UniqueName: \"kubernetes.io/projected/c3812d73-c709-4f23-aa36-2623bc03faf0-kube-api-access-bzbht\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:53.616505 ip-10-0-136-158 kubenswrapper[2569]: I0423 
13:33:53.616272 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/71703531-7aaa-494c-9ad6-e9b94573df76-installation-pull-secrets\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:53.616505 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.616300 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp2zd\" (UniqueName: \"kubernetes.io/projected/dd1d91d8-09b0-43ef-9971-5c19edba64a2-kube-api-access-lp2zd\") pod \"cluster-samples-operator-6dc5bdb6b4-c228x\" (UID: \"dd1d91d8-09b0-43ef-9971-5c19edba64a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x" Apr 23 13:33:53.616505 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.616338 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4f65a5c-dbc0-4b33-825f-41c16ff92077-tmp\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89" Apr 23 13:33:53.617370 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.616542 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71703531-7aaa-494c-9ad6-e9b94573df76-trusted-ca\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:53.617370 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.616919 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f8f286-0a1a-4047-8e3d-83c4b68f2209-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xq8kc\" (UID: \"83f8f286-0a1a-4047-8e3d-83c4b68f2209\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc" Apr 23 13:33:53.617477 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.617454 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/71703531-7aaa-494c-9ad6-e9b94573df76-registry-certificates\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:53.617530 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.617471 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/deda38e5-8a52-4797-a3fa-938eb8704a37-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv" Apr 23 13:33:53.617644 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:53.617624 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:33:53.617713 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:53.617647 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5484bb4458-jcmv9: secret "image-registry-tls" not found Apr 23 13:33:53.617713 ip-10-0-136-158 
kubenswrapper[2569]: E0423 13:33:53.617710 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls podName:71703531-7aaa-494c-9ad6-e9b94573df76 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:54.117693402 +0000 UTC m=+88.272615856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls") pod "image-registry-5484bb4458-jcmv9" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76") : secret "image-registry-tls" not found Apr 23 13:33:53.617997 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.617970 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4f65a5c-dbc0-4b33-825f-41c16ff92077-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89" Apr 23 13:33:53.617997 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.617993 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4f65a5c-dbc0-4b33-825f-41c16ff92077-service-ca-bundle\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89" Apr 23 13:33:53.618202 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.618014 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/71703531-7aaa-494c-9ad6-e9b94573df76-ca-trust-extracted\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:53.618874 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.618841 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83f8f286-0a1a-4047-8e3d-83c4b68f2209-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xq8kc\" (UID: \"83f8f286-0a1a-4047-8e3d-83c4b68f2209\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc" Apr 23 13:33:53.619340 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.619309 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f65a5c-dbc0-4b33-825f-41c16ff92077-serving-cert\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89" Apr 23 13:33:53.619873 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.619709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/71703531-7aaa-494c-9ad6-e9b94573df76-image-registry-private-configuration\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:53.621670 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.621622 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/71703531-7aaa-494c-9ad6-e9b94573df76-installation-pull-secrets\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:53.626783 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.626715 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dh4t\" (UniqueName: \"kubernetes.io/projected/83f8f286-0a1a-4047-8e3d-83c4b68f2209-kube-api-access-2dh4t\") pod \"kube-storage-version-migrator-operator-6769c5d45-xq8kc\" (UID: \"83f8f286-0a1a-4047-8e3d-83c4b68f2209\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc" Apr 23 13:33:53.628568 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.628514 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqq9t\" (UniqueName: \"kubernetes.io/projected/c4f65a5c-dbc0-4b33-825f-41c16ff92077-kube-api-access-kqq9t\") pod \"insights-operator-585dfdc468-5kw89\" (UID: \"c4f65a5c-dbc0-4b33-825f-41c16ff92077\") " pod="openshift-insights/insights-operator-585dfdc468-5kw89" Apr 23 13:33:53.629690 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.629417 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-bound-sa-token\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:53.629690 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.629642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4cpt\" (UniqueName: \"kubernetes.io/projected/deda38e5-8a52-4797-a3fa-938eb8704a37-kube-api-access-k4cpt\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv" Apr 23 13:33:53.630105 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.630086 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svbhs\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-kube-api-access-svbhs\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:53.684493 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.684452 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-5kw89" Apr 23 13:33:53.691166 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.691044 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc" Apr 23 13:33:53.705193 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.705153 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dgfsn"] Apr 23 13:33:53.708989 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:33:53.708949 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55eb7ea0_9b60_41aa_9e7e_2ccf55ef5388.slice/crio-42ec18a84918f90969790d76183c17f339754b5be38c802b9ec525714986bb5b WatchSource:0}: Error finding container 42ec18a84918f90969790d76183c17f339754b5be38c802b9ec525714986bb5b: Status 404 returned error can't find the container with id 42ec18a84918f90969790d76183c17f339754b5be38c802b9ec525714986bb5b Apr 23 13:33:53.717706 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.717673 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:53.717830 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.717734 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c228x\" (UID: \"dd1d91d8-09b0-43ef-9971-5c19edba64a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x" Apr 23 13:33:53.717830 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.717759 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb70f7a-9760-40d9-b08b-b8115fb6bdf2-config\") pod \"service-ca-operator-d6fc45fc5-d2zwq\" (UID: \"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" Apr 23 13:33:53.717830 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.717792 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzbht\" (UniqueName: \"kubernetes.io/projected/c3812d73-c709-4f23-aa36-2623bc03faf0-kube-api-access-bzbht\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:53.717830 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.717814 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lp2zd\" (UniqueName: \"kubernetes.io/projected/dd1d91d8-09b0-43ef-9971-5c19edba64a2-kube-api-access-lp2zd\") pod \"cluster-samples-operator-6dc5bdb6b4-c228x\" (UID: \"dd1d91d8-09b0-43ef-9971-5c19edba64a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x" Apr 23 13:33:53.717965 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.717849 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " 
pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:53.717965 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.717871 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q76m6\" (UniqueName: \"kubernetes.io/projected/5cb70f7a-9760-40d9-b08b-b8115fb6bdf2-kube-api-access-q76m6\") pod \"service-ca-operator-d6fc45fc5-d2zwq\" (UID: \"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" Apr 23 13:33:53.717965 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.717888 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-stats-auth\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:53.717965 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.717919 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb70f7a-9760-40d9-b08b-b8115fb6bdf2-serving-cert\") pod \"service-ca-operator-d6fc45fc5-d2zwq\" (UID: \"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" Apr 23 13:33:53.717965 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.717949 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-default-certificate\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:53.719286 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:53.718256 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 13:33:53.719286 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:53.718316 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls podName:dd1d91d8-09b0-43ef-9971-5c19edba64a2 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:54.218299996 +0000 UTC m=+88.373222441 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-c228x" (UID: "dd1d91d8-09b0-43ef-9971-5c19edba64a2") : secret "samples-operator-tls" not found Apr 23 13:33:53.719286 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:53.718611 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle podName:c3812d73-c709-4f23-aa36-2623bc03faf0 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:54.218567888 +0000 UTC m=+88.373490334 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle") pod "router-default-5c4d58d76d-p9647" (UID: "c3812d73-c709-4f23-aa36-2623bc03faf0") : configmap references non-existent config key: service-ca.crt Apr 23 13:33:53.719286 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.718950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb70f7a-9760-40d9-b08b-b8115fb6bdf2-config\") pod \"service-ca-operator-d6fc45fc5-d2zwq\" (UID: \"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" Apr 23 13:33:53.719286 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:53.718986 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 13:33:53.719286 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:53.719030 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs podName:c3812d73-c709-4f23-aa36-2623bc03faf0 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:54.219015796 +0000 UTC m=+88.373938242 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs") pod "router-default-5c4d58d76d-p9647" (UID: "c3812d73-c709-4f23-aa36-2623bc03faf0") : secret "router-metrics-certs-default" not found Apr 23 13:33:53.721980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.721918 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb70f7a-9760-40d9-b08b-b8115fb6bdf2-serving-cert\") pod \"service-ca-operator-d6fc45fc5-d2zwq\" (UID: \"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" Apr 23 13:33:53.722142 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.722121 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-stats-auth\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:53.722238 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.722215 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-default-certificate\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:53.723665 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.723643 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4666z"] Apr 23 13:33:53.726704 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:33:53.726665 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af8933e_b7d0_4a15_a43e_c2a76d750555.slice/crio-9d212d4337c302ea338a56b17f7aa3d293c49fc4a35ed33b68f381bb11ee0649 WatchSource:0}: Error finding container 9d212d4337c302ea338a56b17f7aa3d293c49fc4a35ed33b68f381bb11ee0649: Status 404 returned error can't find the 
container with id 9d212d4337c302ea338a56b17f7aa3d293c49fc4a35ed33b68f381bb11ee0649 Apr 23 13:33:53.727284 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.727251 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzbht\" (UniqueName: \"kubernetes.io/projected/c3812d73-c709-4f23-aa36-2623bc03faf0-kube-api-access-bzbht\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:53.727565 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.727544 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q76m6\" (UniqueName: \"kubernetes.io/projected/5cb70f7a-9760-40d9-b08b-b8115fb6bdf2-kube-api-access-q76m6\") pod \"service-ca-operator-d6fc45fc5-d2zwq\" (UID: \"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" Apr 23 13:33:53.727680 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.727660 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp2zd\" (UniqueName: \"kubernetes.io/projected/dd1d91d8-09b0-43ef-9971-5c19edba64a2-kube-api-access-lp2zd\") pod \"cluster-samples-operator-6dc5bdb6b4-c228x\" (UID: \"dd1d91d8-09b0-43ef-9971-5c19edba64a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x" Apr 23 13:33:53.776660 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.776566 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" Apr 23 13:33:53.830472 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.829838 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc"] Apr 23 13:33:53.835798 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:33:53.835738 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83f8f286_0a1a_4047_8e3d_83c4b68f2209.slice/crio-1fdd393c130318ddf8da119d6a7e5af5327c173d7ca9d82bce131cd43ec91348 WatchSource:0}: Error finding container 1fdd393c130318ddf8da119d6a7e5af5327c173d7ca9d82bce131cd43ec91348: Status 404 returned error can't find the container with id 1fdd393c130318ddf8da119d6a7e5af5327c173d7ca9d82bce131cd43ec91348 Apr 23 13:33:53.846543 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.846517 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5kw89"] Apr 23 13:33:53.850546 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:33:53.850498 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4f65a5c_dbc0_4b33_825f_41c16ff92077.slice/crio-5443118999f4b574f15abadd6d23516cefe53b8e388cba527be0d78245502506 WatchSource:0}: Error finding container 5443118999f4b574f15abadd6d23516cefe53b8e388cba527be0d78245502506: Status 404 returned error can't find the container with id 5443118999f4b574f15abadd6d23516cefe53b8e388cba527be0d78245502506 Apr 23 13:33:53.904257 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:53.904224 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq"] Apr 23 13:33:53.907365 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:33:53.907337 2569 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cb70f7a_9760_40d9_b08b_b8115fb6bdf2.slice/crio-eba30a22f7c71dd4579df8dc401273435b52bca5eeadab6084c6b8aed99a5aca WatchSource:0}: Error finding container eba30a22f7c71dd4579df8dc401273435b52bca5eeadab6084c6b8aed99a5aca: Status 404 returned error can't find the container with id eba30a22f7c71dd4579df8dc401273435b52bca5eeadab6084c6b8aed99a5aca Apr 23 13:33:54.121864 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.121771 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv" Apr 23 13:33:54.121864 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.121816 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:54.122036 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:54.121908 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:33:54.122036 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:54.121922 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5484bb4458-jcmv9: secret "image-registry-tls" not found Apr 23 13:33:54.122036 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:54.121925 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:54.122036 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:54.121971 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls podName:71703531-7aaa-494c-9ad6-e9b94573df76 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:55.121958023 +0000 UTC m=+89.276880469 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls") pod "image-registry-5484bb4458-jcmv9" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76") : secret "image-registry-tls" not found Apr 23 13:33:54.122036 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:54.121987 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls podName:deda38e5-8a52-4797-a3fa-938eb8704a37 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:55.121979244 +0000 UTC m=+89.276901690 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fbhzv" (UID: "deda38e5-8a52-4797-a3fa-938eb8704a37") : secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:54.223250 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.223209 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:54.223448 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.223286 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c228x\" (UID: \"dd1d91d8-09b0-43ef-9971-5c19edba64a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x" Apr 23 13:33:54.223448 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.223366 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:54.223556 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:54.223520 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 13:33:54.223607 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:54.223572 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs podName:c3812d73-c709-4f23-aa36-2623bc03faf0 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:55.223557948 +0000 UTC m=+89.378480394 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs") pod "router-default-5c4d58d76d-p9647" (UID: "c3812d73-c709-4f23-aa36-2623bc03faf0") : secret "router-metrics-certs-default" not found Apr 23 13:33:54.224077 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:54.224038 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle podName:c3812d73-c709-4f23-aa36-2623bc03faf0 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:55.224021099 +0000 UTC m=+89.378943551 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle") pod "router-default-5c4d58d76d-p9647" (UID: "c3812d73-c709-4f23-aa36-2623bc03faf0") : configmap references non-existent config key: service-ca.crt Apr 23 13:33:54.224209 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:54.224189 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 13:33:54.224268 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:54.224254 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls podName:dd1d91d8-09b0-43ef-9971-5c19edba64a2 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:55.224235253 +0000 UTC m=+89.379157702 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-c228x" (UID: "dd1d91d8-09b0-43ef-9971-5c19edba64a2") : secret "samples-operator-tls" not found Apr 23 13:33:54.696987 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.696933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dgfsn" event={"ID":"55eb7ea0-9b60-41aa-9e7e-2ccf55ef5388","Type":"ContainerStarted","Data":"42ec18a84918f90969790d76183c17f339754b5be38c802b9ec525714986bb5b"} Apr 23 13:33:54.699637 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.699570 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc" event={"ID":"83f8f286-0a1a-4047-8e3d-83c4b68f2209","Type":"ContainerStarted","Data":"1fdd393c130318ddf8da119d6a7e5af5327c173d7ca9d82bce131cd43ec91348"} Apr 23 13:33:54.702038 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.701994 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" event={"ID":"2af8933e-b7d0-4a15-a43e-c2a76d750555","Type":"ContainerStarted","Data":"9d212d4337c302ea338a56b17f7aa3d293c49fc4a35ed33b68f381bb11ee0649"} Apr 23 13:33:54.711788 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.711750 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" event={"ID":"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2","Type":"ContainerStarted","Data":"eba30a22f7c71dd4579df8dc401273435b52bca5eeadab6084c6b8aed99a5aca"} Apr 23 13:33:54.713434 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.713399 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5kw89" event={"ID":"c4f65a5c-dbc0-4b33-825f-41c16ff92077","Type":"ContainerStarted","Data":"5443118999f4b574f15abadd6d23516cefe53b8e388cba527be0d78245502506"} Apr 23 13:33:54.954284 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.954186 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b"] Apr 23 13:33:54.961168 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.960497 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b" Apr 23 13:33:54.965597 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.964724 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xs4qs\"" Apr 23 13:33:54.965597 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.965130 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 23 13:33:54.965597 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.965411 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 23 13:33:54.969534 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:54.969459 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b"] Apr 23 13:33:55.132456 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:55.131352 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3f8dae65-c604-469a-abf3-bc8ac066bcd8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ghb6b\" (UID: \"3f8dae65-c604-469a-abf3-bc8ac066bcd8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b" Apr 23 13:33:55.132456 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:55.131446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv" Apr 23 13:33:55.132456 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:55.131482 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:55.132456 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:55.131557 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ghb6b\" (UID: \"3f8dae65-c604-469a-abf3-bc8ac066bcd8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b" Apr 23 13:33:55.132456 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.131825 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:55.132456 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.131905 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls podName:deda38e5-8a52-4797-a3fa-938eb8704a37 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:57.131875651 +0000 UTC m=+91.286798105 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fbhzv" (UID: "deda38e5-8a52-4797-a3fa-938eb8704a37") : secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:55.132456 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.132354 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:33:55.132456 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.132368 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5484bb4458-jcmv9: secret "image-registry-tls" not found Apr 23 13:33:55.132456 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.132422 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls podName:71703531-7aaa-494c-9ad6-e9b94573df76 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:57.132395733 +0000 UTC m=+91.287318183 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls") pod "image-registry-5484bb4458-jcmv9" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76") : secret "image-registry-tls" not found Apr 23 13:33:55.232518 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:55.232433 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:55.232518 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:55.232502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c228x\" (UID: \"dd1d91d8-09b0-43ef-9971-5c19edba64a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x" Apr 23 13:33:55.232732 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:55.232560 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ghb6b\" (UID: \"3f8dae65-c604-469a-abf3-bc8ac066bcd8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b" Apr 23 13:33:55.232732 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:55.232619 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:55.232732 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:55.232687 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3f8dae65-c604-469a-abf3-bc8ac066bcd8-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-ghb6b\" (UID: \"3f8dae65-c604-469a-abf3-bc8ac066bcd8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b" Apr 23 13:33:55.233695 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:55.233667 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3f8dae65-c604-469a-abf3-bc8ac066bcd8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ghb6b\" (UID: \"3f8dae65-c604-469a-abf3-bc8ac066bcd8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b" Apr 23 13:33:55.233828 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.233813 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle podName:c3812d73-c709-4f23-aa36-2623bc03faf0 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:57.233779074 +0000 UTC m=+91.388701535 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle") pod "router-default-5c4d58d76d-p9647" (UID: "c3812d73-c709-4f23-aa36-2623bc03faf0") : configmap references non-existent config key: service-ca.crt Apr 23 13:33:55.234357 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.234339 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 13:33:55.234456 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.234390 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls podName:dd1d91d8-09b0-43ef-9971-5c19edba64a2 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:57.234374551 +0000 UTC m=+91.389297012 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-c228x" (UID: "dd1d91d8-09b0-43ef-9971-5c19edba64a2") : secret "samples-operator-tls" not found Apr 23 13:33:55.234562 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.234545 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 13:33:55.234619 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.234608 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert podName:3f8dae65-c604-469a-abf3-bc8ac066bcd8 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:55.734585185 +0000 UTC m=+89.889507641 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ghb6b" (UID: "3f8dae65-c604-469a-abf3-bc8ac066bcd8") : secret "networking-console-plugin-cert" not found Apr 23 13:33:55.234688 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.234681 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 13:33:55.234793 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.234781 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs podName:c3812d73-c709-4f23-aa36-2623bc03faf0 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:57.234731855 +0000 UTC m=+91.389654315 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs") pod "router-default-5c4d58d76d-p9647" (UID: "c3812d73-c709-4f23-aa36-2623bc03faf0") : secret "router-metrics-certs-default" not found Apr 23 13:33:55.736928 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:55.736882 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ghb6b\" (UID: \"3f8dae65-c604-469a-abf3-bc8ac066bcd8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b" Apr 23 13:33:55.737362 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.737073 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 13:33:55.737362 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:55.737153 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert podName:3f8dae65-c604-469a-abf3-bc8ac066bcd8 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:56.737129744 +0000 UTC m=+90.892052191 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ghb6b" (UID: "3f8dae65-c604-469a-abf3-bc8ac066bcd8") : secret "networking-console-plugin-cert" not found Apr 23 13:33:56.745280 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:56.745236 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ghb6b\" (UID: \"3f8dae65-c604-469a-abf3-bc8ac066bcd8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b" Apr 23 13:33:56.745757 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:56.745392 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 13:33:56.745757 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:56.745465 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert podName:3f8dae65-c604-469a-abf3-bc8ac066bcd8 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:58.745447492 +0000 UTC m=+92.900369955 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ghb6b" (UID: "3f8dae65-c604-469a-abf3-bc8ac066bcd8") : secret "networking-console-plugin-cert" not found Apr 23 13:33:57.149575 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.149530 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv" Apr 23 13:33:57.149575 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.149581 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:33:57.149845 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:57.149713 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:57.149845 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:57.149765 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:33:57.149845 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:57.149784 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5484bb4458-jcmv9: secret "image-registry-tls" not found Apr 23 13:33:57.149845 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:57.149799 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls podName:deda38e5-8a52-4797-a3fa-938eb8704a37 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:01.149777212 +0000 UTC m=+95.304699673 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fbhzv" (UID: "deda38e5-8a52-4797-a3fa-938eb8704a37") : secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:57.149845 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:57.149827 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls podName:71703531-7aaa-494c-9ad6-e9b94573df76 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:01.149814607 +0000 UTC m=+95.304737053 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls") pod "image-registry-5484bb4458-jcmv9" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76") : secret "image-registry-tls" not found Apr 23 13:33:57.250789 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.250744 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:57.250957 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.250858 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647" Apr 23 13:33:57.250957 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.250913 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c228x\" (UID: \"dd1d91d8-09b0-43ef-9971-5c19edba64a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x" Apr 23 13:33:57.250957 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:57.250914 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 13:33:57.251123 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:57.251000 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle podName:c3812d73-c709-4f23-aa36-2623bc03faf0 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:01.25098395 +0000 UTC m=+95.405906398 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle") pod "router-default-5c4d58d76d-p9647" (UID: "c3812d73-c709-4f23-aa36-2623bc03faf0") : configmap references non-existent config key: service-ca.crt Apr 23 13:33:57.251123 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:57.251004 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 13:33:57.251123 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:57.251025 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs podName:c3812d73-c709-4f23-aa36-2623bc03faf0 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:01.251016942 +0000 UTC m=+95.405939388 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs") pod "router-default-5c4d58d76d-p9647" (UID: "c3812d73-c709-4f23-aa36-2623bc03faf0") : secret "router-metrics-certs-default" not found Apr 23 13:33:57.251123 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:57.251069 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls podName:dd1d91d8-09b0-43ef-9971-5c19edba64a2 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:01.251039993 +0000 UTC m=+95.405962443 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-c228x" (UID: "dd1d91d8-09b0-43ef-9971-5c19edba64a2") : secret "samples-operator-tls" not found Apr 23 13:33:57.722225 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.722107 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5kw89" event={"ID":"c4f65a5c-dbc0-4b33-825f-41c16ff92077","Type":"ContainerStarted","Data":"e0e90cb7d24fba978449d54ff4d577849e2e270f88acf2d3b0bb08b1d3eabdbd"} Apr 23 13:33:57.723561 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.723530 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dgfsn" event={"ID":"55eb7ea0-9b60-41aa-9e7e-2ccf55ef5388","Type":"ContainerStarted","Data":"9ab16bbae38d45d3ee19f622bab2d926976f9506ab05e8dc41d9f927c6569b4d"} Apr 23 13:33:57.724815 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.724771 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc" event={"ID":"83f8f286-0a1a-4047-8e3d-83c4b68f2209","Type":"ContainerStarted","Data":"1e2e364793d9472f7c8968170365202c683d568fd9c17db0485913ce215a1bb4"} Apr 23 13:33:57.726410 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.726392 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/0.log" Apr 23 13:33:57.726516 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.726427 2569 generic.go:358] "Generic (PLEG): container finished" podID="2af8933e-b7d0-4a15-a43e-c2a76d750555" containerID="184a6ea5f6a6bc51957c81e1140e471129d5e8d60b1558ed0829f83847019fd6" 
exitCode=255
Apr 23 13:33:57.726516 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.726489 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" event={"ID":"2af8933e-b7d0-4a15-a43e-c2a76d750555","Type":"ContainerDied","Data":"184a6ea5f6a6bc51957c81e1140e471129d5e8d60b1558ed0829f83847019fd6"}
Apr 23 13:33:57.726704 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.726682 2569 scope.go:117] "RemoveContainer" containerID="184a6ea5f6a6bc51957c81e1140e471129d5e8d60b1558ed0829f83847019fd6"
Apr 23 13:33:57.727929 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.727897 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" event={"ID":"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2","Type":"ContainerStarted","Data":"ca3dba1e0f14c749ccb8fc852d1e73dcd62e902c2a57eeebaf8fb77fc573d79e"}
Apr 23 13:33:57.745337 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.740696 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-5kw89" podStartSLOduration=1.162880373 podStartE2EDuration="4.740678914s" podCreationTimestamp="2026-04-23 13:33:53 +0000 UTC" firstStartedPulling="2026-04-23 13:33:53.852797355 +0000 UTC m=+88.007719801" lastFinishedPulling="2026-04-23 13:33:57.430595893 +0000 UTC m=+91.585518342" observedRunningTime="2026-04-23 13:33:57.738440682 +0000 UTC m=+91.893363151" watchObservedRunningTime="2026-04-23 13:33:57.740678914 +0000 UTC m=+91.895601388"
Apr 23 13:33:57.760894 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.760842 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" podStartSLOduration=1.241413976 podStartE2EDuration="4.760823406s" podCreationTimestamp="2026-04-23 13:33:53 +0000 UTC" firstStartedPulling="2026-04-23 13:33:53.909135006 +0000 UTC m=+88.064057451" lastFinishedPulling="2026-04-23 13:33:57.428544433 +0000 UTC m=+91.583466881" observedRunningTime="2026-04-23 13:33:57.759142057 +0000 UTC m=+91.914064518" watchObservedRunningTime="2026-04-23 13:33:57.760823406 +0000 UTC m=+91.915745875"
Apr 23 13:33:57.778962 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.778908 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc" podStartSLOduration=1.181795677 podStartE2EDuration="4.778890747s" podCreationTimestamp="2026-04-23 13:33:53 +0000 UTC" firstStartedPulling="2026-04-23 13:33:53.839917014 +0000 UTC m=+87.994839474" lastFinishedPulling="2026-04-23 13:33:57.437012083 +0000 UTC m=+91.591934544" observedRunningTime="2026-04-23 13:33:57.778266001 +0000 UTC m=+91.933188478" watchObservedRunningTime="2026-04-23 13:33:57.778890747 +0000 UTC m=+91.933813223"
Apr 23 13:33:57.793562 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:57.793304 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dgfsn" podStartSLOduration=1.083437129 podStartE2EDuration="4.79328659s" podCreationTimestamp="2026-04-23 13:33:53 +0000 UTC" firstStartedPulling="2026-04-23 13:33:53.711051601 +0000 UTC m=+87.865974048" lastFinishedPulling="2026-04-23 13:33:57.420901062 +0000 UTC m=+91.575823509" observedRunningTime="2026-04-23 13:33:57.792684837 +0000 UTC m=+91.947607305" watchObservedRunningTime="2026-04-23 13:33:57.79328659 +0000 UTC m=+91.948209053"
Apr 23 13:33:58.731228 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:58.731154 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/1.log"
Apr 23 13:33:58.731564 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:58.731546 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/0.log"
Apr 23 13:33:58.731613 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:58.731584 2569 generic.go:358] "Generic (PLEG): container finished" podID="2af8933e-b7d0-4a15-a43e-c2a76d750555" containerID="5d293b33840c05358db6930c1af18f7202171050231407f77c06c701fd81f1fa" exitCode=255
Apr 23 13:33:58.731722 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:58.731695 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" event={"ID":"2af8933e-b7d0-4a15-a43e-c2a76d750555","Type":"ContainerDied","Data":"5d293b33840c05358db6930c1af18f7202171050231407f77c06c701fd81f1fa"}
Apr 23 13:33:58.731785 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:58.731752 2569 scope.go:117] "RemoveContainer" containerID="184a6ea5f6a6bc51957c81e1140e471129d5e8d60b1558ed0829f83847019fd6"
Apr 23 13:33:58.732022 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:58.731995 2569 scope.go:117] "RemoveContainer" containerID="5d293b33840c05358db6930c1af18f7202171050231407f77c06c701fd81f1fa"
Apr 23 13:33:58.732264 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:58.732241 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4666z_openshift-console-operator(2af8933e-b7d0-4a15-a43e-c2a76d750555)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" podUID="2af8933e-b7d0-4a15-a43e-c2a76d750555"
Apr 23 13:33:58.765614 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:58.765268 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ghb6b\" (UID: \"3f8dae65-c604-469a-abf3-bc8ac066bcd8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b"
Apr 23 13:33:58.766108 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:58.765633 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 13:33:58.766108 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:58.765697 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert podName:3f8dae65-c604-469a-abf3-bc8ac066bcd8 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:02.765677181 +0000 UTC m=+96.920599632 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ghb6b" (UID: "3f8dae65-c604-469a-abf3-bc8ac066bcd8") : secret "networking-console-plugin-cert" not found
Apr 23 13:33:59.737291 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:59.737257 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/1.log"
Apr 23 13:33:59.737628 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:33:59.737613 2569 scope.go:117] "RemoveContainer" containerID="5d293b33840c05358db6930c1af18f7202171050231407f77c06c701fd81f1fa"
Apr 23 13:33:59.737788 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:33:59.737771 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4666z_openshift-console-operator(2af8933e-b7d0-4a15-a43e-c2a76d750555)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" podUID="2af8933e-b7d0-4a15-a43e-c2a76d750555"
Apr 23 13:34:00.682207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:00.682174 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nzrks_61372c64-9070-4751-b720-a4016030cf02/dns-node-resolver/0.log"
Apr 23 13:34:01.187796 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.187749 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"
Apr 23 13:34:01.187796 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.187796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:34:01.188096 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:01.187908 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:34:01.188096 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:01.187918 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5484bb4458-jcmv9: secret "image-registry-tls" not found
Apr 23 13:34:01.188096 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:01.187914 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:01.188096 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:01.187968 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls podName:71703531-7aaa-494c-9ad6-e9b94573df76 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:09.187954175 +0000 UTC m=+103.342876621 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls") pod "image-registry-5484bb4458-jcmv9" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76") : secret "image-registry-tls" not found
Apr 23 13:34:01.188096 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:01.187993 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls podName:deda38e5-8a52-4797-a3fa-938eb8704a37 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:09.187973223 +0000 UTC m=+103.342895687 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fbhzv" (UID: "deda38e5-8a52-4797-a3fa-938eb8704a37") : secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:01.289162 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.289130 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:34:01.289338 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.289201 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:34:01.289338 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.289240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c228x\" (UID: \"dd1d91d8-09b0-43ef-9971-5c19edba64a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x"
Apr 23 13:34:01.289338 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:01.289275 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 13:34:01.289451 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:01.289338 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 13:34:01.289451 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:01.289345 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs podName:c3812d73-c709-4f23-aa36-2623bc03faf0 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:09.289329961 +0000 UTC m=+103.444252406 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs") pod "router-default-5c4d58d76d-p9647" (UID: "c3812d73-c709-4f23-aa36-2623bc03faf0") : secret "router-metrics-certs-default" not found
Apr 23 13:34:01.289451 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:01.289360 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle podName:c3812d73-c709-4f23-aa36-2623bc03faf0 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:09.289354002 +0000 UTC m=+103.444276449 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle") pod "router-default-5c4d58d76d-p9647" (UID: "c3812d73-c709-4f23-aa36-2623bc03faf0") : configmap references non-existent config key: service-ca.crt
Apr 23 13:34:01.289451 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:01.289375 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls podName:dd1d91d8-09b0-43ef-9971-5c19edba64a2 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:09.289366672 +0000 UTC m=+103.444289119 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-c228x" (UID: "dd1d91d8-09b0-43ef-9971-5c19edba64a2") : secret "samples-operator-tls" not found
Apr 23 13:34:01.386082 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.386026 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-m9gfb"]
Apr 23 13:34:01.390234 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.390217 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-m9gfb"
Apr 23 13:34:01.392706 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.392682 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 23 13:34:01.392841 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.392684 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-2tktr\""
Apr 23 13:34:01.393564 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.393538 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 23 13:34:01.393564 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.393553 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 23 13:34:01.393706 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.393621 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 23 13:34:01.397424 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.397404 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-m9gfb"]
Apr 23 13:34:01.491328 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.491245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cff591e7-f3c1-4663-8abc-6415052ed401-signing-cabundle\") pod \"service-ca-865cb79987-m9gfb\" (UID: \"cff591e7-f3c1-4663-8abc-6415052ed401\") " pod="openshift-service-ca/service-ca-865cb79987-m9gfb"
Apr 23 13:34:01.491328 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.491281 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cff591e7-f3c1-4663-8abc-6415052ed401-signing-key\") pod \"service-ca-865cb79987-m9gfb\" (UID: \"cff591e7-f3c1-4663-8abc-6415052ed401\") " pod="openshift-service-ca/service-ca-865cb79987-m9gfb"
Apr 23 13:34:01.491328 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.491300 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzl7m\" (UniqueName: \"kubernetes.io/projected/cff591e7-f3c1-4663-8abc-6415052ed401-kube-api-access-fzl7m\") pod \"service-ca-865cb79987-m9gfb\" (UID: \"cff591e7-f3c1-4663-8abc-6415052ed401\") " pod="openshift-service-ca/service-ca-865cb79987-m9gfb"
Apr 23 13:34:01.592178 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.592121 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cff591e7-f3c1-4663-8abc-6415052ed401-signing-cabundle\") pod \"service-ca-865cb79987-m9gfb\" (UID: \"cff591e7-f3c1-4663-8abc-6415052ed401\") " pod="openshift-service-ca/service-ca-865cb79987-m9gfb"
Apr 23 13:34:01.592178 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.592176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cff591e7-f3c1-4663-8abc-6415052ed401-signing-key\") pod \"service-ca-865cb79987-m9gfb\" (UID: \"cff591e7-f3c1-4663-8abc-6415052ed401\") " pod="openshift-service-ca/service-ca-865cb79987-m9gfb"
Apr 23 13:34:01.592434 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.592199 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzl7m\" (UniqueName: \"kubernetes.io/projected/cff591e7-f3c1-4663-8abc-6415052ed401-kube-api-access-fzl7m\") pod \"service-ca-865cb79987-m9gfb\" (UID: \"cff591e7-f3c1-4663-8abc-6415052ed401\") " pod="openshift-service-ca/service-ca-865cb79987-m9gfb"
Apr 23 13:34:01.592798 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.592776 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cff591e7-f3c1-4663-8abc-6415052ed401-signing-cabundle\") pod \"service-ca-865cb79987-m9gfb\" (UID: \"cff591e7-f3c1-4663-8abc-6415052ed401\") " pod="openshift-service-ca/service-ca-865cb79987-m9gfb"
Apr 23 13:34:01.595007 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.594974 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cff591e7-f3c1-4663-8abc-6415052ed401-signing-key\") pod \"service-ca-865cb79987-m9gfb\" (UID: \"cff591e7-f3c1-4663-8abc-6415052ed401\") " pod="openshift-service-ca/service-ca-865cb79987-m9gfb"
Apr 23 13:34:01.600455 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.600431 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzl7m\" (UniqueName: \"kubernetes.io/projected/cff591e7-f3c1-4663-8abc-6415052ed401-kube-api-access-fzl7m\") pod \"service-ca-865cb79987-m9gfb\" (UID: \"cff591e7-f3c1-4663-8abc-6415052ed401\") " pod="openshift-service-ca/service-ca-865cb79987-m9gfb"
Apr 23 13:34:01.699192 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.699159 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-m9gfb"
Apr 23 13:34:01.818751 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.818659 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-m9gfb"]
Apr 23 13:34:01.822391 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:34:01.822364 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcff591e7_f3c1_4663_8abc_6415052ed401.slice/crio-b9fdbde06c35a8d45be337fdc266036b9fcc9e1f79d480f19c5f1a8b95f97aa6 WatchSource:0}: Error finding container b9fdbde06c35a8d45be337fdc266036b9fcc9e1f79d480f19c5f1a8b95f97aa6: Status 404 returned error can't find the container with id b9fdbde06c35a8d45be337fdc266036b9fcc9e1f79d480f19c5f1a8b95f97aa6
Apr 23 13:34:01.882884 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:01.882865 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xwbp9_29bbf816-c174-4330-b3f2-ded908db0f6a/node-ca/0.log"
Apr 23 13:34:02.748206 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:02.748166 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-m9gfb" event={"ID":"cff591e7-f3c1-4663-8abc-6415052ed401","Type":"ContainerStarted","Data":"33fcd7470445fb93df92602c619b786792b3738f75b8843a2ac62dcfb57a12be"}
Apr 23 13:34:02.748677 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:02.748212 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-m9gfb" event={"ID":"cff591e7-f3c1-4663-8abc-6415052ed401","Type":"ContainerStarted","Data":"b9fdbde06c35a8d45be337fdc266036b9fcc9e1f79d480f19c5f1a8b95f97aa6"}
Apr 23 13:34:02.774401 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:02.774337 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-m9gfb" podStartSLOduration=1.774313711 podStartE2EDuration="1.774313711s" podCreationTimestamp="2026-04-23 13:34:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:02.772608118 +0000 UTC m=+96.927530586" watchObservedRunningTime="2026-04-23 13:34:02.774313711 +0000 UTC m=+96.929236379"
Apr 23 13:34:02.806173 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:02.806126 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ghb6b\" (UID: \"3f8dae65-c604-469a-abf3-bc8ac066bcd8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b"
Apr 23 13:34:02.807442 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:02.806496 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 13:34:02.807675 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:02.807664 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert podName:3f8dae65-c604-469a-abf3-bc8ac066bcd8 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:10.80762635 +0000 UTC m=+104.962548802 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ghb6b" (UID: "3f8dae65-c604-469a-abf3-bc8ac066bcd8") : secret "networking-console-plugin-cert" not found
Apr 23 13:34:03.578903 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:03.578863 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:34:03.578903 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:03.578896 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:34:03.579278 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:03.579265 2569 scope.go:117] "RemoveContainer" containerID="5d293b33840c05358db6930c1af18f7202171050231407f77c06c701fd81f1fa"
Apr 23 13:34:03.579436 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:03.579420 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4666z_openshift-console-operator(2af8933e-b7d0-4a15-a43e-c2a76d750555)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" podUID="2af8933e-b7d0-4a15-a43e-c2a76d750555"
Apr 23 13:34:03.614400 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:03.614294 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx"
Apr 23 13:34:03.614400 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:03.614396 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:34:03.614655 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:03.614455 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls podName:33d8f26a-427d-4263-9b87-13337ac3a834 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:07.614435481 +0000 UTC m=+161.769357928 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls") pod "dns-default-h8fxx" (UID: "33d8f26a-427d-4263-9b87-13337ac3a834") : secret "dns-default-metrics-tls" not found
Apr 23 13:34:03.715838 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:03.715791 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj"
Apr 23 13:34:03.716052 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:03.715963 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:34:03.716052 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:03.716040 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert podName:bb994adb-00a6-4601-83e9-80e43ab53049 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:07.716020989 +0000 UTC m=+161.870943448 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert") pod "ingress-canary-ggnhj" (UID: "bb994adb-00a6-4601-83e9-80e43ab53049") : secret "canary-serving-cert" not found
Apr 23 13:34:06.658235 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:06.658195 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zw7vm"
Apr 23 13:34:09.266483 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.266443 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"
Apr 23 13:34:09.266967 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.266493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:34:09.266967 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:09.266593 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:09.266967 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:09.266667 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls podName:deda38e5-8a52-4797-a3fa-938eb8704a37 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:25.266647742 +0000 UTC m=+119.421570188 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fbhzv" (UID: "deda38e5-8a52-4797-a3fa-938eb8704a37") : secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:09.269155 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.269132 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls\") pod \"image-registry-5484bb4458-jcmv9\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:34:09.297243 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.297207 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:34:09.367303 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.367269 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:34:09.367485 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.367370 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:34:09.367485 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.367410 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c228x\" (UID: \"dd1d91d8-09b0-43ef-9971-5c19edba64a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x"
Apr 23 13:34:09.367964 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.367937 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3812d73-c709-4f23-aa36-2623bc03faf0-service-ca-bundle\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:34:09.369846 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.369820 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd1d91d8-09b0-43ef-9971-5c19edba64a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c228x\" (UID: \"dd1d91d8-09b0-43ef-9971-5c19edba64a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x"
Apr 23 13:34:09.370806 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.370720 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3812d73-c709-4f23-aa36-2623bc03faf0-metrics-certs\") pod \"router-default-5c4d58d76d-p9647\" (UID: \"c3812d73-c709-4f23-aa36-2623bc03faf0\") " pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:34:09.383601 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.383251 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x"
Apr 23 13:34:09.388224 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.388198 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:34:09.440088 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.439332 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5484bb4458-jcmv9"]
Apr 23 13:34:09.544778 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.544746 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x"]
Apr 23 13:34:09.568035 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.566875 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5c4d58d76d-p9647"]
Apr 23 13:34:09.766255 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.766217 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" event={"ID":"71703531-7aaa-494c-9ad6-e9b94573df76","Type":"ContainerStarted","Data":"72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847"}
Apr 23 13:34:09.766255 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.766257 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" event={"ID":"71703531-7aaa-494c-9ad6-e9b94573df76","Type":"ContainerStarted","Data":"2b5c38ad8f4318db54decc7daa679427755a12976fa6d7dc0bf2b2ba06a6c699"}
Apr 23 13:34:09.766503 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.766339 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9"
Apr 23 13:34:09.767686 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.767656 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5c4d58d76d-p9647" event={"ID":"c3812d73-c709-4f23-aa36-2623bc03faf0","Type":"ContainerStarted","Data":"4dfa4691b7be5bbb25d63d2e244bf6bebd095d0fe9b326bf3bd8a9bd063cda13"}
Apr 23 13:34:09.767820 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.767691 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5c4d58d76d-p9647" event={"ID":"c3812d73-c709-4f23-aa36-2623bc03faf0","Type":"ContainerStarted","Data":"25c8a0dd5a92cacf57020463e252861f34a07d74988e0a719ae66c389f5b3ed4"}
Apr 23 13:34:09.768667 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.768643 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x" event={"ID":"dd1d91d8-09b0-43ef-9971-5c19edba64a2","Type":"ContainerStarted","Data":"3cec2b0db27ca9e3e839adb69547f532a2ac5a898797df3194e244b27bca8834"}
Apr 23 13:34:09.784739 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.784642 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" podStartSLOduration=16.784628099 podStartE2EDuration="16.784628099s" podCreationTimestamp="2026-04-23 13:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:09.783991567 +0000 UTC m=+103.938914036" watchObservedRunningTime="2026-04-23 13:34:09.784628099 +0000 UTC m=+103.939550567"
Apr 23 13:34:09.802189 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:09.802132 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5c4d58d76d-p9647" podStartSLOduration=16.802111161 podStartE2EDuration="16.802111161s" podCreationTimestamp="2026-04-23 13:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:09.801437099 +0000 UTC m=+103.956359569" watchObservedRunningTime="2026-04-23 13:34:09.802111161 +0000 UTC m=+103.957033631"
Apr 23 13:34:10.388588 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:10.388548 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:34:10.391469 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:10.391445 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:34:10.773546 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:10.773513 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:34:10.774991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:10.774964 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5c4d58d76d-p9647"
Apr 23 13:34:10.879749 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:10.879717 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ghb6b\" (UID: \"3f8dae65-c604-469a-abf3-bc8ac066bcd8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b"
Apr 23 13:34:10.883117 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:10.883087 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3f8dae65-c604-469a-abf3-bc8ac066bcd8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ghb6b\" (UID: \"3f8dae65-c604-469a-abf3-bc8ac066bcd8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b"
Apr 23 13:34:11.178533 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:11.178461 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b"
Apr 23 13:34:11.315425 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:11.315393 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b"]
Apr 23 13:34:11.320529 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:34:11.320485 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f8dae65_c604_469a_abf3_bc8ac066bcd8.slice/crio-607622704eb881860607db6cb17ea73230cdf560337bcc1394c303b261fefe74 WatchSource:0}: Error finding container 607622704eb881860607db6cb17ea73230cdf560337bcc1394c303b261fefe74: Status 404 returned error can't find the container with id 607622704eb881860607db6cb17ea73230cdf560337bcc1394c303b261fefe74
Apr 23 13:34:11.777715 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:11.777675 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x" event={"ID":"dd1d91d8-09b0-43ef-9971-5c19edba64a2","Type":"ContainerStarted","Data":"3dad4846c8014090f214f3c38821b74be3eae55240c4aaa52f8da2d39b9ad53e"}
Apr 23 13:34:11.777715 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:11.777720 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x" event={"ID":"dd1d91d8-09b0-43ef-9971-5c19edba64a2","Type":"ContainerStarted","Data":"80075164d0a0d42e8a77191136e6f9061f5315428ba643f68cfcbad61a913303"}
Apr 23 13:34:11.778735 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:11.778715 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b" event={"ID":"3f8dae65-c604-469a-abf3-bc8ac066bcd8","Type":"ContainerStarted","Data":"607622704eb881860607db6cb17ea73230cdf560337bcc1394c303b261fefe74"}
Apr 23 13:34:11.794475 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:11.794422 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c228x" podStartSLOduration=17.228039623 podStartE2EDuration="18.794406988s" podCreationTimestamp="2026-04-23 13:33:53 +0000 UTC" firstStartedPulling="2026-04-23 13:34:09.626995002 +0000 UTC m=+103.781917458" lastFinishedPulling="2026-04-23 13:34:11.19336237 +0000 UTC m=+105.348284823" observedRunningTime="2026-04-23 13:34:11.793555702 +0000 UTC m=+105.948478171" watchObservedRunningTime="2026-04-23 13:34:11.794406988 +0000 UTC m=+105.949329456"
Apr 23 13:34:12.782170 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:12.782133 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b" event={"ID":"3f8dae65-c604-469a-abf3-bc8ac066bcd8","Type":"ContainerStarted","Data":"9280408e5fdcc92b617d29dd01590f7ad49c7656fb393cb61bcd9fabf3fdc6ce"}
Apr 23 13:34:12.798931 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:12.798876 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ghb6b" podStartSLOduration=17.705660652 podStartE2EDuration="18.798859583s" podCreationTimestamp="2026-04-23 13:33:54 +0000 UTC" firstStartedPulling="2026-04-23 13:34:11.322461751 +0000 UTC m=+105.477384210" lastFinishedPulling="2026-04-23 13:34:12.415660685 +0000 UTC m=+106.570583141" observedRunningTime="2026-04-23 13:34:12.798462175 +0000 UTC m=+106.953384664" watchObservedRunningTime="2026-04-23 13:34:12.798859583 +0000 UTC m=+106.953782054"
Apr 23 13:34:16.418051 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:16.418014 2569 scope.go:117] "RemoveContainer" containerID="5d293b33840c05358db6930c1af18f7202171050231407f77c06c701fd81f1fa"
Apr 23 13:34:16.793128 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:16.793099 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/2.log"
Apr 23 13:34:16.793454 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:16.793434 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/1.log"
Apr 23 13:34:16.793574 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:16.793474 2569 generic.go:358] "Generic (PLEG): container finished" podID="2af8933e-b7d0-4a15-a43e-c2a76d750555" containerID="c6991cf8f0e8c0f389b876b28ae71b59a5229383a838f9f77d6106935e5ece8e" exitCode=255
Apr 23 13:34:16.793574 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:16.793527 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" event={"ID":"2af8933e-b7d0-4a15-a43e-c2a76d750555","Type":"ContainerDied","Data":"c6991cf8f0e8c0f389b876b28ae71b59a5229383a838f9f77d6106935e5ece8e"}
Apr 23 13:34:16.793574 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:16.793568 2569 scope.go:117] "RemoveContainer" containerID="5d293b33840c05358db6930c1af18f7202171050231407f77c06c701fd81f1fa"
Apr 23 13:34:16.793955 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:16.793934 2569 scope.go:117] "RemoveContainer" containerID="c6991cf8f0e8c0f389b876b28ae71b59a5229383a838f9f77d6106935e5ece8e"
Apr 23 13:34:16.794176 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:16.794155 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-4666z_openshift-console-operator(2af8933e-b7d0-4a15-a43e-c2a76d750555)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" podUID="2af8933e-b7d0-4a15-a43e-c2a76d750555"
Apr 23 13:34:17.800666 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:17.800636 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/2.log"
Apr 23 13:34:23.579446 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:23.579413 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:34:23.579446 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:23.579453 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-4666z"
Apr 23 13:34:23.579882 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:23.579827 2569 scope.go:117] "RemoveContainer" containerID="c6991cf8f0e8c0f389b876b28ae71b59a5229383a838f9f77d6106935e5ece8e"
Apr 23 13:34:23.580025 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:23.580006 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-4666z_openshift-console-operator(2af8933e-b7d0-4a15-a43e-c2a76d750555)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" podUID="2af8933e-b7d0-4a15-a43e-c2a76d750555"
Apr 23 13:34:25.296968 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:25.296921 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"
Apr 23 13:34:25.299536 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:25.299505 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/deda38e5-8a52-4797-a3fa-938eb8704a37-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbhzv\" (UID: \"deda38e5-8a52-4797-a3fa-938eb8704a37\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"
Apr 23 13:34:25.468955 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:25.468906 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"
Apr 23 13:34:25.591209 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:25.591132 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv"]
Apr 23 13:34:25.594112 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:34:25.594082 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeda38e5_8a52_4797_a3fa_938eb8704a37.slice/crio-bd602074577dd8f982ba1019255a42796d6523d8da57dbd979305feb7459144a WatchSource:0}: Error finding container bd602074577dd8f982ba1019255a42796d6523d8da57dbd979305feb7459144a: Status 404 returned error can't find the container with id bd602074577dd8f982ba1019255a42796d6523d8da57dbd979305feb7459144a
Apr 23 13:34:25.826269 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:25.826224 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv" event={"ID":"deda38e5-8a52-4797-a3fa-938eb8704a37","Type":"ContainerStarted","Data":"bd602074577dd8f982ba1019255a42796d6523d8da57dbd979305feb7459144a"}
Apr 23 13:34:26.474706 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.474675 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5484bb4458-jcmv9"]
Apr 23 13:34:26.480088 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.479928 2569 patch_prober.go:28] interesting pod/image-registry-5484bb4458-jcmv9 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 13:34:26.480088 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.480003 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" podUID="71703531-7aaa-494c-9ad6-e9b94573df76" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:34:26.517294 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.517265 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-mp5jl"]
Apr 23 13:34:26.519740 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.519712 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.528488 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.528458 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 13:34:26.528683 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.528648 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 13:34:26.528804 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.528729 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-9khvh\""
Apr 23 13:34:26.545860 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.545795 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mp5jl"]
Apr 23 13:34:26.606956 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.606917 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.606956 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.606967 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-data-volume\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.607270 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.607026 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-crio-socket\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.607270 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.607078 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.607270 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.607128 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlntz\" (UniqueName: \"kubernetes.io/projected/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-kube-api-access-dlntz\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.708522 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.708429 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-crio-socket\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.708522 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.708484 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.708736 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.708589 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-crio-socket\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.708736 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.708710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlntz\" (UniqueName: \"kubernetes.io/projected/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-kube-api-access-dlntz\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.708831 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.708811 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.708881 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.708854 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-data-volume\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.709521 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.709498 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-data-volume\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.709671 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.709654 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.711878 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.711848 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.718772 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.718743 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlntz\" (UniqueName: \"kubernetes.io/projected/c5ddcfc3-4c39-49b6-93b5-972a1e87f960-kube-api-access-dlntz\") pod \"insights-runtime-extractor-mp5jl\" (UID: \"c5ddcfc3-4c39-49b6-93b5-972a1e87f960\") " pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:26.831605 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:26.831573 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mp5jl"
Apr 23 13:34:27.253939 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.253899 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mp5jl"]
Apr 23 13:34:27.257053 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:34:27.257023 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5ddcfc3_4c39_49b6_93b5_972a1e87f960.slice/crio-77e823ad76beb4baed1273cac0042f56b590c52eca92a3346a4c34021c503d2e WatchSource:0}: Error finding container 77e823ad76beb4baed1273cac0042f56b590c52eca92a3346a4c34021c503d2e: Status 404 returned error can't find the container with id 77e823ad76beb4baed1273cac0042f56b590c52eca92a3346a4c34021c503d2e
Apr 23 13:34:27.657229 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.657199 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj"]
Apr 23 13:34:27.659261 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.659242 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj"
Apr 23 13:34:27.661581 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.661548 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-7ntss\""
Apr 23 13:34:27.661703 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.661588 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 23 13:34:27.671127 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.671100 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj"]
Apr 23 13:34:27.821718 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.821688 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d85dfdc8-89bf-4595-aee1-344627449373-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bbxdj\" (UID: \"d85dfdc8-89bf-4595-aee1-344627449373\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj"
Apr 23 13:34:27.833825 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.833789 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mp5jl" event={"ID":"c5ddcfc3-4c39-49b6-93b5-972a1e87f960","Type":"ContainerStarted","Data":"ec74c0654270e5f5cfa8b5de2753e4a0f21853a5da00ef0794e4683b89b3ab1c"}
Apr 23 13:34:27.833825 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.833831 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mp5jl" event={"ID":"c5ddcfc3-4c39-49b6-93b5-972a1e87f960","Type":"ContainerStarted","Data":"77e823ad76beb4baed1273cac0042f56b590c52eca92a3346a4c34021c503d2e"}
Apr 23 13:34:27.835249 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.835213 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv" event={"ID":"deda38e5-8a52-4797-a3fa-938eb8704a37","Type":"ContainerStarted","Data":"57f45af5b3a2c7344b5aa1cd914cd2408fac4bbcda4557b7dba546d3420975d6"}
Apr 23 13:34:27.853047 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.852991 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbhzv" podStartSLOduration=33.298368783 podStartE2EDuration="34.852970352s" podCreationTimestamp="2026-04-23 13:33:53 +0000 UTC" firstStartedPulling="2026-04-23 13:34:25.596007831 +0000 UTC m=+119.750930276" lastFinishedPulling="2026-04-23 13:34:27.150609399 +0000 UTC m=+121.305531845" observedRunningTime="2026-04-23 13:34:27.851629192 +0000 UTC m=+122.006551662" watchObservedRunningTime="2026-04-23 13:34:27.852970352 +0000 UTC m=+122.007892895"
Apr 23 13:34:27.922680 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.922595 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d85dfdc8-89bf-4595-aee1-344627449373-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bbxdj\" (UID: \"d85dfdc8-89bf-4595-aee1-344627449373\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj"
Apr 23 13:34:27.925212 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.925186 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d85dfdc8-89bf-4595-aee1-344627449373-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bbxdj\" (UID: \"d85dfdc8-89bf-4595-aee1-344627449373\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj"
Apr 23 13:34:27.967975 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:27.967934 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj"
Apr 23 13:34:28.091789 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:28.091753 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj"]
Apr 23 13:34:28.094881 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:34:28.094850 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd85dfdc8_89bf_4595_aee1_344627449373.slice/crio-04595f3adae9a65544a06daddafb2c8aa7569626244110a9f3de6636b0fa5870 WatchSource:0}: Error finding container 04595f3adae9a65544a06daddafb2c8aa7569626244110a9f3de6636b0fa5870: Status 404 returned error can't find the container with id 04595f3adae9a65544a06daddafb2c8aa7569626244110a9f3de6636b0fa5870
Apr 23 13:34:28.840389 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:28.840332 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mp5jl" event={"ID":"c5ddcfc3-4c39-49b6-93b5-972a1e87f960","Type":"ContainerStarted","Data":"5c72926eb125bc124cec18564a20bb214372084d9a4799c7620060699530ac15"}
Apr 23 13:34:28.841562 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:28.841521 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj" event={"ID":"d85dfdc8-89bf-4595-aee1-344627449373","Type":"ContainerStarted","Data":"04595f3adae9a65544a06daddafb2c8aa7569626244110a9f3de6636b0fa5870"}
Apr 23 13:34:29.846106 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:29.845998 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mp5jl" event={"ID":"c5ddcfc3-4c39-49b6-93b5-972a1e87f960","Type":"ContainerStarted","Data":"061c413ee4f2e1c774b6a3667616f8111980d8ca02911d1485b52bef12859453"}
Apr 23 13:34:29.847295 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:29.847270 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj" event={"ID":"d85dfdc8-89bf-4595-aee1-344627449373","Type":"ContainerStarted","Data":"5b6a9738a467990519ee2d188f8aff66a3200adc89fe8f903c38634758746ad6"}
Apr 23 13:34:29.847473 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:29.847448 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj"
Apr 23 13:34:29.852311 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:29.852287 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj"
Apr 23 13:34:29.866576 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:29.866529 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-mp5jl" podStartSLOduration=1.612956027 podStartE2EDuration="3.866515872s" podCreationTimestamp="2026-04-23 13:34:26 +0000 UTC" firstStartedPulling="2026-04-23 13:34:27.328986943 +0000 UTC m=+121.483909400" lastFinishedPulling="2026-04-23 13:34:29.582546796 +0000 UTC m=+123.737469245" observedRunningTime="2026-04-23 13:34:29.86487535 +0000 UTC m=+124.019797851" watchObservedRunningTime="2026-04-23 13:34:29.866515872 +0000 UTC m=+124.021438340"
Apr 23 13:34:29.886984 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:29.886921 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bbxdj" podStartSLOduration=1.404510605 podStartE2EDuration="2.886904187s" podCreationTimestamp="2026-04-23 13:34:27 +0000 UTC" firstStartedPulling="2026-04-23 13:34:28.096710853 +0000 UTC m=+122.251633300" lastFinishedPulling="2026-04-23 13:34:29.579104433 +0000 UTC m=+123.734026882" observedRunningTime="2026-04-23 13:34:29.885334799 +0000 UTC m=+124.040257257" watchObservedRunningTime="2026-04-23 13:34:29.886904187 +0000 UTC m=+124.041826655"
Apr 23 13:34:30.726450 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.726418 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-5h8g8"]
Apr 23 13:34:30.729662 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.729647 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8"
Apr 23 13:34:30.733414 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.733378 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 23 13:34:30.733558 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.733440 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 13:34:30.733558 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.733472 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-d449c\""
Apr 23 13:34:30.733558 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.733386 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 23 13:34:30.738384 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.738354 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-5h8g8"]
Apr 23 13:34:30.847170 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.847124 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjhn9\" (UniqueName: \"kubernetes.io/projected/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-kube-api-access-zjhn9\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8"
Apr 23 13:34:30.847620 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.847279 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8"
Apr 23 13:34:30.847620 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.847348 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8"
Apr 23 13:34:30.847620 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.847389 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8"
Apr 23 13:34:30.948888 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.948846 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8"
Apr 23 13:34:30.949088 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:30.948972 2569 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 23 13:34:30.949088 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.948943 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8"
Apr 23 13:34:30.949088 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:30.949043 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-prometheus-operator-tls podName:ad5caa19-ed79-4a2f-83b0-55eb874b39f5 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:31.449021711 +0000 UTC m=+125.603944157 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-5h8g8" (UID: "ad5caa19-ed79-4a2f-83b0-55eb874b39f5") : secret "prometheus-operator-tls" not found Apr 23 13:34:30.949271 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.949124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8" Apr 23 13:34:30.949271 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.949263 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhn9\" (UniqueName: \"kubernetes.io/projected/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-kube-api-access-zjhn9\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8" Apr 23 13:34:30.949827 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.949804 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8" Apr 23 13:34:30.951649 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.951626 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8" Apr 23 13:34:30.959008 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:30.958990 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjhn9\" (UniqueName: \"kubernetes.io/projected/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-kube-api-access-zjhn9\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8" Apr 23 13:34:31.453048 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:31.453015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8" Apr 23 13:34:31.455548 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:31.455523 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad5caa19-ed79-4a2f-83b0-55eb874b39f5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5h8g8\" (UID: \"ad5caa19-ed79-4a2f-83b0-55eb874b39f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8" Apr 23 13:34:31.639280 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:31.639235 2569 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8" Apr 23 13:34:31.768945 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:31.768916 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-5h8g8"] Apr 23 13:34:31.773282 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:34:31.773243 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad5caa19_ed79_4a2f_83b0_55eb874b39f5.slice/crio-0261f5a8279fe8a408d90e53989467675ab776d1fa134179523c3399b463cac8 WatchSource:0}: Error finding container 0261f5a8279fe8a408d90e53989467675ab776d1fa134179523c3399b463cac8: Status 404 returned error can't find the container with id 0261f5a8279fe8a408d90e53989467675ab776d1fa134179523c3399b463cac8 Apr 23 13:34:31.853580 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:31.853539 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8" event={"ID":"ad5caa19-ed79-4a2f-83b0-55eb874b39f5","Type":"ContainerStarted","Data":"0261f5a8279fe8a408d90e53989467675ab776d1fa134179523c3399b463cac8"} Apr 23 13:34:33.860985 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:33.860943 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8" event={"ID":"ad5caa19-ed79-4a2f-83b0-55eb874b39f5","Type":"ContainerStarted","Data":"69bee76c77582dba52ea5bc2c5c9dd76eecbf2fa0064d2e7650d4f429b88f15b"} Apr 23 13:34:33.860985 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:33.860988 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8" event={"ID":"ad5caa19-ed79-4a2f-83b0-55eb874b39f5","Type":"ContainerStarted","Data":"ac3014c75390e449a69fd31d093e5e10861ad26b1ddd3d143ff6c80df664248e"} Apr 23 13:34:33.879833 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:33.879782 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-5h8g8" podStartSLOduration=2.678096471 podStartE2EDuration="3.879767271s" podCreationTimestamp="2026-04-23 13:34:30 +0000 UTC" firstStartedPulling="2026-04-23 13:34:31.775566841 +0000 UTC m=+125.930489286" lastFinishedPulling="2026-04-23 13:34:32.97723764 +0000 UTC m=+127.132160086" observedRunningTime="2026-04-23 13:34:33.879050086 +0000 UTC m=+128.033972597" watchObservedRunningTime="2026-04-23 13:34:33.879767271 +0000 UTC m=+128.034689748" Apr 23 13:34:36.095380 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.095345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:34:36.097841 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.097815 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7a9b0c-42a9-4562-a03a-27dca913446a-metrics-certs\") pod \"network-metrics-daemon-dqcwj\" (UID: \"dc7a9b0c-42a9-4562-a03a-27dca913446a\") " pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:34:36.137819 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.137785 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-xmqx4"] Apr 23 13:34:36.143131 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.143028 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.145780 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.145748 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 13:34:36.145956 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.145826 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-24rls\"" Apr 23 13:34:36.145956 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.145904 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 13:34:36.146348 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.146328 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 13:34:36.228743 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.228714 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rl8cq\"" Apr 23 13:34:36.237158 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.237130 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dqcwj" Apr 23 13:34:36.297561 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.297525 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-wtmp\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.297707 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.297614 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-accelerators-collector-config\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.297707 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.297648 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-tls\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.297707 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.297683 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwnbn\" (UniqueName: \"kubernetes.io/projected/c6aa409f-f25a-46d2-83bc-229d8993033a-kube-api-access-cwnbn\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.297876 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.297729 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-textfile\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.297876 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.297759 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.297876 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.297799 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c6aa409f-f25a-46d2-83bc-229d8993033a-metrics-client-ca\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.297876 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.297853 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6aa409f-f25a-46d2-83bc-229d8993033a-sys\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.298101 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.297884 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c6aa409f-f25a-46d2-83bc-229d8993033a-root\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.367278 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.367206 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dqcwj"] Apr 23 13:34:36.371163 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:34:36.371138 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc7a9b0c_42a9_4562_a03a_27dca913446a.slice/crio-1c434c6221f64f93183f998f133e241cc6a76aebc30044a2cc0eeb31997a180c WatchSource:0}: Error finding container 1c434c6221f64f93183f998f133e241cc6a76aebc30044a2cc0eeb31997a180c: Status 404 returned error can't find the container with id 1c434c6221f64f93183f998f133e241cc6a76aebc30044a2cc0eeb31997a180c Apr 23 13:34:36.399260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399228 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-accelerators-collector-config\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.399439 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399269 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-tls\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.399439 
ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399293 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwnbn\" (UniqueName: \"kubernetes.io/projected/c6aa409f-f25a-46d2-83bc-229d8993033a-kube-api-access-cwnbn\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.399439 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399325 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-textfile\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.399439 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399353 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.399439 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399387 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c6aa409f-f25a-46d2-83bc-229d8993033a-metrics-client-ca\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.399439 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399430 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6aa409f-f25a-46d2-83bc-229d8993033a-sys\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.399740 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399459 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c6aa409f-f25a-46d2-83bc-229d8993033a-root\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.399740 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399487 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-wtmp\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.399740 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6aa409f-f25a-46d2-83bc-229d8993033a-sys\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.399740 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399588 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c6aa409f-f25a-46d2-83bc-229d8993033a-root\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " 
pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.399740 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399628 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-wtmp\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.399740 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399721 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-textfile\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.400029 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.399913 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-accelerators-collector-config\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.400177 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.400135 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c6aa409f-f25a-46d2-83bc-229d8993033a-metrics-client-ca\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.402049 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.402020 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.402178 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.402085 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c6aa409f-f25a-46d2-83bc-229d8993033a-node-exporter-tls\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.411439 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.411415 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwnbn\" (UniqueName: \"kubernetes.io/projected/c6aa409f-f25a-46d2-83bc-229d8993033a-kube-api-access-cwnbn\") pod \"node-exporter-xmqx4\" (UID: \"c6aa409f-f25a-46d2-83bc-229d8993033a\") " pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.457195 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.457105 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xmqx4" Apr 23 13:34:36.465581 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:34:36.465554 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6aa409f_f25a_46d2_83bc_229d8993033a.slice/crio-a0bed88068835f378dbb5bc5aa811e7932051aeac42b377066f0f4f4b332aad6 WatchSource:0}: Error finding container a0bed88068835f378dbb5bc5aa811e7932051aeac42b377066f0f4f4b332aad6: Status 404 returned error can't find the container with id a0bed88068835f378dbb5bc5aa811e7932051aeac42b377066f0f4f4b332aad6 Apr 23 13:34:36.479391 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.479363 2569 patch_prober.go:28] interesting pod/image-registry-5484bb4458-jcmv9 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 13:34:36.479500 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.479414 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" podUID="71703531-7aaa-494c-9ad6-e9b94573df76" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:34:36.871157 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.871098 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dqcwj" event={"ID":"dc7a9b0c-42a9-4562-a03a-27dca913446a","Type":"ContainerStarted","Data":"1c434c6221f64f93183f998f133e241cc6a76aebc30044a2cc0eeb31997a180c"} Apr 23 13:34:36.872391 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:36.872358 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xmqx4" event={"ID":"c6aa409f-f25a-46d2-83bc-229d8993033a","Type":"ContainerStarted","Data":"a0bed88068835f378dbb5bc5aa811e7932051aeac42b377066f0f4f4b332aad6"} Apr 23 13:34:37.416506 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:37.416470 2569 scope.go:117] "RemoveContainer" containerID="c6991cf8f0e8c0f389b876b28ae71b59a5229383a838f9f77d6106935e5ece8e" Apr 23 13:34:37.878124 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:37.878084 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dqcwj" event={"ID":"dc7a9b0c-42a9-4562-a03a-27dca913446a","Type":"ContainerStarted","Data":"9433ea605ef1630e529081c5ef2d104b5af8b3fbcbc44130d5b8bdef448528ee"} Apr 23 13:34:37.878124 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:37.878130 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dqcwj" event={"ID":"dc7a9b0c-42a9-4562-a03a-27dca913446a","Type":"ContainerStarted","Data":"1750882aa234a870713b5de343b037eb856287e0f70087743f1eba0679c8b465"} Apr 23 13:34:37.879849 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:37.879823 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/2.log" Apr 23 13:34:37.879980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:37.879915 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" 
event={"ID":"2af8933e-b7d0-4a15-a43e-c2a76d750555","Type":"ContainerStarted","Data":"b654f1e06572daa838d3097e71d2868bb1cfbeb6311dd22d20845a9810102491"} Apr 23 13:34:37.880218 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:37.880199 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" Apr 23 13:34:37.881600 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:37.881572 2569 generic.go:358] "Generic (PLEG): container finished" podID="c6aa409f-f25a-46d2-83bc-229d8993033a" containerID="6caf5e2459082a0e7a8787ab50d9c59111335191201d5d238507911105d2eabe" exitCode=0 Apr 23 13:34:37.881711 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:37.881657 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xmqx4" event={"ID":"c6aa409f-f25a-46d2-83bc-229d8993033a","Type":"ContainerDied","Data":"6caf5e2459082a0e7a8787ab50d9c59111335191201d5d238507911105d2eabe"} Apr 23 13:34:37.892967 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:37.892920 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dqcwj" podStartSLOduration=130.783939035 podStartE2EDuration="2m11.892907422s" podCreationTimestamp="2026-04-23 13:32:26 +0000 UTC" firstStartedPulling="2026-04-23 13:34:36.373132437 +0000 UTC m=+130.528054887" lastFinishedPulling="2026-04-23 13:34:37.482100821 +0000 UTC m=+131.637023274" observedRunningTime="2026-04-23 13:34:37.891752341 +0000 UTC m=+132.046674811" watchObservedRunningTime="2026-04-23 13:34:37.892907422 +0000 UTC m=+132.047829915" Apr 23 13:34:37.949628 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:37.949579 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" podStartSLOduration=41.250822575 podStartE2EDuration="44.949560434s" podCreationTimestamp="2026-04-23 13:33:53 +0000 UTC" firstStartedPulling="2026-04-23 13:33:53.729668721 +0000 UTC m=+87.884591170" lastFinishedPulling="2026-04-23 13:33:57.428406583 +0000 UTC m=+91.583329029" observedRunningTime="2026-04-23 13:34:37.947299913 +0000 UTC m=+132.102222383" watchObservedRunningTime="2026-04-23 13:34:37.949560434 +0000 UTC m=+132.104482904" Apr 23 13:34:38.625663 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:38.625632 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-4666z" Apr 23 13:34:38.887458 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:38.887365 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xmqx4" event={"ID":"c6aa409f-f25a-46d2-83bc-229d8993033a","Type":"ContainerStarted","Data":"ee80cf19bed147aff17ff8b5ea78439e3a6cf8e6f05007805f481e0ff4b674b0"} Apr 23 13:34:38.887458 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:38.887413 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xmqx4" event={"ID":"c6aa409f-f25a-46d2-83bc-229d8993033a","Type":"ContainerStarted","Data":"2cdf98ccd81b5767e21f690d769830bef79fbdc5d069261be71e07a991cc25e8"} Apr 23 13:34:38.906449 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:38.906395 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xmqx4" podStartSLOduration=1.8991662009999999 podStartE2EDuration="2.906378425s" podCreationTimestamp="2026-04-23 13:34:36 +0000 UTC" 
firstStartedPulling="2026-04-23 13:34:36.467299332 +0000 UTC m=+130.622221778" lastFinishedPulling="2026-04-23 13:34:37.474511539 +0000 UTC m=+131.629434002" observedRunningTime="2026-04-23 13:34:38.905132176 +0000 UTC m=+133.060054644" watchObservedRunningTime="2026-04-23 13:34:38.906378425 +0000 UTC m=+133.061300894" Apr 23 13:34:40.846547 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:40.846511 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg"] Apr 23 13:34:40.849517 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:40.849501 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg" Apr 23 13:34:40.852049 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:40.852016 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 13:34:40.852187 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:40.852044 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-fg8b8\"" Apr 23 13:34:40.861190 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:40.861166 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg"] Apr 23 13:34:41.039679 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:41.039616 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0ad50b34-0c35-490e-a0b5-ad552f3b4cd5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jfppg\" (UID: \"0ad50b34-0c35-490e-a0b5-ad552f3b4cd5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg" Apr 23 13:34:41.140880 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:41.140794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0ad50b34-0c35-490e-a0b5-ad552f3b4cd5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jfppg\" (UID: \"0ad50b34-0c35-490e-a0b5-ad552f3b4cd5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg" Apr 23 13:34:41.143450 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:41.143426 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0ad50b34-0c35-490e-a0b5-ad552f3b4cd5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jfppg\" (UID: \"0ad50b34-0c35-490e-a0b5-ad552f3b4cd5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg" Apr 23 13:34:41.159422 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:41.159385 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg" Apr 23 13:34:41.293266 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:41.293230 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg"] Apr 23 13:34:41.297778 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:34:41.297731 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ad50b34_0c35_490e_a0b5_ad552f3b4cd5.slice/crio-d1672bced9a8791e270b6c0fe274bd4a0b4d869840fca0ea22105641c1c07836 WatchSource:0}: Error finding container d1672bced9a8791e270b6c0fe274bd4a0b4d869840fca0ea22105641c1c07836: Status 404 returned error can't find the container with id d1672bced9a8791e270b6c0fe274bd4a0b4d869840fca0ea22105641c1c07836 Apr 23 13:34:41.897991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:41.897946 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg" event={"ID":"0ad50b34-0c35-490e-a0b5-ad552f3b4cd5","Type":"ContainerStarted","Data":"d1672bced9a8791e270b6c0fe274bd4a0b4d869840fca0ea22105641c1c07836"} Apr 23 13:34:42.901826 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:42.901772 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg" event={"ID":"0ad50b34-0c35-490e-a0b5-ad552f3b4cd5","Type":"ContainerStarted","Data":"601d47effce5bda2d62f663ad09efd8bd2bec622911d465d5b87027e0d3e53a2"} Apr 23 13:34:42.902234 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:42.902032 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg" Apr 23 13:34:42.906800 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:42.906776 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg" Apr 23 13:34:42.918030 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:42.917984 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jfppg" podStartSLOduration=1.460201109 podStartE2EDuration="2.917968902s" podCreationTimestamp="2026-04-23 13:34:40 +0000 UTC" firstStartedPulling="2026-04-23 13:34:41.299594238 +0000 UTC m=+135.454516683" lastFinishedPulling="2026-04-23 13:34:42.757362008 +0000 UTC m=+136.912284476" observedRunningTime="2026-04-23 13:34:42.917335286 +0000 UTC m=+137.072257756" watchObservedRunningTime="2026-04-23 13:34:42.917968902 +0000 UTC m=+137.072891430" Apr 23 13:34:46.479560 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:46.479521 2569 patch_prober.go:28] interesting pod/image-registry-5484bb4458-jcmv9 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 13:34:46.479935 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:46.479575 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" podUID="71703531-7aaa-494c-9ad6-e9b94573df76" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:34:51.500713 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.500666 2569 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" podUID="71703531-7aaa-494c-9ad6-e9b94573df76" containerName="registry" containerID="cri-o://72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847" gracePeriod=30 Apr 23 13:34:51.751489 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.751411 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:34:51.834775 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.834730 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/71703531-7aaa-494c-9ad6-e9b94573df76-installation-pull-secrets\") pod \"71703531-7aaa-494c-9ad6-e9b94573df76\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " Apr 23 13:34:51.834775 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.834774 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/71703531-7aaa-494c-9ad6-e9b94573df76-registry-certificates\") pod \"71703531-7aaa-494c-9ad6-e9b94573df76\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " Apr 23 13:34:51.835014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.834795 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls\") pod \"71703531-7aaa-494c-9ad6-e9b94573df76\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " Apr 23 13:34:51.835014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.834826 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svbhs\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-kube-api-access-svbhs\") pod \"71703531-7aaa-494c-9ad6-e9b94573df76\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " Apr 23 13:34:51.835014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.834854 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/71703531-7aaa-494c-9ad6-e9b94573df76-image-registry-private-configuration\") pod \"71703531-7aaa-494c-9ad6-e9b94573df76\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " Apr 23 13:34:51.835014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.834882 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71703531-7aaa-494c-9ad6-e9b94573df76-trusted-ca\") pod \"71703531-7aaa-494c-9ad6-e9b94573df76\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " Apr 23 13:34:51.835014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.834907 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-bound-sa-token\") pod \"71703531-7aaa-494c-9ad6-e9b94573df76\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " Apr 23 13:34:51.835014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.834956 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/71703531-7aaa-494c-9ad6-e9b94573df76-ca-trust-extracted\") pod \"71703531-7aaa-494c-9ad6-e9b94573df76\" (UID: \"71703531-7aaa-494c-9ad6-e9b94573df76\") " 
Apr 23 13:34:51.835320 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.835244 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71703531-7aaa-494c-9ad6-e9b94573df76-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "71703531-7aaa-494c-9ad6-e9b94573df76" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:34:51.835591 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.835552 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71703531-7aaa-494c-9ad6-e9b94573df76-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "71703531-7aaa-494c-9ad6-e9b94573df76" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:34:51.837631 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.837598 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71703531-7aaa-494c-9ad6-e9b94573df76-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "71703531-7aaa-494c-9ad6-e9b94573df76" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:34:51.837743 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.837655 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "71703531-7aaa-494c-9ad6-e9b94573df76" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:34:51.837823 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.837798 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-kube-api-access-svbhs" (OuterVolumeSpecName: "kube-api-access-svbhs") pod "71703531-7aaa-494c-9ad6-e9b94573df76" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76"). InnerVolumeSpecName "kube-api-access-svbhs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:34:51.837823 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.837806 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71703531-7aaa-494c-9ad6-e9b94573df76-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "71703531-7aaa-494c-9ad6-e9b94573df76" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:34:51.837895 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.837818 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "71703531-7aaa-494c-9ad6-e9b94573df76" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:34:51.846826 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.846801 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71703531-7aaa-494c-9ad6-e9b94573df76-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "71703531-7aaa-494c-9ad6-e9b94573df76" (UID: "71703531-7aaa-494c-9ad6-e9b94573df76"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:34:51.927968 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.927932 2569 generic.go:358] "Generic (PLEG): container finished" podID="71703531-7aaa-494c-9ad6-e9b94573df76" containerID="72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847" exitCode=0 Apr 23 13:34:51.928166 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.927996 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" Apr 23 13:34:51.928166 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.928021 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" event={"ID":"71703531-7aaa-494c-9ad6-e9b94573df76","Type":"ContainerDied","Data":"72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847"} Apr 23 13:34:51.928166 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.928077 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5484bb4458-jcmv9" event={"ID":"71703531-7aaa-494c-9ad6-e9b94573df76","Type":"ContainerDied","Data":"2b5c38ad8f4318db54decc7daa679427755a12976fa6d7dc0bf2b2ba06a6c699"} Apr 23 13:34:51.928166 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.928096 2569 scope.go:117] "RemoveContainer" containerID="72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847" Apr 23 13:34:51.935900 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.935868 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/71703531-7aaa-494c-9ad6-e9b94573df76-installation-pull-secrets\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:34:51.936014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.935911 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/71703531-7aaa-494c-9ad6-e9b94573df76-registry-certificates\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:34:51.936014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.935928 2569 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-registry-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:34:51.936014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.935943 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-svbhs\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-kube-api-access-svbhs\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:34:51.936014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.935959 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/71703531-7aaa-494c-9ad6-e9b94573df76-image-registry-private-configuration\") on node \"ip-10-0-136-158.ec2.internal\" 
DevicePath \"\"" Apr 23 13:34:51.936014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.935973 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71703531-7aaa-494c-9ad6-e9b94573df76-trusted-ca\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:34:51.936014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.935987 2569 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71703531-7aaa-494c-9ad6-e9b94573df76-bound-sa-token\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:34:51.936014 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.936001 2569 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/71703531-7aaa-494c-9ad6-e9b94573df76-ca-trust-extracted\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:34:51.937000 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.936982 2569 scope.go:117] "RemoveContainer" containerID="72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847" Apr 23 13:34:51.937340 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:34:51.937317 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847\": container with ID starting with 72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847 not found: ID does not exist" containerID="72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847" Apr 23 13:34:51.937400 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.937349 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847"} err="failed to get container status \"72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847\": rpc error: code = NotFound desc = could not find container \"72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847\": container with ID starting with 72b54e121eaa0ac47f742cee6c652baf9d716213bb6da074c0a9e4de6deea847 not found: ID does not exist" Apr 23 13:34:51.950160 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.950124 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5484bb4458-jcmv9"] Apr 23 13:34:51.955085 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:51.955042 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5484bb4458-jcmv9"] Apr 23 13:34:52.420925 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:52.420887 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71703531-7aaa-494c-9ad6-e9b94573df76" path="/var/lib/kubelet/pods/71703531-7aaa-494c-9ad6-e9b94573df76/volumes" Apr 23 13:34:55.904769 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.904720 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-86569766fc-dqn74"] Apr 23 13:34:55.905200 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.905138 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71703531-7aaa-494c-9ad6-e9b94573df76" containerName="registry" Apr 23 13:34:55.905200 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.905167 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="71703531-7aaa-494c-9ad6-e9b94573df76" containerName="registry" Apr 23 
13:34:55.905270 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.905222 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="71703531-7aaa-494c-9ad6-e9b94573df76" containerName="registry" Apr 23 13:34:55.910315 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.910294 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:55.912765 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.912741 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 13:34:55.913773 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.913734 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 13:34:55.913773 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.913756 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vhfkt\"" Apr 23 13:34:55.913931 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.913775 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 13:34:55.913931 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.913803 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 13:34:55.913931 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.913875 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 13:34:55.914039 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.914025 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 13:34:55.914168 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.914154 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 13:34:55.918140 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.918096 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 13:34:55.920268 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.920248 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86569766fc-dqn74"] Apr 23 13:34:55.971668 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.971623 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-oauth-config\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:55.971870 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.971751 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-serving-cert\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:55.971870 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.971783 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-trusted-ca-bundle\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:55.971870 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.971820 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-service-ca\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:55.971870 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.971854 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbxdw\" (UniqueName: \"kubernetes.io/projected/0deeaebe-d2f4-4804-8a26-9829c0d70462-kube-api-access-sbxdw\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:55.972046 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.971890 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-config\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:55.972046 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:55.971972 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-oauth-serving-cert\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.073404 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.073368 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbxdw\" (UniqueName: \"kubernetes.io/projected/0deeaebe-d2f4-4804-8a26-9829c0d70462-kube-api-access-sbxdw\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.073404 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.073409 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-config\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.073646 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.073622 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-oauth-serving-cert\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.073685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.073662 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-oauth-config\") pod \"console-86569766fc-dqn74\" (UID: 
\"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.073776 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.073760 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-serving-cert\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.073828 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.073800 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-trusted-ca-bundle\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.073875 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.073829 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-service-ca\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.074101 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.074078 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-config\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.074443 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.074348 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-oauth-serving-cert\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.074443 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.074439 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-service-ca\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.074712 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.074547 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-trusted-ca-bundle\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.076202 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.076185 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-oauth-config\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.076343 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.076326 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-serving-cert\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.083084 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.083040 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbxdw\" (UniqueName: \"kubernetes.io/projected/0deeaebe-d2f4-4804-8a26-9829c0d70462-kube-api-access-sbxdw\") pod \"console-86569766fc-dqn74\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") " pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.219420 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.219323 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:34:56.346422 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.346395 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86569766fc-dqn74"] Apr 23 13:34:56.349260 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:34:56.349182 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0deeaebe_d2f4_4804_8a26_9829c0d70462.slice/crio-c915744d265662fa55d9aebffc8e64d8e92da8b7fc135925eeb663b504d22b53 WatchSource:0}: Error finding container c915744d265662fa55d9aebffc8e64d8e92da8b7fc135925eeb663b504d22b53: Status 404 returned error can't find the container with id c915744d265662fa55d9aebffc8e64d8e92da8b7fc135925eeb663b504d22b53 Apr 23 13:34:56.949897 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:56.949858 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86569766fc-dqn74" event={"ID":"0deeaebe-d2f4-4804-8a26-9829c0d70462","Type":"ContainerStarted","Data":"c915744d265662fa55d9aebffc8e64d8e92da8b7fc135925eeb663b504d22b53"} Apr 23 13:34:58.957888 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:58.957850 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86569766fc-dqn74" event={"ID":"0deeaebe-d2f4-4804-8a26-9829c0d70462","Type":"ContainerStarted","Data":"39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431"} Apr 23 13:34:58.978143 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:34:58.978085 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86569766fc-dqn74" podStartSLOduration=1.473530381 podStartE2EDuration="3.978070027s" podCreationTimestamp="2026-04-23 13:34:55 +0000 UTC" firstStartedPulling="2026-04-23 13:34:56.352674828 +0000 UTC m=+150.507597276" lastFinishedPulling="2026-04-23 13:34:58.857214473 +0000 UTC m=+153.012136922" observedRunningTime="2026-04-23 13:34:58.975438075 +0000 UTC m=+153.130360545" watchObservedRunningTime="2026-04-23 13:34:58.978070027 +0000 UTC m=+153.132992485" Apr 23 13:35:02.782642 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:35:02.782577 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-h8fxx" podUID="33d8f26a-427d-4263-9b87-13337ac3a834" Apr 23 13:35:02.799963 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:35:02.799915 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-ggnhj" 
podUID="bb994adb-00a6-4601-83e9-80e43ab53049" Apr 23 13:35:02.967667 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:02.967630 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h8fxx" Apr 23 13:35:03.971722 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:03.971688 2569 generic.go:358] "Generic (PLEG): container finished" podID="5cb70f7a-9760-40d9-b08b-b8115fb6bdf2" containerID="ca3dba1e0f14c749ccb8fc852d1e73dcd62e902c2a57eeebaf8fb77fc573d79e" exitCode=0 Apr 23 13:35:03.972119 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:03.971770 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" event={"ID":"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2","Type":"ContainerDied","Data":"ca3dba1e0f14c749ccb8fc852d1e73dcd62e902c2a57eeebaf8fb77fc573d79e"} Apr 23 13:35:03.972163 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:03.972127 2569 scope.go:117] "RemoveContainer" containerID="ca3dba1e0f14c749ccb8fc852d1e73dcd62e902c2a57eeebaf8fb77fc573d79e" Apr 23 13:35:04.976031 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:04.975997 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-d2zwq" event={"ID":"5cb70f7a-9760-40d9-b08b-b8115fb6bdf2","Type":"ContainerStarted","Data":"0a102dd7f6a3edb9d709fb7975374e5721a3d48141970f920c4db3e83f71f833"} Apr 23 13:35:06.220001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:06.219962 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:35:06.220001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:06.220008 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:35:06.224897 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:06.224871 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:35:06.986321 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:06.986284 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86569766fc-dqn74" Apr 23 13:35:07.682383 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:07.682332 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:35:07.684925 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:07.684885 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33d8f26a-427d-4263-9b87-13337ac3a834-metrics-tls\") pod \"dns-default-h8fxx\" (UID: \"33d8f26a-427d-4263-9b87-13337ac3a834\") " pod="openshift-dns/dns-default-h8fxx" Apr 23 13:35:07.770873 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:07.770836 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vtx8h\"" Apr 23 13:35:07.779214 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:07.779166 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-h8fxx" Apr 23 13:35:07.783404 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:07.783363 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj" Apr 23 13:35:07.786165 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:07.786132 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb994adb-00a6-4601-83e9-80e43ab53049-cert\") pod \"ingress-canary-ggnhj\" (UID: \"bb994adb-00a6-4601-83e9-80e43ab53049\") " pod="openshift-ingress-canary/ingress-canary-ggnhj" Apr 23 13:35:07.914386 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:07.914333 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h8fxx"] Apr 23 13:35:07.918227 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:35:07.918177 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33d8f26a_427d_4263_9b87_13337ac3a834.slice/crio-11501c0caa643292eb026a9123f59f43d6712ebc4157bef5ef399e35c5129121 WatchSource:0}: Error finding container 11501c0caa643292eb026a9123f59f43d6712ebc4157bef5ef399e35c5129121: Status 404 returned error can't find the container with id 11501c0caa643292eb026a9123f59f43d6712ebc4157bef5ef399e35c5129121 Apr 23 13:35:07.987877 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:07.987778 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h8fxx" event={"ID":"33d8f26a-427d-4263-9b87-13337ac3a834","Type":"ContainerStarted","Data":"11501c0caa643292eb026a9123f59f43d6712ebc4157bef5ef399e35c5129121"} Apr 23 13:35:09.994742 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:09.994699 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h8fxx" event={"ID":"33d8f26a-427d-4263-9b87-13337ac3a834","Type":"ContainerStarted","Data":"5c5e6c1c4e98b1ca9328d2038e25bccc7ae4bc37ffa555453eaa70db0e3bab3f"} Apr 23 13:35:09.994742 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:09.994740 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h8fxx" event={"ID":"33d8f26a-427d-4263-9b87-13337ac3a834","Type":"ContainerStarted","Data":"68cd20f17684830317ae67956450f25a97ca7d487d42a6943090d99fef4103bc"} Apr 23 13:35:09.995195 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:09.994840 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-h8fxx" Apr 23 13:35:10.014809 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:10.014753 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-h8fxx" podStartSLOduration=129.698424332 podStartE2EDuration="2m11.014735846s" podCreationTimestamp="2026-04-23 13:32:59 +0000 UTC" firstStartedPulling="2026-04-23 13:35:07.921458371 +0000 UTC m=+162.076380817" lastFinishedPulling="2026-04-23 13:35:09.237769882 +0000 UTC m=+163.392692331" observedRunningTime="2026-04-23 13:35:10.0126188 +0000 UTC m=+164.167541261" watchObservedRunningTime="2026-04-23 13:35:10.014735846 +0000 UTC m=+164.169658367" Apr 23 13:35:14.012497 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:14.012463 2569 generic.go:358] "Generic (PLEG): container finished" 
podID="c4f65a5c-dbc0-4b33-825f-41c16ff92077" containerID="e0e90cb7d24fba978449d54ff4d577849e2e270f88acf2d3b0bb08b1d3eabdbd" exitCode=0 Apr 23 13:35:14.012964 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:14.012529 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5kw89" event={"ID":"c4f65a5c-dbc0-4b33-825f-41c16ff92077","Type":"ContainerDied","Data":"e0e90cb7d24fba978449d54ff4d577849e2e270f88acf2d3b0bb08b1d3eabdbd"} Apr 23 13:35:14.013053 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:14.012970 2569 scope.go:117] "RemoveContainer" containerID="e0e90cb7d24fba978449d54ff4d577849e2e270f88acf2d3b0bb08b1d3eabdbd" Apr 23 13:35:14.988782 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:14.988750 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-c228x_dd1d91d8-09b0-43ef-9971-5c19edba64a2/cluster-samples-operator/0.log" Apr 23 13:35:14.999984 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:14.999945 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-c228x_dd1d91d8-09b0-43ef-9971-5c19edba64a2/cluster-samples-operator-watch/0.log" Apr 23 13:35:15.018907 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:15.018828 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5kw89" event={"ID":"c4f65a5c-dbc0-4b33-825f-41c16ff92077","Type":"ContainerStarted","Data":"d0ffcb83f477bd8bf1540087ac3ab75a32df43222f05113f8976383d8f4f04fb"} Apr 23 13:35:16.419454 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:16.419406 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ggnhj" Apr 23 13:35:16.422371 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:16.422344 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-8bjdp\"" Apr 23 13:35:16.430580 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:16.430542 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ggnhj" Apr 23 13:35:16.568485 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:16.568446 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ggnhj"] Apr 23 13:35:16.571820 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:35:16.571784 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb994adb_00a6_4601_83e9_80e43ab53049.slice/crio-e0dffa25be9f6046a718c62203628da9371c3196496e4c1b95588edfc67bcc0e WatchSource:0}: Error finding container e0dffa25be9f6046a718c62203628da9371c3196496e4c1b95588edfc67bcc0e: Status 404 returned error can't find the container with id e0dffa25be9f6046a718c62203628da9371c3196496e4c1b95588edfc67bcc0e Apr 23 13:35:17.027668 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:17.027567 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ggnhj" event={"ID":"bb994adb-00a6-4601-83e9-80e43ab53049","Type":"ContainerStarted","Data":"e0dffa25be9f6046a718c62203628da9371c3196496e4c1b95588edfc67bcc0e"} Apr 23 13:35:19.036620 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:19.036576 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ggnhj" event={"ID":"bb994adb-00a6-4601-83e9-80e43ab53049","Type":"ContainerStarted","Data":"4534e73c006dcb1657f40294cb7f3c00d9b2570a42cc53035c9f12897ffd6bf8"} Apr 23 13:35:19.058457 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:19.058401 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ggnhj" podStartSLOduration=138.610530402 podStartE2EDuration="2m20.058383192s" podCreationTimestamp="2026-04-23 13:32:59 +0000 UTC" firstStartedPulling="2026-04-23 13:35:16.573812604 +0000 UTC m=+170.728735051" lastFinishedPulling="2026-04-23 13:35:18.021665392 +0000 UTC m=+172.176587841" observedRunningTime="2026-04-23 13:35:19.056612495 +0000 UTC m=+173.211534989" watchObservedRunningTime="2026-04-23 13:35:19.058383192 +0000 UTC m=+173.213305672" Apr 23 13:35:20.001312 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:20.001276 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-h8fxx" Apr 23 13:35:23.050146 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:23.050108 2569 generic.go:358] "Generic (PLEG): container finished" podID="83f8f286-0a1a-4047-8e3d-83c4b68f2209" containerID="1e2e364793d9472f7c8968170365202c683d568fd9c17db0485913ce215a1bb4" exitCode=0 Apr 23 13:35:23.050146 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:23.050148 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc" event={"ID":"83f8f286-0a1a-4047-8e3d-83c4b68f2209","Type":"ContainerDied","Data":"1e2e364793d9472f7c8968170365202c683d568fd9c17db0485913ce215a1bb4"} Apr 23 13:35:23.050564 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:23.050467 2569 scope.go:117] "RemoveContainer" containerID="1e2e364793d9472f7c8968170365202c683d568fd9c17db0485913ce215a1bb4" Apr 23 13:35:24.055348 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:35:24.055311 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xq8kc" 
event={"ID":"83f8f286-0a1a-4047-8e3d-83c4b68f2209","Type":"ContainerStarted","Data":"ccb58dd8416ec014309ee46bf40916884f473d2a0927fccade66a809a950b6c6"} Apr 23 13:36:34.357443 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.357403 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-877c55f44-k98r5"] Apr 23 13:36:34.359884 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.359858 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.373163 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.373122 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-877c55f44-k98r5"] Apr 23 13:36:34.455944 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.455894 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-oauth-serving-cert\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.455944 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.455943 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtvqg\" (UniqueName: \"kubernetes.io/projected/04477dbe-0933-485b-8e16-f40aad39322f-kube-api-access-gtvqg\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.456229 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.456021 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04477dbe-0933-485b-8e16-f40aad39322f-console-serving-cert\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.456229 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.456049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-trusted-ca-bundle\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.456229 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.456106 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-service-ca\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.456229 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.456125 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04477dbe-0933-485b-8e16-f40aad39322f-console-oauth-config\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.456229 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.456190 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-console-config\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.557202 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.557156 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-trusted-ca-bundle\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.557202 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.557208 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-service-ca\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.557469 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.557229 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04477dbe-0933-485b-8e16-f40aad39322f-console-oauth-config\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.557469 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.557268 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-console-config\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.557469 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.557296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-oauth-serving-cert\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.557469 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.557328 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtvqg\" (UniqueName: \"kubernetes.io/projected/04477dbe-0933-485b-8e16-f40aad39322f-kube-api-access-gtvqg\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.557469 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.557376 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04477dbe-0933-485b-8e16-f40aad39322f-console-serving-cert\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.558018 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.557982 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-service-ca\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.558175 ip-10-0-136-158 kubenswrapper[2569]: I0423 
13:36:34.558038 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-oauth-serving-cert\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.558248 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.558171 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-trusted-ca-bundle\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.558580 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.558553 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-console-config\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.560008 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.559989 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04477dbe-0933-485b-8e16-f40aad39322f-console-oauth-config\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.560228 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.560208 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04477dbe-0933-485b-8e16-f40aad39322f-console-serving-cert\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.565942 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.565900 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtvqg\" (UniqueName: \"kubernetes.io/projected/04477dbe-0933-485b-8e16-f40aad39322f-kube-api-access-gtvqg\") pod \"console-877c55f44-k98r5\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.672194 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.672080 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:34.809879 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:34.809843 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-877c55f44-k98r5"] Apr 23 13:36:34.813567 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:36:34.813532 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04477dbe_0933_485b_8e16_f40aad39322f.slice/crio-b9658c0988fa620ae5b292255e6d17ac3678c4b980a12b701692d4637ff2e3ae WatchSource:0}: Error finding container b9658c0988fa620ae5b292255e6d17ac3678c4b980a12b701692d4637ff2e3ae: Status 404 returned error can't find the container with id b9658c0988fa620ae5b292255e6d17ac3678c4b980a12b701692d4637ff2e3ae Apr 23 13:36:35.263733 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.263688 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-877c55f44-k98r5" event={"ID":"04477dbe-0933-485b-8e16-f40aad39322f","Type":"ContainerStarted","Data":"3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201"} Apr 23 13:36:35.263733 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.263740 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-877c55f44-k98r5" event={"ID":"04477dbe-0933-485b-8e16-f40aad39322f","Type":"ContainerStarted","Data":"b9658c0988fa620ae5b292255e6d17ac3678c4b980a12b701692d4637ff2e3ae"} Apr 23 13:36:35.283403 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.283337 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-877c55f44-k98r5" podStartSLOduration=1.28331787 podStartE2EDuration="1.28331787s" podCreationTimestamp="2026-04-23 13:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:36:35.281943653 +0000 UTC m=+249.436866115" watchObservedRunningTime="2026-04-23 13:36:35.28331787 +0000 UTC m=+249.438240338" Apr 23 13:36:35.805592 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.805558 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-kmjjb"] Apr 23 13:36:35.807607 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.807590 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kmjjb" Apr 23 13:36:35.809957 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.809935 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 13:36:35.815204 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.815180 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kmjjb"] Apr 23 13:36:35.868358 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.868307 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f11d0899-4c12-462d-a8c1-ea18032668a9-kubelet-config\") pod \"global-pull-secret-syncer-kmjjb\" (UID: \"f11d0899-4c12-462d-a8c1-ea18032668a9\") " pod="kube-system/global-pull-secret-syncer-kmjjb" Apr 23 13:36:35.868358 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.868365 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f11d0899-4c12-462d-a8c1-ea18032668a9-dbus\") pod \"global-pull-secret-syncer-kmjjb\" (UID: \"f11d0899-4c12-462d-a8c1-ea18032668a9\") " pod="kube-system/global-pull-secret-syncer-kmjjb" Apr 23 13:36:35.868585 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.868428 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f11d0899-4c12-462d-a8c1-ea18032668a9-original-pull-secret\") pod \"global-pull-secret-syncer-kmjjb\" (UID: \"f11d0899-4c12-462d-a8c1-ea18032668a9\") " pod="kube-system/global-pull-secret-syncer-kmjjb" Apr 23 13:36:35.969347 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.969308 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f11d0899-4c12-462d-a8c1-ea18032668a9-dbus\") pod \"global-pull-secret-syncer-kmjjb\" (UID: \"f11d0899-4c12-462d-a8c1-ea18032668a9\") " pod="kube-system/global-pull-secret-syncer-kmjjb" Apr 23 13:36:35.969347 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.969356 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f11d0899-4c12-462d-a8c1-ea18032668a9-original-pull-secret\") pod \"global-pull-secret-syncer-kmjjb\" (UID: \"f11d0899-4c12-462d-a8c1-ea18032668a9\") " pod="kube-system/global-pull-secret-syncer-kmjjb" Apr 23 13:36:35.969559 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.969411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f11d0899-4c12-462d-a8c1-ea18032668a9-kubelet-config\") pod \"global-pull-secret-syncer-kmjjb\" (UID: \"f11d0899-4c12-462d-a8c1-ea18032668a9\") " pod="kube-system/global-pull-secret-syncer-kmjjb" Apr 23 13:36:35.969559 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.969483 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f11d0899-4c12-462d-a8c1-ea18032668a9-kubelet-config\") pod \"global-pull-secret-syncer-kmjjb\" (UID: \"f11d0899-4c12-462d-a8c1-ea18032668a9\") " pod="kube-system/global-pull-secret-syncer-kmjjb" Apr 23 13:36:35.969559 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.969520 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f11d0899-4c12-462d-a8c1-ea18032668a9-dbus\") pod \"global-pull-secret-syncer-kmjjb\" (UID: \"f11d0899-4c12-462d-a8c1-ea18032668a9\") " pod="kube-system/global-pull-secret-syncer-kmjjb" Apr 23 13:36:35.971952 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:35.971930 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f11d0899-4c12-462d-a8c1-ea18032668a9-original-pull-secret\") pod \"global-pull-secret-syncer-kmjjb\" (UID: \"f11d0899-4c12-462d-a8c1-ea18032668a9\") " pod="kube-system/global-pull-secret-syncer-kmjjb" Apr 23 13:36:36.118472 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:36.118357 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kmjjb" Apr 23 13:36:36.248337 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:36.248300 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kmjjb"] Apr 23 13:36:36.251439 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:36:36.251412 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf11d0899_4c12_462d_a8c1_ea18032668a9.slice/crio-0b69e8572f58c87f7406dac87505d77f2ce10795ddb03e13fab2117c49b32287 WatchSource:0}: Error finding container 0b69e8572f58c87f7406dac87505d77f2ce10795ddb03e13fab2117c49b32287: Status 404 returned error can't find the container with id 0b69e8572f58c87f7406dac87505d77f2ce10795ddb03e13fab2117c49b32287 Apr 23 13:36:36.267991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:36.267956 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kmjjb" event={"ID":"f11d0899-4c12-462d-a8c1-ea18032668a9","Type":"ContainerStarted","Data":"0b69e8572f58c87f7406dac87505d77f2ce10795ddb03e13fab2117c49b32287"} Apr 23 13:36:41.285317 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:41.285280 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kmjjb" event={"ID":"f11d0899-4c12-462d-a8c1-ea18032668a9","Type":"ContainerStarted","Data":"75d4c42beced0eed6d9b1bff6a49544fe85f012394d65aa7c4161aa930017784"} Apr 23 13:36:41.300730 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:41.300676 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kmjjb" podStartSLOduration=2.032126153 podStartE2EDuration="6.300661881s" podCreationTimestamp="2026-04-23 13:36:35 +0000 UTC" firstStartedPulling="2026-04-23 13:36:36.253260911 +0000 UTC m=+250.408183357" lastFinishedPulling="2026-04-23 13:36:40.521796635 +0000 UTC m=+254.676719085" observedRunningTime="2026-04-23 13:36:41.298986673 +0000 UTC m=+255.453909140" watchObservedRunningTime="2026-04-23 13:36:41.300661881 +0000 UTC m=+255.455584360" Apr 23 13:36:44.672981 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:44.672948 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:44.673427 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:44.673028 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:36:44.677743 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:44.677721 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-877c55f44-k98r5" 
Apr 23 13:36:45.301817 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:45.301787 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-877c55f44-k98r5"
Apr 23 13:36:45.349376 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:45.349332 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86569766fc-dqn74"]
Apr 23 13:36:54.523119 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.523081 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"]
Apr 23 13:36:54.526463 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.526445 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"
Apr 23 13:36:54.528892 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.528873 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 13:36:54.529778 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.529764 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 23 13:36:54.529846 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.529775 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mctsq\""
Apr 23 13:36:54.534615 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.534587 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"]
Apr 23 13:36:54.628699 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.628657 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/60aa48c5-9702-42b6-b694-919bedc308a4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg\" (UID: \"60aa48c5-9702-42b6-b694-919bedc308a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"
Apr 23 13:36:54.628887 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.628713 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/60aa48c5-9702-42b6-b694-919bedc308a4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg\" (UID: \"60aa48c5-9702-42b6-b694-919bedc308a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"
Apr 23 13:36:54.628887 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.628761 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2bmv\" (UniqueName: \"kubernetes.io/projected/60aa48c5-9702-42b6-b694-919bedc308a4-kube-api-access-s2bmv\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg\" (UID: \"60aa48c5-9702-42b6-b694-919bedc308a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"
Apr 23 13:36:54.729744 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.729704 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/60aa48c5-9702-42b6-b694-919bedc308a4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg\" (UID: \"60aa48c5-9702-42b6-b694-919bedc308a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"
Apr 23 13:36:54.729744 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.729748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2bmv\" (UniqueName: \"kubernetes.io/projected/60aa48c5-9702-42b6-b694-919bedc308a4-kube-api-access-s2bmv\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg\" (UID: \"60aa48c5-9702-42b6-b694-919bedc308a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"
Apr 23 13:36:54.729940 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.729799 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/60aa48c5-9702-42b6-b694-919bedc308a4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg\" (UID: \"60aa48c5-9702-42b6-b694-919bedc308a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"
Apr 23 13:36:54.730180 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.730165 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/60aa48c5-9702-42b6-b694-919bedc308a4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg\" (UID: \"60aa48c5-9702-42b6-b694-919bedc308a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"
Apr 23 13:36:54.730230 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.730164 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/60aa48c5-9702-42b6-b694-919bedc308a4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg\" (UID: \"60aa48c5-9702-42b6-b694-919bedc308a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"
Apr 23 13:36:54.738564 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.738537 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2bmv\" (UniqueName: \"kubernetes.io/projected/60aa48c5-9702-42b6-b694-919bedc308a4-kube-api-access-s2bmv\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg\" (UID: \"60aa48c5-9702-42b6-b694-919bedc308a4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"
Apr 23 13:36:54.836852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.836763 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"
Apr 23 13:36:54.960648 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:54.960623 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg"]
Apr 23 13:36:54.962872 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:36:54.962839 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60aa48c5_9702_42b6_b694_919bedc308a4.slice/crio-2249ce5b86cea04fb587a0a34813e6a5894b63959391ee928cc370f255a0e903 WatchSource:0}: Error finding container 2249ce5b86cea04fb587a0a34813e6a5894b63959391ee928cc370f255a0e903: Status 404 returned error can't find the container with id 2249ce5b86cea04fb587a0a34813e6a5894b63959391ee928cc370f255a0e903
Apr 23 13:36:55.326639 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:36:55.326603 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg" event={"ID":"60aa48c5-9702-42b6-b694-919bedc308a4","Type":"ContainerStarted","Data":"2249ce5b86cea04fb587a0a34813e6a5894b63959391ee928cc370f255a0e903"}
Apr 23 13:37:00.344112 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:00.344050 2569 generic.go:358] "Generic (PLEG): container finished" podID="60aa48c5-9702-42b6-b694-919bedc308a4" containerID="3f27af7565da4091d2ac911122616558becd4cc1b461669f2b877dc82a2bae06" exitCode=0
Apr 23 13:37:00.344575 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:00.344129 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg" event={"ID":"60aa48c5-9702-42b6-b694-919bedc308a4","Type":"ContainerDied","Data":"3f27af7565da4091d2ac911122616558becd4cc1b461669f2b877dc82a2bae06"}
Apr 23 13:37:04.358406 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:04.358372 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg" event={"ID":"60aa48c5-9702-42b6-b694-919bedc308a4","Type":"ContainerStarted","Data":"3057deb4161c520a3692383b3e9d96b512060abe1ce83f77eecff0d77f184fa1"}
Apr 23 13:37:05.363201 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:05.363163 2569 generic.go:358] "Generic (PLEG): container finished" podID="60aa48c5-9702-42b6-b694-919bedc308a4" containerID="3057deb4161c520a3692383b3e9d96b512060abe1ce83f77eecff0d77f184fa1" exitCode=0
Apr 23 13:37:05.363823 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:05.363211 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg" event={"ID":"60aa48c5-9702-42b6-b694-919bedc308a4","Type":"ContainerDied","Data":"3057deb4161c520a3692383b3e9d96b512060abe1ce83f77eecff0d77f184fa1"}
Apr 23 13:37:10.371708 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:10.371661 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-86569766fc-dqn74" podUID="0deeaebe-d2f4-4804-8a26-9829c0d70462" containerName="console" containerID="cri-o://39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431" gracePeriod=15
Apr 23 13:37:11.193487 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.193462 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86569766fc-dqn74_0deeaebe-d2f4-4804-8a26-9829c0d70462/console/0.log"
Apr 23 13:37:11.193639 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.193529 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86569766fc-dqn74"
Apr 23 13:37:11.275071 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275032 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-oauth-config\") pod \"0deeaebe-d2f4-4804-8a26-9829c0d70462\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") "
Apr 23 13:37:11.275169 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275092 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-config\") pod \"0deeaebe-d2f4-4804-8a26-9829c0d70462\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") "
Apr 23 13:37:11.275169 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275153 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-service-ca\") pod \"0deeaebe-d2f4-4804-8a26-9829c0d70462\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") "
Apr 23 13:37:11.275252 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275210 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-trusted-ca-bundle\") pod \"0deeaebe-d2f4-4804-8a26-9829c0d70462\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") "
Apr 23 13:37:11.275286 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275251 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbxdw\" (UniqueName: \"kubernetes.io/projected/0deeaebe-d2f4-4804-8a26-9829c0d70462-kube-api-access-sbxdw\") pod \"0deeaebe-d2f4-4804-8a26-9829c0d70462\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") "
Apr 23 13:37:11.275328 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275287 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-oauth-serving-cert\") pod \"0deeaebe-d2f4-4804-8a26-9829c0d70462\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") "
Apr 23 13:37:11.275383 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275324 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-serving-cert\") pod \"0deeaebe-d2f4-4804-8a26-9829c0d70462\" (UID: \"0deeaebe-d2f4-4804-8a26-9829c0d70462\") "
Apr 23 13:37:11.275462 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275444 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-config" (OuterVolumeSpecName: "console-config") pod "0deeaebe-d2f4-4804-8a26-9829c0d70462" (UID: "0deeaebe-d2f4-4804-8a26-9829c0d70462"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:37:11.275543 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275500 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-service-ca" (OuterVolumeSpecName: "service-ca") pod "0deeaebe-d2f4-4804-8a26-9829c0d70462" (UID: "0deeaebe-d2f4-4804-8a26-9829c0d70462"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:37:11.275660 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275599 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:37:11.275660 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275619 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-service-ca\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:37:11.275660 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275593 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0deeaebe-d2f4-4804-8a26-9829c0d70462" (UID: "0deeaebe-d2f4-4804-8a26-9829c0d70462"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:37:11.275899 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.275826 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0deeaebe-d2f4-4804-8a26-9829c0d70462" (UID: "0deeaebe-d2f4-4804-8a26-9829c0d70462"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:37:11.277304 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.277285 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0deeaebe-d2f4-4804-8a26-9829c0d70462" (UID: "0deeaebe-d2f4-4804-8a26-9829c0d70462"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:37:11.277407 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.277383 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0deeaebe-d2f4-4804-8a26-9829c0d70462-kube-api-access-sbxdw" (OuterVolumeSpecName: "kube-api-access-sbxdw") pod "0deeaebe-d2f4-4804-8a26-9829c0d70462" (UID: "0deeaebe-d2f4-4804-8a26-9829c0d70462"). InnerVolumeSpecName "kube-api-access-sbxdw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:37:11.277447 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.277415 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0deeaebe-d2f4-4804-8a26-9829c0d70462" (UID: "0deeaebe-d2f4-4804-8a26-9829c0d70462"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:37:11.377050 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.376970 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-serving-cert\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:37:11.377050 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.376995 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0deeaebe-d2f4-4804-8a26-9829c0d70462-console-oauth-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:37:11.377050 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.377006 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-trusted-ca-bundle\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:37:11.377050 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.377015 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbxdw\" (UniqueName: \"kubernetes.io/projected/0deeaebe-d2f4-4804-8a26-9829c0d70462-kube-api-access-sbxdw\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:37:11.377050 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.377024 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0deeaebe-d2f4-4804-8a26-9829c0d70462-oauth-serving-cert\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:37:11.383963 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.383930 2569 generic.go:358] "Generic (PLEG): container finished" podID="60aa48c5-9702-42b6-b694-919bedc308a4" containerID="34192739952d51c900adf165fd179728e3fd03e32bb6e6659f7afbf82bbc5723" exitCode=0
Apr 23 13:37:11.384144 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.384009 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg" event={"ID":"60aa48c5-9702-42b6-b694-919bedc308a4","Type":"ContainerDied","Data":"34192739952d51c900adf165fd179728e3fd03e32bb6e6659f7afbf82bbc5723"}
Apr 23 13:37:11.385123 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.385105 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86569766fc-dqn74_0deeaebe-d2f4-4804-8a26-9829c0d70462/console/0.log"
Apr 23 13:37:11.385220 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.385140 2569 generic.go:358] "Generic (PLEG): container finished" podID="0deeaebe-d2f4-4804-8a26-9829c0d70462" containerID="39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431" exitCode=2
Apr 23 13:37:11.385220 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.385181 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86569766fc-dqn74" event={"ID":"0deeaebe-d2f4-4804-8a26-9829c0d70462","Type":"ContainerDied","Data":"39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431"}
Apr 23 13:37:11.385220 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.385198 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86569766fc-dqn74"
Apr 23 13:37:11.385220 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.385217 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86569766fc-dqn74" event={"ID":"0deeaebe-d2f4-4804-8a26-9829c0d70462","Type":"ContainerDied","Data":"c915744d265662fa55d9aebffc8e64d8e92da8b7fc135925eeb663b504d22b53"}
Apr 23 13:37:11.385376 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.385233 2569 scope.go:117] "RemoveContainer" containerID="39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431"
Apr 23 13:37:11.393811 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.393792 2569 scope.go:117] "RemoveContainer" containerID="39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431"
Apr 23 13:37:11.394050 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:37:11.394031 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431\": container with ID starting with 39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431 not found: ID does not exist" containerID="39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431"
Apr 23 13:37:11.394157 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.394091 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431"} err="failed to get container status \"39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431\": rpc error: code = NotFound desc = could not find container \"39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431\": container with ID starting with 39a716038e3e0848841d9099cdc7ee30101b4bd966777b076bafe19adac44431 not found: ID does not exist"
Apr 23 13:37:11.413456 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.413429 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86569766fc-dqn74"]
Apr 23 13:37:11.417128 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:11.417108 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86569766fc-dqn74"]
Apr 23 13:37:12.420221 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:12.420193 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0deeaebe-d2f4-4804-8a26-9829c0d70462" path="/var/lib/kubelet/pods/0deeaebe-d2f4-4804-8a26-9829c0d70462/volumes"
Apr 23 13:37:12.511366 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:12.511345 2569 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg" Apr 23 13:37:12.588438 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:12.588406 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/60aa48c5-9702-42b6-b694-919bedc308a4-bundle\") pod \"60aa48c5-9702-42b6-b694-919bedc308a4\" (UID: \"60aa48c5-9702-42b6-b694-919bedc308a4\") " Apr 23 13:37:12.588606 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:12.588444 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/60aa48c5-9702-42b6-b694-919bedc308a4-util\") pod \"60aa48c5-9702-42b6-b694-919bedc308a4\" (UID: \"60aa48c5-9702-42b6-b694-919bedc308a4\") " Apr 23 13:37:12.588606 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:12.588474 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2bmv\" (UniqueName: \"kubernetes.io/projected/60aa48c5-9702-42b6-b694-919bedc308a4-kube-api-access-s2bmv\") pod \"60aa48c5-9702-42b6-b694-919bedc308a4\" (UID: \"60aa48c5-9702-42b6-b694-919bedc308a4\") " Apr 23 13:37:12.589050 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:12.589021 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60aa48c5-9702-42b6-b694-919bedc308a4-bundle" (OuterVolumeSpecName: "bundle") pod "60aa48c5-9702-42b6-b694-919bedc308a4" (UID: "60aa48c5-9702-42b6-b694-919bedc308a4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:37:12.590736 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:12.590711 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60aa48c5-9702-42b6-b694-919bedc308a4-kube-api-access-s2bmv" (OuterVolumeSpecName: "kube-api-access-s2bmv") pod "60aa48c5-9702-42b6-b694-919bedc308a4" (UID: "60aa48c5-9702-42b6-b694-919bedc308a4"). InnerVolumeSpecName "kube-api-access-s2bmv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:37:12.592479 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:12.592455 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60aa48c5-9702-42b6-b694-919bedc308a4-util" (OuterVolumeSpecName: "util") pod "60aa48c5-9702-42b6-b694-919bedc308a4" (UID: "60aa48c5-9702-42b6-b694-919bedc308a4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:37:12.689986 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:12.689900 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/60aa48c5-9702-42b6-b694-919bedc308a4-bundle\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:37:12.689986 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:12.689932 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/60aa48c5-9702-42b6-b694-919bedc308a4-util\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:37:12.689986 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:12.689941 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s2bmv\" (UniqueName: \"kubernetes.io/projected/60aa48c5-9702-42b6-b694-919bedc308a4-kube-api-access-s2bmv\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:37:13.393467 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:13.393427 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg" event={"ID":"60aa48c5-9702-42b6-b694-919bedc308a4","Type":"ContainerDied","Data":"2249ce5b86cea04fb587a0a34813e6a5894b63959391ee928cc370f255a0e903"} Apr 23 13:37:13.393467 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:13.393464 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2249ce5b86cea04fb587a0a34813e6a5894b63959391ee928cc370f255a0e903" Apr 23 13:37:13.393467 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:13.393464 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjxkfg" Apr 23 13:37:21.350262 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.350211 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn"] Apr 23 13:37:21.350633 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.350541 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0deeaebe-d2f4-4804-8a26-9829c0d70462" containerName="console" Apr 23 13:37:21.350633 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.350553 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0deeaebe-d2f4-4804-8a26-9829c0d70462" containerName="console" Apr 23 13:37:21.350633 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.350578 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60aa48c5-9702-42b6-b694-919bedc308a4" containerName="util" Apr 23 13:37:21.350633 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.350584 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="60aa48c5-9702-42b6-b694-919bedc308a4" containerName="util" Apr 23 13:37:21.350633 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.350593 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60aa48c5-9702-42b6-b694-919bedc308a4" containerName="pull" Apr 23 13:37:21.350633 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.350598 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="60aa48c5-9702-42b6-b694-919bedc308a4" containerName="pull" Apr 23 13:37:21.350633 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.350604 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60aa48c5-9702-42b6-b694-919bedc308a4" containerName="extract" 
Apr 23 13:37:21.350633 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.350609 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="60aa48c5-9702-42b6-b694-919bedc308a4" containerName="extract" Apr 23 13:37:21.350857 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.350660 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0deeaebe-d2f4-4804-8a26-9829c0d70462" containerName="console" Apr 23 13:37:21.350857 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.350671 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="60aa48c5-9702-42b6-b694-919bedc308a4" containerName="extract" Apr 23 13:37:21.353526 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.353507 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:21.356357 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.356329 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 23 13:37:21.356357 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.356351 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 23 13:37:21.356519 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.356332 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 23 13:37:21.356519 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.356406 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 23 13:37:21.357254 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.357239 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-n5kkt\"" Apr 23 13:37:21.357319 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.357239 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 23 13:37:21.364189 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.364168 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn"] Apr 23 13:37:21.465277 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.465228 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9n5f\" (UniqueName: \"kubernetes.io/projected/ca5d0aa0-1205-4d76-a158-63a3a1788fc1-kube-api-access-m9n5f\") pod \"keda-metrics-apiserver-7c9f485588-q5qkn\" (UID: \"ca5d0aa0-1205-4d76-a158-63a3a1788fc1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:21.465464 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.465311 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ca5d0aa0-1205-4d76-a158-63a3a1788fc1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-q5qkn\" (UID: \"ca5d0aa0-1205-4d76-a158-63a3a1788fc1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:21.465464 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.465340 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ca5d0aa0-1205-4d76-a158-63a3a1788fc1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-q5qkn\" (UID: 
\"ca5d0aa0-1205-4d76-a158-63a3a1788fc1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:21.566433 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.566398 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9n5f\" (UniqueName: \"kubernetes.io/projected/ca5d0aa0-1205-4d76-a158-63a3a1788fc1-kube-api-access-m9n5f\") pod \"keda-metrics-apiserver-7c9f485588-q5qkn\" (UID: \"ca5d0aa0-1205-4d76-a158-63a3a1788fc1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:21.566632 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.566452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ca5d0aa0-1205-4d76-a158-63a3a1788fc1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-q5qkn\" (UID: \"ca5d0aa0-1205-4d76-a158-63a3a1788fc1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:21.566632 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.566480 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ca5d0aa0-1205-4d76-a158-63a3a1788fc1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-q5qkn\" (UID: \"ca5d0aa0-1205-4d76-a158-63a3a1788fc1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:21.566632 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:37:21.566607 2569 secret.go:281] references non-existent secret key: tls.crt Apr 23 13:37:21.566632 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:37:21.566625 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 23 13:37:21.566850 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:37:21.566645 2569 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 23 13:37:21.566850 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:37:21.566668 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 23 13:37:21.566850 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:37:21.566747 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca5d0aa0-1205-4d76-a158-63a3a1788fc1-certificates podName:ca5d0aa0-1205-4d76-a158-63a3a1788fc1 nodeName:}" failed. No retries permitted until 2026-04-23 13:37:22.066725071 +0000 UTC m=+296.221647517 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ca5d0aa0-1205-4d76-a158-63a3a1788fc1-certificates") pod "keda-metrics-apiserver-7c9f485588-q5qkn" (UID: "ca5d0aa0-1205-4d76-a158-63a3a1788fc1") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 23 13:37:21.567008 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.566886 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ca5d0aa0-1205-4d76-a158-63a3a1788fc1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-q5qkn\" (UID: \"ca5d0aa0-1205-4d76-a158-63a3a1788fc1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:21.583940 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.583904 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9n5f\" (UniqueName: \"kubernetes.io/projected/ca5d0aa0-1205-4d76-a158-63a3a1788fc1-kube-api-access-m9n5f\") pod \"keda-metrics-apiserver-7c9f485588-q5qkn\" (UID: \"ca5d0aa0-1205-4d76-a158-63a3a1788fc1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:21.618501 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.618420 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-qzs7v"] Apr 23 13:37:21.620818 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.620800 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-qzs7v" Apr 23 13:37:21.624945 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.624922 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 23 13:37:21.635177 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.635158 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-qzs7v"] Apr 23 13:37:21.767903 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.767724 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7zf4\" (UniqueName: \"kubernetes.io/projected/44d6b04b-0bcf-42ab-b97f-631f5c85b858-kube-api-access-d7zf4\") pod \"keda-admission-cf49989db-qzs7v\" (UID: \"44d6b04b-0bcf-42ab-b97f-631f5c85b858\") " pod="openshift-keda/keda-admission-cf49989db-qzs7v" Apr 23 13:37:21.767903 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.767812 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/44d6b04b-0bcf-42ab-b97f-631f5c85b858-certificates\") pod \"keda-admission-cf49989db-qzs7v\" (UID: \"44d6b04b-0bcf-42ab-b97f-631f5c85b858\") " pod="openshift-keda/keda-admission-cf49989db-qzs7v" Apr 23 13:37:21.868428 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.868377 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/44d6b04b-0bcf-42ab-b97f-631f5c85b858-certificates\") pod \"keda-admission-cf49989db-qzs7v\" (UID: \"44d6b04b-0bcf-42ab-b97f-631f5c85b858\") " pod="openshift-keda/keda-admission-cf49989db-qzs7v" Apr 23 13:37:21.868660 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.868496 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7zf4\" (UniqueName: 
\"kubernetes.io/projected/44d6b04b-0bcf-42ab-b97f-631f5c85b858-kube-api-access-d7zf4\") pod \"keda-admission-cf49989db-qzs7v\" (UID: \"44d6b04b-0bcf-42ab-b97f-631f5c85b858\") " pod="openshift-keda/keda-admission-cf49989db-qzs7v" Apr 23 13:37:21.871596 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.871564 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/44d6b04b-0bcf-42ab-b97f-631f5c85b858-certificates\") pod \"keda-admission-cf49989db-qzs7v\" (UID: \"44d6b04b-0bcf-42ab-b97f-631f5c85b858\") " pod="openshift-keda/keda-admission-cf49989db-qzs7v" Apr 23 13:37:21.877697 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.877673 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7zf4\" (UniqueName: \"kubernetes.io/projected/44d6b04b-0bcf-42ab-b97f-631f5c85b858-kube-api-access-d7zf4\") pod \"keda-admission-cf49989db-qzs7v\" (UID: \"44d6b04b-0bcf-42ab-b97f-631f5c85b858\") " pod="openshift-keda/keda-admission-cf49989db-qzs7v" Apr 23 13:37:21.931738 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:21.931697 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-qzs7v" Apr 23 13:37:22.069877 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:22.069846 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ca5d0aa0-1205-4d76-a158-63a3a1788fc1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-q5qkn\" (UID: \"ca5d0aa0-1205-4d76-a158-63a3a1788fc1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:22.072639 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:22.072607 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ca5d0aa0-1205-4d76-a158-63a3a1788fc1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-q5qkn\" (UID: \"ca5d0aa0-1205-4d76-a158-63a3a1788fc1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:22.079390 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:22.079318 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-qzs7v"] Apr 23 13:37:22.085527 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:37:22.083493 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44d6b04b_0bcf_42ab_b97f_631f5c85b858.slice/crio-2fa7e6ea0ba53c12098d0c6625aa6940bba8a02ead6aaa08309afb5a5096e59a WatchSource:0}: Error finding container 2fa7e6ea0ba53c12098d0c6625aa6940bba8a02ead6aaa08309afb5a5096e59a: Status 404 returned error can't find the container with id 2fa7e6ea0ba53c12098d0c6625aa6940bba8a02ead6aaa08309afb5a5096e59a Apr 23 13:37:22.265113 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:22.265070 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:22.394616 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:22.394586 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn"] Apr 23 13:37:22.397671 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:37:22.397646 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca5d0aa0_1205_4d76_a158_63a3a1788fc1.slice/crio-ae398711873a8bcf77e0b5581e6ab0800bd1169d20c9e9a7d1100c593ef248ab WatchSource:0}: Error finding container ae398711873a8bcf77e0b5581e6ab0800bd1169d20c9e9a7d1100c593ef248ab: Status 404 returned error can't find the container with id ae398711873a8bcf77e0b5581e6ab0800bd1169d20c9e9a7d1100c593ef248ab Apr 23 13:37:22.420727 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:22.420698 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-qzs7v" event={"ID":"44d6b04b-0bcf-42ab-b97f-631f5c85b858","Type":"ContainerStarted","Data":"2fa7e6ea0ba53c12098d0c6625aa6940bba8a02ead6aaa08309afb5a5096e59a"} Apr 23 13:37:22.420858 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:22.420734 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" event={"ID":"ca5d0aa0-1205-4d76-a158-63a3a1788fc1","Type":"ContainerStarted","Data":"ae398711873a8bcf77e0b5581e6ab0800bd1169d20c9e9a7d1100c593ef248ab"} Apr 23 13:37:24.434285 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:24.434246 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-qzs7v" event={"ID":"44d6b04b-0bcf-42ab-b97f-631f5c85b858","Type":"ContainerStarted","Data":"331b7b7aa61be2c9dd42e08521d396a73c1ab644b31b070fa126ac6c25358f33"} Apr 23 13:37:24.434706 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:24.434314 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-qzs7v" Apr 23 13:37:24.456959 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:24.456903 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-qzs7v" podStartSLOduration=1.673077243 podStartE2EDuration="3.456885655s" podCreationTimestamp="2026-04-23 13:37:21 +0000 UTC" firstStartedPulling="2026-04-23 13:37:22.085353643 +0000 UTC m=+296.240276104" lastFinishedPulling="2026-04-23 13:37:23.86916207 +0000 UTC m=+298.024084516" observedRunningTime="2026-04-23 13:37:24.45643233 +0000 UTC m=+298.611354812" watchObservedRunningTime="2026-04-23 13:37:24.456885655 +0000 UTC m=+298.611808124" Apr 23 13:37:26.296760 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:26.296732 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/2.log" Apr 23 13:37:26.297418 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:26.297400 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/2.log" Apr 23 13:37:26.301765 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:26.301743 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log" Apr 23 13:37:26.301765 
ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:26.301758 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log" Apr 23 13:37:26.308241 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:26.308225 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 13:37:26.441551 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:26.441411 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" event={"ID":"ca5d0aa0-1205-4d76-a158-63a3a1788fc1","Type":"ContainerStarted","Data":"6f5e2e1ca1e7b989be278dcaf1de93d4cde7adf3f6286c41044fc933718b29bf"} Apr 23 13:37:26.441551 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:26.441520 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:26.461216 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:26.461163 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" podStartSLOduration=2.3649343050000002 podStartE2EDuration="5.461148244s" podCreationTimestamp="2026-04-23 13:37:21 +0000 UTC" firstStartedPulling="2026-04-23 13:37:22.399019933 +0000 UTC m=+296.553942379" lastFinishedPulling="2026-04-23 13:37:25.495233872 +0000 UTC m=+299.650156318" observedRunningTime="2026-04-23 13:37:26.46050349 +0000 UTC m=+300.615425969" watchObservedRunningTime="2026-04-23 13:37:26.461148244 +0000 UTC m=+300.616070711" Apr 23 13:37:37.449344 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:37.449309 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-q5qkn" Apr 23 13:37:45.440243 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:37:45.440209 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-qzs7v" Apr 23 13:38:29.519353 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.519261 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-wp6hx"] Apr 23 13:38:29.522483 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.522463 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:38:29.524919 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.524885 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 23 13:38:29.525186 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.525168 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 13:38:29.525382 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.525361 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 13:38:29.525912 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.525894 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-vt7f7\"" Apr 23 13:38:29.532231 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.532203 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-wp6hx"] Apr 23 13:38:29.548543 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.548510 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-h9v5c"] Apr 23 13:38:29.551680 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.551662 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-h9v5c" Apr 23 13:38:29.554288 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.554259 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-kkxc7\"" Apr 23 13:38:29.554424 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.554317 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 13:38:29.558904 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.558876 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-h9v5c"] Apr 23 13:38:29.637982 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.637939 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9c9j\" (UniqueName: \"kubernetes.io/projected/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-kube-api-access-r9c9j\") pod \"kserve-controller-manager-6b667fdd66-wp6hx\" (UID: \"8df641b6-eadd-4d2d-a451-8ca52ac66f9b\") " pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:38:29.638178 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.637998 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0a6f64c3-9ae9-493c-9566-8192c9595401-data\") pod \"seaweedfs-86cc847c5c-h9v5c\" (UID: \"0a6f64c3-9ae9-493c-9566-8192c9595401\") " pod="kserve/seaweedfs-86cc847c5c-h9v5c" Apr 23 13:38:29.638178 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.638124 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-cert\") pod \"kserve-controller-manager-6b667fdd66-wp6hx\" (UID: \"8df641b6-eadd-4d2d-a451-8ca52ac66f9b\") " pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:38:29.638178 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.638161 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdzzh\" 
(UniqueName: \"kubernetes.io/projected/0a6f64c3-9ae9-493c-9566-8192c9595401-kube-api-access-wdzzh\") pod \"seaweedfs-86cc847c5c-h9v5c\" (UID: \"0a6f64c3-9ae9-493c-9566-8192c9595401\") " pod="kserve/seaweedfs-86cc847c5c-h9v5c" Apr 23 13:38:29.739215 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.739174 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-cert\") pod \"kserve-controller-manager-6b667fdd66-wp6hx\" (UID: \"8df641b6-eadd-4d2d-a451-8ca52ac66f9b\") " pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:38:29.739396 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.739223 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdzzh\" (UniqueName: \"kubernetes.io/projected/0a6f64c3-9ae9-493c-9566-8192c9595401-kube-api-access-wdzzh\") pod \"seaweedfs-86cc847c5c-h9v5c\" (UID: \"0a6f64c3-9ae9-493c-9566-8192c9595401\") " pod="kserve/seaweedfs-86cc847c5c-h9v5c" Apr 23 13:38:29.739396 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.739269 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9c9j\" (UniqueName: \"kubernetes.io/projected/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-kube-api-access-r9c9j\") pod \"kserve-controller-manager-6b667fdd66-wp6hx\" (UID: \"8df641b6-eadd-4d2d-a451-8ca52ac66f9b\") " pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:38:29.739396 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.739293 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0a6f64c3-9ae9-493c-9566-8192c9595401-data\") pod \"seaweedfs-86cc847c5c-h9v5c\" (UID: \"0a6f64c3-9ae9-493c-9566-8192c9595401\") " pod="kserve/seaweedfs-86cc847c5c-h9v5c" Apr 23 13:38:29.739396 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:38:29.739359 2569 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 23 13:38:29.739582 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:38:29.739441 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-cert podName:8df641b6-eadd-4d2d-a451-8ca52ac66f9b nodeName:}" failed. No retries permitted until 2026-04-23 13:38:30.239418217 +0000 UTC m=+364.394340668 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-cert") pod "kserve-controller-manager-6b667fdd66-wp6hx" (UID: "8df641b6-eadd-4d2d-a451-8ca52ac66f9b") : secret "kserve-webhook-server-cert" not found Apr 23 13:38:29.739710 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.739682 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0a6f64c3-9ae9-493c-9566-8192c9595401-data\") pod \"seaweedfs-86cc847c5c-h9v5c\" (UID: \"0a6f64c3-9ae9-493c-9566-8192c9595401\") " pod="kserve/seaweedfs-86cc847c5c-h9v5c" Apr 23 13:38:29.750670 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.750632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdzzh\" (UniqueName: \"kubernetes.io/projected/0a6f64c3-9ae9-493c-9566-8192c9595401-kube-api-access-wdzzh\") pod \"seaweedfs-86cc847c5c-h9v5c\" (UID: \"0a6f64c3-9ae9-493c-9566-8192c9595401\") " pod="kserve/seaweedfs-86cc847c5c-h9v5c" Apr 23 13:38:29.750807 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.750688 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9c9j\" (UniqueName: \"kubernetes.io/projected/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-kube-api-access-r9c9j\") pod \"kserve-controller-manager-6b667fdd66-wp6hx\" (UID: \"8df641b6-eadd-4d2d-a451-8ca52ac66f9b\") " pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:38:29.863048 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:29.862958 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-h9v5c" Apr 23 13:38:30.002128 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:30.002101 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-h9v5c"] Apr 23 13:38:30.004362 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:38:30.004332 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a6f64c3_9ae9_493c_9566_8192c9595401.slice/crio-011cc41a53882f7ce330c868ae02ac35dc889db62e60f0ef732bda6e34635e25 WatchSource:0}: Error finding container 011cc41a53882f7ce330c868ae02ac35dc889db62e60f0ef732bda6e34635e25: Status 404 returned error can't find the container with id 011cc41a53882f7ce330c868ae02ac35dc889db62e60f0ef732bda6e34635e25 Apr 23 13:38:30.005709 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:30.005691 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:38:30.243417 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:30.243273 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-cert\") pod \"kserve-controller-manager-6b667fdd66-wp6hx\" (UID: \"8df641b6-eadd-4d2d-a451-8ca52ac66f9b\") " pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:38:30.245733 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:30.245702 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-cert\") pod \"kserve-controller-manager-6b667fdd66-wp6hx\" (UID: \"8df641b6-eadd-4d2d-a451-8ca52ac66f9b\") " pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:38:30.434255 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:30.434214 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:38:30.591296 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:30.591263 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-wp6hx"] Apr 23 13:38:30.598177 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:38:30.598146 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8df641b6_eadd_4d2d_a451_8ca52ac66f9b.slice/crio-d6965f132c1fec08fe5fbb025cd0195eaa8b93cd8b1e959b90053429c087d848 WatchSource:0}: Error finding container d6965f132c1fec08fe5fbb025cd0195eaa8b93cd8b1e959b90053429c087d848: Status 404 returned error can't find the container with id d6965f132c1fec08fe5fbb025cd0195eaa8b93cd8b1e959b90053429c087d848 Apr 23 13:38:30.635588 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:30.635546 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" event={"ID":"8df641b6-eadd-4d2d-a451-8ca52ac66f9b","Type":"ContainerStarted","Data":"d6965f132c1fec08fe5fbb025cd0195eaa8b93cd8b1e959b90053429c087d848"} Apr 23 13:38:30.636867 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:30.636833 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-h9v5c" event={"ID":"0a6f64c3-9ae9-493c-9566-8192c9595401","Type":"ContainerStarted","Data":"011cc41a53882f7ce330c868ae02ac35dc889db62e60f0ef732bda6e34635e25"} Apr 23 13:38:34.651360 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:34.651321 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" event={"ID":"8df641b6-eadd-4d2d-a451-8ca52ac66f9b","Type":"ContainerStarted","Data":"bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af"} Apr 23 13:38:34.651863 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:34.651437 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:38:34.652809 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:34.652788 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-h9v5c" event={"ID":"0a6f64c3-9ae9-493c-9566-8192c9595401","Type":"ContainerStarted","Data":"cc6219bddb4389968f1954bee22098a23f23bd7d3d819ee95eefc5fdda3fb553"} Apr 23 13:38:34.652917 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:34.652906 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-h9v5c" Apr 23 13:38:34.678216 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:34.678161 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" podStartSLOduration=2.291790439 podStartE2EDuration="5.678147441s" podCreationTimestamp="2026-04-23 13:38:29 +0000 UTC" firstStartedPulling="2026-04-23 13:38:30.599813922 +0000 UTC m=+364.754736379" lastFinishedPulling="2026-04-23 13:38:33.986170931 +0000 UTC m=+368.141093381" observedRunningTime="2026-04-23 13:38:34.676772694 +0000 UTC m=+368.831695162" watchObservedRunningTime="2026-04-23 13:38:34.678147441 +0000 UTC m=+368.833069910" Apr 23 13:38:34.703819 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:34.703769 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-h9v5c" podStartSLOduration=1.6691696299999998 podStartE2EDuration="5.703751532s" podCreationTimestamp="2026-04-23 
13:38:29 +0000 UTC" firstStartedPulling="2026-04-23 13:38:30.005825684 +0000 UTC m=+364.160748130" lastFinishedPulling="2026-04-23 13:38:34.040407587 +0000 UTC m=+368.195330032" observedRunningTime="2026-04-23 13:38:34.702479898 +0000 UTC m=+368.857402572" watchObservedRunningTime="2026-04-23 13:38:34.703751532 +0000 UTC m=+368.858674001" Apr 23 13:38:40.658273 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:38:40.658244 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-h9v5c" Apr 23 13:39:05.661473 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:05.661438 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:39:06.095477 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.095423 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-wp6hx"] Apr 23 13:39:06.095728 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.095689 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" podUID="8df641b6-eadd-4d2d-a451-8ca52ac66f9b" containerName="manager" containerID="cri-o://bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af" gracePeriod=10 Apr 23 13:39:06.131256 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.131223 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-vc7d7"] Apr 23 13:39:06.134408 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.134385 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" Apr 23 13:39:06.146381 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.146353 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-vc7d7"] Apr 23 13:39:06.254689 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.254646 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc-cert\") pod \"kserve-controller-manager-6b667fdd66-vc7d7\" (UID: \"f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc\") " pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" Apr 23 13:39:06.254841 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.254752 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxqx8\" (UniqueName: \"kubernetes.io/projected/f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc-kube-api-access-dxqx8\") pod \"kserve-controller-manager-6b667fdd66-vc7d7\" (UID: \"f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc\") " pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" Apr 23 13:39:06.339239 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.339213 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:39:06.355927 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.355837 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxqx8\" (UniqueName: \"kubernetes.io/projected/f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc-kube-api-access-dxqx8\") pod \"kserve-controller-manager-6b667fdd66-vc7d7\" (UID: \"f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc\") " pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" Apr 23 13:39:06.356111 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.355945 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc-cert\") pod \"kserve-controller-manager-6b667fdd66-vc7d7\" (UID: \"f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc\") " pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" Apr 23 13:39:06.358528 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.358495 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc-cert\") pod \"kserve-controller-manager-6b667fdd66-vc7d7\" (UID: \"f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc\") " pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" Apr 23 13:39:06.366115 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.366051 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxqx8\" (UniqueName: \"kubernetes.io/projected/f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc-kube-api-access-dxqx8\") pod \"kserve-controller-manager-6b667fdd66-vc7d7\" (UID: \"f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc\") " pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" Apr 23 13:39:06.456978 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.456944 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9c9j\" (UniqueName: \"kubernetes.io/projected/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-kube-api-access-r9c9j\") pod \"8df641b6-eadd-4d2d-a451-8ca52ac66f9b\" (UID: \"8df641b6-eadd-4d2d-a451-8ca52ac66f9b\") " Apr 23 13:39:06.457138 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.457031 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-cert\") pod \"8df641b6-eadd-4d2d-a451-8ca52ac66f9b\" (UID: \"8df641b6-eadd-4d2d-a451-8ca52ac66f9b\") " Apr 23 13:39:06.459335 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.459310 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-cert" (OuterVolumeSpecName: "cert") pod "8df641b6-eadd-4d2d-a451-8ca52ac66f9b" (UID: "8df641b6-eadd-4d2d-a451-8ca52ac66f9b"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:39:06.459397 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.459310 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-kube-api-access-r9c9j" (OuterVolumeSpecName: "kube-api-access-r9c9j") pod "8df641b6-eadd-4d2d-a451-8ca52ac66f9b" (UID: "8df641b6-eadd-4d2d-a451-8ca52ac66f9b"). InnerVolumeSpecName "kube-api-access-r9c9j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:39:06.475453 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.475424 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" Apr 23 13:39:06.558709 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.558668 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-cert\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:39:06.558709 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.558710 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r9c9j\" (UniqueName: \"kubernetes.io/projected/8df641b6-eadd-4d2d-a451-8ca52ac66f9b-kube-api-access-r9c9j\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:39:06.604815 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.604789 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-vc7d7"] Apr 23 13:39:06.607485 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:39:06.607425 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3c1dc8a_6c5d_4dd7_8ad7_c3e0a2c5e7dc.slice/crio-d360af7c9eba00bb37d78f32b41f8c33bf2dddfc0d290a7ceaecaed249966a01 WatchSource:0}: Error finding container d360af7c9eba00bb37d78f32b41f8c33bf2dddfc0d290a7ceaecaed249966a01: Status 404 returned error can't find the container with id d360af7c9eba00bb37d78f32b41f8c33bf2dddfc0d290a7ceaecaed249966a01 Apr 23 13:39:06.754864 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.754831 2569 generic.go:358] "Generic (PLEG): container finished" podID="8df641b6-eadd-4d2d-a451-8ca52ac66f9b" containerID="bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af" exitCode=0 Apr 23 13:39:06.755268 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.754893 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" Apr 23 13:39:06.755268 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.754892 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" event={"ID":"8df641b6-eadd-4d2d-a451-8ca52ac66f9b","Type":"ContainerDied","Data":"bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af"} Apr 23 13:39:06.755268 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.755011 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-wp6hx" event={"ID":"8df641b6-eadd-4d2d-a451-8ca52ac66f9b","Type":"ContainerDied","Data":"d6965f132c1fec08fe5fbb025cd0195eaa8b93cd8b1e959b90053429c087d848"} Apr 23 13:39:06.755268 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.755032 2569 scope.go:117] "RemoveContainer" containerID="bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af" Apr 23 13:39:06.756190 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.756172 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" event={"ID":"f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc","Type":"ContainerStarted","Data":"d360af7c9eba00bb37d78f32b41f8c33bf2dddfc0d290a7ceaecaed249966a01"} Apr 23 13:39:06.763822 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.763806 2569 scope.go:117] "RemoveContainer" containerID="bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af" Apr 23 13:39:06.764098 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:39:06.764077 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af\": container with ID starting with bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af not found: ID does not exist" containerID="bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af" Apr 23 13:39:06.764140 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.764113 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af"} err="failed to get container status \"bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af\": rpc error: code = NotFound desc = could not find container \"bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af\": container with ID starting with bc6bd13b4d1212b4c479f95bdd2b122a180ee3fb7180005e7e6e20e7687609af not found: ID does not exist" Apr 23 13:39:06.779407 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.779387 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-wp6hx"] Apr 23 13:39:06.783605 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:06.783585 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-wp6hx"] Apr 23 13:39:07.761469 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:07.761428 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" event={"ID":"f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc","Type":"ContainerStarted","Data":"d879e27df528f31c1e3590f9b5110f82dfe156e26809692fe452d26883557ac5"} Apr 23 13:39:07.761857 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:07.761522 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" Apr 
23 13:39:07.779678 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:07.779611 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" podStartSLOduration=1.151477958 podStartE2EDuration="1.779593708s" podCreationTimestamp="2026-04-23 13:39:06 +0000 UTC" firstStartedPulling="2026-04-23 13:39:06.608648858 +0000 UTC m=+400.763571304" lastFinishedPulling="2026-04-23 13:39:07.236764609 +0000 UTC m=+401.391687054" observedRunningTime="2026-04-23 13:39:07.779490781 +0000 UTC m=+401.934413246" watchObservedRunningTime="2026-04-23 13:39:07.779593708 +0000 UTC m=+401.934516176" Apr 23 13:39:08.420624 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:08.420587 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8df641b6-eadd-4d2d-a451-8ca52ac66f9b" path="/var/lib/kubelet/pods/8df641b6-eadd-4d2d-a451-8ca52ac66f9b/volumes" Apr 23 13:39:38.770828 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:38.770795 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6b667fdd66-vc7d7" Apr 23 13:39:40.201356 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:40.201314 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-qt4dm"] Apr 23 13:39:40.201829 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:40.201811 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8df641b6-eadd-4d2d-a451-8ca52ac66f9b" containerName="manager" Apr 23 13:39:40.201870 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:40.201834 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df641b6-eadd-4d2d-a451-8ca52ac66f9b" containerName="manager" Apr 23 13:39:40.201942 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:40.201929 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8df641b6-eadd-4d2d-a451-8ca52ac66f9b" containerName="manager" Apr 23 13:39:40.205084 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:40.205049 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-qt4dm" Apr 23 13:39:40.213150 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:40.213103 2569 status_manager.go:895] "Failed to get status for pod" podUID="46de531b-554a-4961-893f-3295250ff9f5" pod="kserve/model-serving-api-86f7b4b499-qt4dm" err="pods \"model-serving-api-86f7b4b499-qt4dm\" is forbidden: User \"system:node:ip-10-0-136-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kserve\": no relationship found between node 'ip-10-0-136-158.ec2.internal' and this object" Apr 23 13:39:40.213150 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:39:40.213124 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"model-serving-api-tls\" is forbidden: User \"system:node:ip-10-0-136-158.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kserve\": no relationship found between node 'ip-10-0-136-158.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve\"/\"model-serving-api-tls\"" type="*v1.Secret" Apr 23 13:39:40.213360 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:39:40.213174 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"model-serving-api-dockercfg-dcvww\" is forbidden: User \"system:node:ip-10-0-136-158.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kserve\": no relationship found between node 'ip-10-0-136-158.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-dcvww\"" type="*v1.Secret" Apr 23 13:39:40.248366 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:40.248323 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-qt4dm"] Apr 23 13:39:40.345923 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:40.345885 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcpms\" (UniqueName: \"kubernetes.io/projected/46de531b-554a-4961-893f-3295250ff9f5-kube-api-access-pcpms\") pod \"model-serving-api-86f7b4b499-qt4dm\" (UID: \"46de531b-554a-4961-893f-3295250ff9f5\") " pod="kserve/model-serving-api-86f7b4b499-qt4dm" Apr 23 13:39:40.346129 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:40.345941 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46de531b-554a-4961-893f-3295250ff9f5-tls-certs\") pod \"model-serving-api-86f7b4b499-qt4dm\" (UID: \"46de531b-554a-4961-893f-3295250ff9f5\") " pod="kserve/model-serving-api-86f7b4b499-qt4dm" Apr 23 13:39:40.446997 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:40.446954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcpms\" (UniqueName: \"kubernetes.io/projected/46de531b-554a-4961-893f-3295250ff9f5-kube-api-access-pcpms\") pod \"model-serving-api-86f7b4b499-qt4dm\" (UID: \"46de531b-554a-4961-893f-3295250ff9f5\") " pod="kserve/model-serving-api-86f7b4b499-qt4dm" Apr 23 13:39:40.446997 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:40.447004 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46de531b-554a-4961-893f-3295250ff9f5-tls-certs\") pod \"model-serving-api-86f7b4b499-qt4dm\" (UID: \"46de531b-554a-4961-893f-3295250ff9f5\") " pod="kserve/model-serving-api-86f7b4b499-qt4dm" Apr 23 13:39:40.459462 
ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:40.459386 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcpms\" (UniqueName: \"kubernetes.io/projected/46de531b-554a-4961-893f-3295250ff9f5-kube-api-access-pcpms\") pod \"model-serving-api-86f7b4b499-qt4dm\" (UID: \"46de531b-554a-4961-893f-3295250ff9f5\") " pod="kserve/model-serving-api-86f7b4b499-qt4dm" Apr 23 13:39:41.066409 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:41.066316 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 23 13:39:41.069718 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:41.069691 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46de531b-554a-4961-893f-3295250ff9f5-tls-certs\") pod \"model-serving-api-86f7b4b499-qt4dm\" (UID: \"46de531b-554a-4961-893f-3295250ff9f5\") " pod="kserve/model-serving-api-86f7b4b499-qt4dm" Apr 23 13:39:41.175107 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:41.175045 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-dcvww\"" Apr 23 13:39:41.179370 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:41.179347 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-qt4dm" Apr 23 13:39:41.318266 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:41.318178 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-qt4dm"] Apr 23 13:39:41.321460 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:39:41.321432 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46de531b_554a_4961_893f_3295250ff9f5.slice/crio-e722dece77c8963df991093ecd668630f7a0eb1bade21d5c40175f9a4cc4de4d WatchSource:0}: Error finding container e722dece77c8963df991093ecd668630f7a0eb1bade21d5c40175f9a4cc4de4d: Status 404 returned error can't find the container with id e722dece77c8963df991093ecd668630f7a0eb1bade21d5c40175f9a4cc4de4d Apr 23 13:39:41.869770 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:41.869736 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-qt4dm" event={"ID":"46de531b-554a-4961-893f-3295250ff9f5","Type":"ContainerStarted","Data":"e722dece77c8963df991093ecd668630f7a0eb1bade21d5c40175f9a4cc4de4d"} Apr 23 13:39:42.875042 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:42.875003 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-qt4dm" event={"ID":"46de531b-554a-4961-893f-3295250ff9f5","Type":"ContainerStarted","Data":"528d3d314097aefe0fa38d67eb7ec6299c4167a0458465b00d88de9015aaa999"} Apr 23 13:39:42.875498 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:42.875123 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-qt4dm" Apr 23 13:39:42.894581 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:42.894516 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-qt4dm" podStartSLOduration=1.458785342 podStartE2EDuration="2.894497206s" podCreationTimestamp="2026-04-23 13:39:40 +0000 UTC" firstStartedPulling="2026-04-23 13:39:41.323160206 +0000 UTC m=+435.478082656" lastFinishedPulling="2026-04-23 13:39:42.758872066 +0000 UTC m=+436.913794520" observedRunningTime="2026-04-23 
13:39:42.892447093 +0000 UTC m=+437.047369559" watchObservedRunningTime="2026-04-23 13:39:42.894497206 +0000 UTC m=+437.049419674" Apr 23 13:39:53.882539 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:53.882504 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-qt4dm" Apr 23 13:39:57.656112 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:57.656072 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-tlks5"] Apr 23 13:39:57.659428 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:57.659408 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-tlks5" Apr 23 13:39:57.673208 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:57.673176 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-tlks5"] Apr 23 13:39:57.684261 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:57.684219 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2hjh\" (UniqueName: \"kubernetes.io/projected/2b361ebc-24be-497f-b2d7-4dfd4255a3f9-kube-api-access-k2hjh\") pod \"s3-init-tlks5\" (UID: \"2b361ebc-24be-497f-b2d7-4dfd4255a3f9\") " pod="kserve/s3-init-tlks5" Apr 23 13:39:57.784971 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:57.784919 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2hjh\" (UniqueName: \"kubernetes.io/projected/2b361ebc-24be-497f-b2d7-4dfd4255a3f9-kube-api-access-k2hjh\") pod \"s3-init-tlks5\" (UID: \"2b361ebc-24be-497f-b2d7-4dfd4255a3f9\") " pod="kserve/s3-init-tlks5" Apr 23 13:39:57.795697 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:57.795660 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2hjh\" (UniqueName: \"kubernetes.io/projected/2b361ebc-24be-497f-b2d7-4dfd4255a3f9-kube-api-access-k2hjh\") pod \"s3-init-tlks5\" (UID: \"2b361ebc-24be-497f-b2d7-4dfd4255a3f9\") " pod="kserve/s3-init-tlks5" Apr 23 13:39:57.980518 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:57.980433 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-tlks5" Apr 23 13:39:58.113872 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:58.113839 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-tlks5"] Apr 23 13:39:58.117019 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:39:58.116989 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b361ebc_24be_497f_b2d7_4dfd4255a3f9.slice/crio-4932af92be6804cec06c431360bc5fae5f08ffcae74026663eb3fc1fd74cfcd8 WatchSource:0}: Error finding container 4932af92be6804cec06c431360bc5fae5f08ffcae74026663eb3fc1fd74cfcd8: Status 404 returned error can't find the container with id 4932af92be6804cec06c431360bc5fae5f08ffcae74026663eb3fc1fd74cfcd8 Apr 23 13:39:58.931300 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:58.931254 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tlks5" event={"ID":"2b361ebc-24be-497f-b2d7-4dfd4255a3f9","Type":"ContainerStarted","Data":"4932af92be6804cec06c431360bc5fae5f08ffcae74026663eb3fc1fd74cfcd8"} Apr 23 13:39:59.673184 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:39:59.672927 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-877c55f44-k98r5"] Apr 23 13:40:02.948200 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:02.948103 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tlks5" event={"ID":"2b361ebc-24be-497f-b2d7-4dfd4255a3f9","Type":"ContainerStarted","Data":"563a4c49769daf7bcf29a3b45071d96608edcb008ccbc92fd184fa07cf220e34"} Apr 23 13:40:02.972561 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:02.972502 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-tlks5" podStartSLOduration=1.428343899 podStartE2EDuration="5.972487593s" podCreationTimestamp="2026-04-23 13:39:57 +0000 UTC" firstStartedPulling="2026-04-23 13:39:58.118843579 +0000 UTC m=+452.273766025" lastFinishedPulling="2026-04-23 13:40:02.662987269 +0000 UTC m=+456.817909719" observedRunningTime="2026-04-23 13:40:02.971471553 +0000 UTC m=+457.126394023" watchObservedRunningTime="2026-04-23 13:40:02.972487593 +0000 UTC m=+457.127410061" Apr 23 13:40:05.959106 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:05.959034 2569 generic.go:358] "Generic (PLEG): container finished" podID="2b361ebc-24be-497f-b2d7-4dfd4255a3f9" containerID="563a4c49769daf7bcf29a3b45071d96608edcb008ccbc92fd184fa07cf220e34" exitCode=0 Apr 23 13:40:05.959472 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:05.959117 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tlks5" event={"ID":"2b361ebc-24be-497f-b2d7-4dfd4255a3f9","Type":"ContainerDied","Data":"563a4c49769daf7bcf29a3b45071d96608edcb008ccbc92fd184fa07cf220e34"} Apr 23 13:40:07.091344 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:07.091318 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-tlks5" Apr 23 13:40:07.165177 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:07.165147 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2hjh\" (UniqueName: \"kubernetes.io/projected/2b361ebc-24be-497f-b2d7-4dfd4255a3f9-kube-api-access-k2hjh\") pod \"2b361ebc-24be-497f-b2d7-4dfd4255a3f9\" (UID: \"2b361ebc-24be-497f-b2d7-4dfd4255a3f9\") " Apr 23 13:40:07.167456 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:07.167434 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b361ebc-24be-497f-b2d7-4dfd4255a3f9-kube-api-access-k2hjh" (OuterVolumeSpecName: "kube-api-access-k2hjh") pod "2b361ebc-24be-497f-b2d7-4dfd4255a3f9" (UID: "2b361ebc-24be-497f-b2d7-4dfd4255a3f9"). InnerVolumeSpecName "kube-api-access-k2hjh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:40:07.266511 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:07.266473 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k2hjh\" (UniqueName: \"kubernetes.io/projected/2b361ebc-24be-497f-b2d7-4dfd4255a3f9-kube-api-access-k2hjh\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:40:07.966969 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:07.966932 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tlks5" event={"ID":"2b361ebc-24be-497f-b2d7-4dfd4255a3f9","Type":"ContainerDied","Data":"4932af92be6804cec06c431360bc5fae5f08ffcae74026663eb3fc1fd74cfcd8"} Apr 23 13:40:07.966969 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:07.966969 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4932af92be6804cec06c431360bc5fae5f08ffcae74026663eb3fc1fd74cfcd8" Apr 23 13:40:07.966969 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:07.966943 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-tlks5" Apr 23 13:40:18.426042 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.426009 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8"] Apr 23 13:40:18.426462 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.426359 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b361ebc-24be-497f-b2d7-4dfd4255a3f9" containerName="s3-init" Apr 23 13:40:18.426462 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.426372 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b361ebc-24be-497f-b2d7-4dfd4255a3f9" containerName="s3-init" Apr 23 13:40:18.426462 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.426425 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b361ebc-24be-497f-b2d7-4dfd4255a3f9" containerName="s3-init" Apr 23 13:40:18.428731 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.428713 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.431921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.431897 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-ee9be-predictor-serving-cert\"" Apr 23 13:40:18.432363 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.432343 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:40:18.432460 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.432347 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-blqr2\"" Apr 23 13:40:18.432695 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.432677 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:40:18.432741 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.432677 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config\"" Apr 23 13:40:18.444308 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.444286 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8"] Apr 23 13:40:18.563561 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.563523 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.563762 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.563572 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.563762 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.563607 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-proxy-tls\") pod \"isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.563906 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.563877 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mthjj\" (UniqueName: \"kubernetes.io/projected/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-kube-api-access-mthjj\") pod \"isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 
13:40:18.665132 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.665048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-proxy-tls\") pod \"isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.665322 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.665178 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mthjj\" (UniqueName: \"kubernetes.io/projected/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-kube-api-access-mthjj\") pod \"isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.665322 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.665210 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.665322 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.665240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.665662 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.665637 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.665897 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.665877 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.667685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.667664 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-proxy-tls\") pod \"isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.673913 ip-10-0-136-158 
kubenswrapper[2569]: I0423 13:40:18.673879 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mthjj\" (UniqueName: \"kubernetes.io/projected/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-kube-api-access-mthjj\") pod \"isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.739760 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.739658 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:18.870989 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:18.870961 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8"] Apr 23 13:40:18.873916 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:40:18.873888 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5c6f975_4af9_4a6c_b5c6_abe6fc720e00.slice/crio-6b63da37f2fe594de5e2e49674f527e1d66bd52d751d65d1e6c33c00147f92fa WatchSource:0}: Error finding container 6b63da37f2fe594de5e2e49674f527e1d66bd52d751d65d1e6c33c00147f92fa: Status 404 returned error can't find the container with id 6b63da37f2fe594de5e2e49674f527e1d66bd52d751d65d1e6c33c00147f92fa Apr 23 13:40:19.004205 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:19.004171 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" event={"ID":"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00","Type":"ContainerStarted","Data":"6b63da37f2fe594de5e2e49674f527e1d66bd52d751d65d1e6c33c00147f92fa"} Apr 23 13:40:23.020970 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:23.020925 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" event={"ID":"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00","Type":"ContainerStarted","Data":"b775b60152a520dcecfe48fb087e193f5d4e26b85042d6daafd420b029a30ffd"} Apr 23 13:40:24.700264 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:24.700205 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-877c55f44-k98r5" podUID="04477dbe-0933-485b-8e16-f40aad39322f" containerName="console" containerID="cri-o://3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201" gracePeriod=15 Apr 23 13:40:24.949135 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:24.949111 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-877c55f44-k98r5_04477dbe-0933-485b-8e16-f40aad39322f/console/0.log" Apr 23 13:40:24.949252 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:24.949172 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:40:25.021335 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.021303 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-oauth-serving-cert\") pod \"04477dbe-0933-485b-8e16-f40aad39322f\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " Apr 23 13:40:25.021500 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.021352 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04477dbe-0933-485b-8e16-f40aad39322f-console-oauth-config\") pod \"04477dbe-0933-485b-8e16-f40aad39322f\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " Apr 23 13:40:25.021500 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.021402 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-service-ca\") pod \"04477dbe-0933-485b-8e16-f40aad39322f\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " Apr 23 13:40:25.021500 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.021440 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04477dbe-0933-485b-8e16-f40aad39322f-console-serving-cert\") pod \"04477dbe-0933-485b-8e16-f40aad39322f\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " Apr 23 13:40:25.021500 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.021458 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-trusted-ca-bundle\") pod \"04477dbe-0933-485b-8e16-f40aad39322f\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " Apr 23 13:40:25.021500 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.021491 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-console-config\") pod \"04477dbe-0933-485b-8e16-f40aad39322f\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " Apr 23 13:40:25.021747 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.021516 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtvqg\" (UniqueName: \"kubernetes.io/projected/04477dbe-0933-485b-8e16-f40aad39322f-kube-api-access-gtvqg\") pod \"04477dbe-0933-485b-8e16-f40aad39322f\" (UID: \"04477dbe-0933-485b-8e16-f40aad39322f\") " Apr 23 13:40:25.021861 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.021823 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "04477dbe-0933-485b-8e16-f40aad39322f" (UID: "04477dbe-0933-485b-8e16-f40aad39322f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:40:25.021943 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.021912 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-console-config" (OuterVolumeSpecName: "console-config") pod "04477dbe-0933-485b-8e16-f40aad39322f" (UID: "04477dbe-0933-485b-8e16-f40aad39322f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:40:25.021943 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.021923 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-service-ca" (OuterVolumeSpecName: "service-ca") pod "04477dbe-0933-485b-8e16-f40aad39322f" (UID: "04477dbe-0933-485b-8e16-f40aad39322f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:40:25.022013 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.021940 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "04477dbe-0933-485b-8e16-f40aad39322f" (UID: "04477dbe-0933-485b-8e16-f40aad39322f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:40:25.023784 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.023759 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04477dbe-0933-485b-8e16-f40aad39322f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "04477dbe-0933-485b-8e16-f40aad39322f" (UID: "04477dbe-0933-485b-8e16-f40aad39322f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:40:25.023880 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.023793 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04477dbe-0933-485b-8e16-f40aad39322f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "04477dbe-0933-485b-8e16-f40aad39322f" (UID: "04477dbe-0933-485b-8e16-f40aad39322f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:40:25.023919 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.023885 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04477dbe-0933-485b-8e16-f40aad39322f-kube-api-access-gtvqg" (OuterVolumeSpecName: "kube-api-access-gtvqg") pod "04477dbe-0933-485b-8e16-f40aad39322f" (UID: "04477dbe-0933-485b-8e16-f40aad39322f"). InnerVolumeSpecName "kube-api-access-gtvqg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:40:25.029530 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.029515 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-877c55f44-k98r5_04477dbe-0933-485b-8e16-f40aad39322f/console/0.log" Apr 23 13:40:25.029584 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.029550 2569 generic.go:358] "Generic (PLEG): container finished" podID="04477dbe-0933-485b-8e16-f40aad39322f" containerID="3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201" exitCode=2 Apr 23 13:40:25.029648 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.029630 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-877c55f44-k98r5" Apr 23 13:40:25.029726 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.029635 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-877c55f44-k98r5" event={"ID":"04477dbe-0933-485b-8e16-f40aad39322f","Type":"ContainerDied","Data":"3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201"} Apr 23 13:40:25.029782 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.029744 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-877c55f44-k98r5" event={"ID":"04477dbe-0933-485b-8e16-f40aad39322f","Type":"ContainerDied","Data":"b9658c0988fa620ae5b292255e6d17ac3678c4b980a12b701692d4637ff2e3ae"} Apr 23 13:40:25.029782 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.029765 2569 scope.go:117] "RemoveContainer" containerID="3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201" Apr 23 13:40:25.043718 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.043702 2569 scope.go:117] "RemoveContainer" containerID="3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201" Apr 23 13:40:25.043961 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:40:25.043942 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201\": container with ID starting with 3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201 not found: ID does not exist" containerID="3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201" Apr 23 13:40:25.044020 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.043976 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201"} err="failed to get container status \"3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201\": rpc error: code = NotFound desc = could not find container \"3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201\": container with ID starting with 3f2bbe998c107add84ead9dcb150664c981a0c1a849dfda2a277ccd7f467a201 not found: ID does not exist" Apr 23 13:40:25.054793 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.054766 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-877c55f44-k98r5"] Apr 23 13:40:25.058975 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.058944 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-877c55f44-k98r5"] Apr 23 13:40:25.122570 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.122532 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04477dbe-0933-485b-8e16-f40aad39322f-console-serving-cert\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:40:25.122570 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.122564 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-trusted-ca-bundle\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:40:25.122570 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.122578 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-console-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath 
\"\"" Apr 23 13:40:25.122803 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.122594 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gtvqg\" (UniqueName: \"kubernetes.io/projected/04477dbe-0933-485b-8e16-f40aad39322f-kube-api-access-gtvqg\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:40:25.122803 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.122611 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-oauth-serving-cert\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:40:25.122803 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.122623 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04477dbe-0933-485b-8e16-f40aad39322f-console-oauth-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:40:25.122803 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:25.122635 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04477dbe-0933-485b-8e16-f40aad39322f-service-ca\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:40:26.034883 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:26.034853 2569 generic.go:358] "Generic (PLEG): container finished" podID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerID="b775b60152a520dcecfe48fb087e193f5d4e26b85042d6daafd420b029a30ffd" exitCode=0 Apr 23 13:40:26.035276 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:26.034938 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" event={"ID":"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00","Type":"ContainerDied","Data":"b775b60152a520dcecfe48fb087e193f5d4e26b85042d6daafd420b029a30ffd"} Apr 23 13:40:26.422176 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:26.422144 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04477dbe-0933-485b-8e16-f40aad39322f" path="/var/lib/kubelet/pods/04477dbe-0933-485b-8e16-f40aad39322f/volumes" Apr 23 13:40:40.093729 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:40.093694 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" event={"ID":"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00","Type":"ContainerStarted","Data":"7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d"} Apr 23 13:40:43.107510 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:43.107471 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" event={"ID":"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00","Type":"ContainerStarted","Data":"33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01"} Apr 23 13:40:45.117003 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:45.116962 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" event={"ID":"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00","Type":"ContainerStarted","Data":"3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657"} Apr 23 13:40:45.117405 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:45.117244 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 
23 13:40:45.117405 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:45.117381 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:45.118699 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:45.118671 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 23 13:40:45.139212 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:45.139152 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podStartSLOduration=1.368153141 podStartE2EDuration="27.139136706s" podCreationTimestamp="2026-04-23 13:40:18 +0000 UTC" firstStartedPulling="2026-04-23 13:40:18.875895822 +0000 UTC m=+473.030818268" lastFinishedPulling="2026-04-23 13:40:44.646879372 +0000 UTC m=+498.801801833" observedRunningTime="2026-04-23 13:40:45.137161833 +0000 UTC m=+499.292084303" watchObservedRunningTime="2026-04-23 13:40:45.139136706 +0000 UTC m=+499.294059185" Apr 23 13:40:46.120843 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:46.120813 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:46.121323 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:46.121025 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 23 13:40:46.121814 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:46.121792 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:40:47.124270 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:47.124226 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 23 13:40:47.124767 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:47.124660 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:40:47.128449 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:47.128421 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:40:48.128077 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:48.128021 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 23 13:40:48.128476 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:48.128344 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:40:58.128005 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:58.127953 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 23 13:40:58.128502 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:40:58.128460 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:41:08.128038 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:41:08.127982 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 23 13:41:08.128523 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:41:08.128461 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:41:18.128545 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:41:18.128454 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 23 13:41:18.128975 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:41:18.128951 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:41:28.128757 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:41:28.128699 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 23 13:41:28.129300 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:41:28.129247 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:41:38.128304 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:41:38.128253 2569 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 23 13:41:38.128795 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:41:38.128754 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:41:48.129275 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:41:48.129236 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:41:48.129854 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:41:48.129440 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:42:03.427731 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.427695 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8"] Apr 23 13:42:03.428236 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.428184 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" containerID="cri-o://7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d" gracePeriod=30 Apr 23 13:42:03.429456 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.428784 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kube-rbac-proxy" containerID="cri-o://33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01" gracePeriod=30 Apr 23 13:42:03.429456 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.428973 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" containerID="cri-o://3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657" gracePeriod=30 Apr 23 13:42:03.563407 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.563364 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r"] Apr 23 13:42:03.563709 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.563697 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04477dbe-0933-485b-8e16-f40aad39322f" containerName="console" Apr 23 13:42:03.563773 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.563712 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="04477dbe-0933-485b-8e16-f40aad39322f" containerName="console" Apr 23 13:42:03.563773 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.563771 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="04477dbe-0933-485b-8e16-f40aad39322f" containerName="console" Apr 23 13:42:03.567045 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.567024 
2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.569991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.569965 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config\"" Apr 23 13:42:03.570139 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.569970 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-b90a6-predictor-serving-cert\"" Apr 23 13:42:03.577769 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.577745 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r"] Apr 23 13:42:03.632099 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.632051 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98"] Apr 23 13:42:03.635760 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.635743 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:03.638501 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.638478 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config\"" Apr 23 13:42:03.638614 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.638514 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-b90a6-predictor-serving-cert\"" Apr 23 13:42:03.647259 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.647239 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98"] Apr 23 13:42:03.656849 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.656822 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.656979 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.656870 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-proxy-tls\") pod \"isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.656979 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.656947 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.657114 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.657021 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b4n8\" (UniqueName: \"kubernetes.io/projected/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-kube-api-access-5b4n8\") pod \"isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.757877 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.757831 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.757877 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.757875 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-proxy-tls\") pod \"isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.758207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.757894 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.758207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.757918 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9ba9358-4436-47f4-9437-5acc89fd41c9-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:03.758207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.757942 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9ba9358-4436-47f4-9437-5acc89fd41c9-isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:03.758207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.757972 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5b4n8\" (UniqueName: \"kubernetes.io/projected/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-kube-api-access-5b4n8\") pod \"isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r\" (UID: 
\"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.758207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.758008 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htk77\" (UniqueName: \"kubernetes.io/projected/a9ba9358-4436-47f4-9437-5acc89fd41c9-kube-api-access-htk77\") pod \"isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:03.758207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.758121 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9ba9358-4436-47f4-9437-5acc89fd41c9-proxy-tls\") pod \"isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:03.758483 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.758356 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.758678 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.758659 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.760635 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.760616 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-proxy-tls\") pod \"isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.776216 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.776186 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b4n8\" (UniqueName: \"kubernetes.io/projected/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-kube-api-access-5b4n8\") pod \"isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.858763 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.858677 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9ba9358-4436-47f4-9437-5acc89fd41c9-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:03.858763 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.858722 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9ba9358-4436-47f4-9437-5acc89fd41c9-isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:03.858763 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.858755 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htk77\" (UniqueName: \"kubernetes.io/projected/a9ba9358-4436-47f4-9437-5acc89fd41c9-kube-api-access-htk77\") pod \"isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:03.858999 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.858774 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9ba9358-4436-47f4-9437-5acc89fd41c9-proxy-tls\") pod \"isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:03.859171 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.859145 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9ba9358-4436-47f4-9437-5acc89fd41c9-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:03.859450 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.859432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9ba9358-4436-47f4-9437-5acc89fd41c9-isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:03.861572 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.861548 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9ba9358-4436-47f4-9437-5acc89fd41c9-proxy-tls\") pod \"isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:03.867763 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.867735 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htk77\" (UniqueName: \"kubernetes.io/projected/a9ba9358-4436-47f4-9437-5acc89fd41c9-kube-api-access-htk77\") pod \"isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 
13:42:03.879515 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.879491 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:03.948670 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:03.948641 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:04.014049 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:04.014003 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r"] Apr 23 13:42:04.019421 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:42:04.019390 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f8f8aad_20d9_48c2_ab5d_741e0d08fa69.slice/crio-e82b3626d04f1b631e3f6b19703eb7b51d1cebfe9cda32d8e1fe1df705d39a8a WatchSource:0}: Error finding container e82b3626d04f1b631e3f6b19703eb7b51d1cebfe9cda32d8e1fe1df705d39a8a: Status 404 returned error can't find the container with id e82b3626d04f1b631e3f6b19703eb7b51d1cebfe9cda32d8e1fe1df705d39a8a Apr 23 13:42:04.089647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:04.089586 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98"] Apr 23 13:42:04.092355 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:42:04.092323 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9ba9358_4436_47f4_9437_5acc89fd41c9.slice/crio-cd1a6ea879074c512079f05c70fff9951f9c7b205d5f502527db38daa7805716 WatchSource:0}: Error finding container cd1a6ea879074c512079f05c70fff9951f9c7b205d5f502527db38daa7805716: Status 404 returned error can't find the container with id cd1a6ea879074c512079f05c70fff9951f9c7b205d5f502527db38daa7805716 Apr 23 13:42:04.383106 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:04.382981 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" event={"ID":"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69","Type":"ContainerStarted","Data":"b58144dfc5133e4cb9cc0ebab5123caa04389d19926c1e34affd81439a08d52e"} Apr 23 13:42:04.383106 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:04.383027 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" event={"ID":"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69","Type":"ContainerStarted","Data":"e82b3626d04f1b631e3f6b19703eb7b51d1cebfe9cda32d8e1fe1df705d39a8a"} Apr 23 13:42:04.385458 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:04.385429 2569 generic.go:358] "Generic (PLEG): container finished" podID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerID="33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01" exitCode=2 Apr 23 13:42:04.385591 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:04.385497 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" event={"ID":"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00","Type":"ContainerDied","Data":"33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01"} Apr 23 13:42:04.387051 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:04.387014 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" event={"ID":"a9ba9358-4436-47f4-9437-5acc89fd41c9","Type":"ContainerStarted","Data":"508830a0b6a3c112d223bfb694984cccac275732fc1d5e609cc0f455284a522c"} Apr 23 13:42:04.387177 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:04.387048 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" event={"ID":"a9ba9358-4436-47f4-9437-5acc89fd41c9","Type":"ContainerStarted","Data":"cd1a6ea879074c512079f05c70fff9951f9c7b205d5f502527db38daa7805716"} Apr 23 13:42:07.124479 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:07.124431 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused" Apr 23 13:42:08.128252 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:08.128203 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 23 13:42:08.128617 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:08.128501 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:42:08.406012 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:08.405964 2569 generic.go:358] "Generic (PLEG): container finished" podID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerID="7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d" exitCode=0 Apr 23 13:42:08.406201 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:08.406046 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" event={"ID":"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00","Type":"ContainerDied","Data":"7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d"} Apr 23 13:42:08.407561 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:08.407538 2569 generic.go:358] "Generic (PLEG): container finished" podID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerID="508830a0b6a3c112d223bfb694984cccac275732fc1d5e609cc0f455284a522c" exitCode=0 Apr 23 13:42:08.407682 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:08.407611 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" event={"ID":"a9ba9358-4436-47f4-9437-5acc89fd41c9","Type":"ContainerDied","Data":"508830a0b6a3c112d223bfb694984cccac275732fc1d5e609cc0f455284a522c"} Apr 23 13:42:08.408950 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:08.408931 2569 generic.go:358] "Generic (PLEG): container finished" podID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerID="b58144dfc5133e4cb9cc0ebab5123caa04389d19926c1e34affd81439a08d52e" exitCode=0 Apr 23 13:42:08.409042 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:08.408992 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" event={"ID":"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69","Type":"ContainerDied","Data":"b58144dfc5133e4cb9cc0ebab5123caa04389d19926c1e34affd81439a08d52e"} Apr 23 13:42:09.416216 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:09.416179 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" event={"ID":"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69","Type":"ContainerStarted","Data":"06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a"} Apr 23 13:42:09.416216 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:09.416221 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" event={"ID":"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69","Type":"ContainerStarted","Data":"6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841"} Apr 23 13:42:09.416739 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:09.416400 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:09.433526 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:09.433466 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podStartSLOduration=6.433447581 podStartE2EDuration="6.433447581s" podCreationTimestamp="2026-04-23 13:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:42:09.433199431 +0000 UTC m=+583.588121919" watchObservedRunningTime="2026-04-23 13:42:09.433447581 +0000 UTC m=+583.588370049" Apr 23 13:42:10.425832 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:10.425784 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 23 13:42:10.427777 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:10.427744 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:11.426549 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:11.426501 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 23 13:42:12.125214 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:12.125168 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused" Apr 23 13:42:16.432422 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:16.432382 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:42:16.432980 ip-10-0-136-158 
kubenswrapper[2569]: I0423 13:42:16.432950 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 23 13:42:17.124726 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:17.124683 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused" Apr 23 13:42:17.124912 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:17.124812 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:42:18.128842 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:18.128769 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 23 13:42:18.129328 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:18.129199 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:42:22.124773 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:22.124721 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused" Apr 23 13:42:26.433669 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:26.433621 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 23 13:42:26.938309 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:26.938277 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/2.log" Apr 23 13:42:26.940072 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:26.939479 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/2.log" Apr 23 13:42:26.942393 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:26.942369 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log" Apr 23 13:42:26.943403 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:26.943386 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log" Apr 23 13:42:27.124860 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:27.124817 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused" Apr 23 13:42:27.487916 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:27.487881 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" event={"ID":"a9ba9358-4436-47f4-9437-5acc89fd41c9","Type":"ContainerStarted","Data":"d3f678b5b5ee81e9abe350465a434114fa4581ed0409ad8521dba29817cb32ac"} Apr 23 13:42:27.488428 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:27.487925 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" event={"ID":"a9ba9358-4436-47f4-9437-5acc89fd41c9","Type":"ContainerStarted","Data":"79b83fcfd71e081057f27c3fb99927c33cba1595fba96bbf8636034eee7d0c76"} Apr 23 13:42:27.488428 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:27.488174 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:27.506711 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:27.506653 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podStartSLOduration=5.883417807 podStartE2EDuration="24.506636904s" podCreationTimestamp="2026-04-23 13:42:03 +0000 UTC" firstStartedPulling="2026-04-23 13:42:08.409090848 +0000 UTC m=+582.564013293" lastFinishedPulling="2026-04-23 13:42:27.032309945 +0000 UTC m=+601.187232390" observedRunningTime="2026-04-23 13:42:27.505677154 +0000 UTC m=+601.660599634" watchObservedRunningTime="2026-04-23 13:42:27.506636904 +0000 UTC m=+601.661559373" Apr 23 13:42:28.128705 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:28.128654 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 23 13:42:28.128913 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:28.128820 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:42:28.129043 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:28.129004 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:42:28.129184 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:28.129169 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:42:28.491839 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:28.491745 2569 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:28.493030 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:28.493002 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 23 13:42:29.495022 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:29.494981 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 23 13:42:32.125388 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:32.125336 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused" Apr 23 13:42:34.098713 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.098688 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:42:34.132495 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.132462 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mthjj\" (UniqueName: \"kubernetes.io/projected/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-kube-api-access-mthjj\") pod \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " Apr 23 13:42:34.132676 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.132518 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config\") pod \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " Apr 23 13:42:34.132676 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.132557 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-kserve-provision-location\") pod \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " Apr 23 13:42:34.132676 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.132595 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-proxy-tls\") pod \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\" (UID: \"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00\") " Apr 23 13:42:34.132904 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.132881 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config") pod 
"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" (UID: "c5c6f975-4af9-4a6c-b5c6-abe6fc720e00"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:42:34.132904 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.132885 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" (UID: "c5c6f975-4af9-4a6c-b5c6-abe6fc720e00"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:42:34.134976 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.134950 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-kube-api-access-mthjj" (OuterVolumeSpecName: "kube-api-access-mthjj") pod "c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" (UID: "c5c6f975-4af9-4a6c-b5c6-abe6fc720e00"). InnerVolumeSpecName "kube-api-access-mthjj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:42:34.134976 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.134967 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" (UID: "c5c6f975-4af9-4a6c-b5c6-abe6fc720e00"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:42:34.233604 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.233562 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mthjj\" (UniqueName: \"kubernetes.io/projected/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-kube-api-access-mthjj\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:42:34.233604 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.233602 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-isvc-raw-sklearn-batcher-ee9be-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:42:34.233811 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.233617 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-kserve-provision-location\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:42:34.233811 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.233630 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:42:34.499274 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.499243 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:42:34.499743 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.499718 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 23 13:42:34.515045 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.515009 2569 generic.go:358] "Generic (PLEG): container finished" podID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerID="3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657" exitCode=0 Apr 23 13:42:34.515229 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.515096 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" event={"ID":"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00","Type":"ContainerDied","Data":"3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657"} Apr 23 13:42:34.515229 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.515135 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" event={"ID":"c5c6f975-4af9-4a6c-b5c6-abe6fc720e00","Type":"ContainerDied","Data":"6b63da37f2fe594de5e2e49674f527e1d66bd52d751d65d1e6c33c00147f92fa"} Apr 23 13:42:34.515229 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.515143 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8" Apr 23 13:42:34.515229 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.515152 2569 scope.go:117] "RemoveContainer" containerID="3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657" Apr 23 13:42:34.523915 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.523890 2569 scope.go:117] "RemoveContainer" containerID="33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01" Apr 23 13:42:34.531813 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.531793 2569 scope.go:117] "RemoveContainer" containerID="7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d" Apr 23 13:42:34.534873 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.534845 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8"] Apr 23 13:42:34.540079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.540035 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ee9be-predictor-7868f7cf68-29tt8"] Apr 23 13:42:34.540805 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.540789 2569 scope.go:117] "RemoveContainer" containerID="b775b60152a520dcecfe48fb087e193f5d4e26b85042d6daafd420b029a30ffd" Apr 23 13:42:34.548202 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.548180 2569 scope.go:117] "RemoveContainer" containerID="3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657" Apr 23 13:42:34.548488 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:42:34.548471 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657\": container with ID starting with 3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657 not found: ID does not exist" containerID="3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657" Apr 23 13:42:34.548543 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.548496 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657"} err="failed to get container status 
\"3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657\": rpc error: code = NotFound desc = could not find container \"3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657\": container with ID starting with 3283873568be3e89a62fe5fac54090d7a274ef0e23d41c124288170df891e657 not found: ID does not exist" Apr 23 13:42:34.548543 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.548514 2569 scope.go:117] "RemoveContainer" containerID="33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01" Apr 23 13:42:34.548745 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:42:34.548729 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01\": container with ID starting with 33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01 not found: ID does not exist" containerID="33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01" Apr 23 13:42:34.548791 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.548750 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01"} err="failed to get container status \"33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01\": rpc error: code = NotFound desc = could not find container \"33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01\": container with ID starting with 33b2ef4a1527a0af2c49e8d94bafaf897de768218588a170389862c7fe97ab01 not found: ID does not exist" Apr 23 13:42:34.548791 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.548765 2569 scope.go:117] "RemoveContainer" containerID="7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d" Apr 23 13:42:34.549000 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:42:34.548986 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d\": container with ID starting with 7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d not found: ID does not exist" containerID="7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d" Apr 23 13:42:34.549042 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.549004 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d"} err="failed to get container status \"7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d\": rpc error: code = NotFound desc = could not find container \"7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d\": container with ID starting with 7e58e0d67a20244cedb7eaf42599da79021910328113438f3dc6db199f52c18d not found: ID does not exist" Apr 23 13:42:34.549042 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.549016 2569 scope.go:117] "RemoveContainer" containerID="b775b60152a520dcecfe48fb087e193f5d4e26b85042d6daafd420b029a30ffd" Apr 23 13:42:34.549248 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:42:34.549224 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b775b60152a520dcecfe48fb087e193f5d4e26b85042d6daafd420b029a30ffd\": container with ID starting with b775b60152a520dcecfe48fb087e193f5d4e26b85042d6daafd420b029a30ffd not found: ID does not exist" 
containerID="b775b60152a520dcecfe48fb087e193f5d4e26b85042d6daafd420b029a30ffd" Apr 23 13:42:34.549291 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:34.549251 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b775b60152a520dcecfe48fb087e193f5d4e26b85042d6daafd420b029a30ffd"} err="failed to get container status \"b775b60152a520dcecfe48fb087e193f5d4e26b85042d6daafd420b029a30ffd\": rpc error: code = NotFound desc = could not find container \"b775b60152a520dcecfe48fb087e193f5d4e26b85042d6daafd420b029a30ffd\": container with ID starting with b775b60152a520dcecfe48fb087e193f5d4e26b85042d6daafd420b029a30ffd not found: ID does not exist" Apr 23 13:42:36.421071 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:36.421021 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" path="/var/lib/kubelet/pods/c5c6f975-4af9-4a6c-b5c6-abe6fc720e00/volumes" Apr 23 13:42:36.433252 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:36.433216 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 23 13:42:44.500533 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:44.500442 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 23 13:42:46.433148 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:46.433096 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 23 13:42:54.500078 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:54.500023 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 23 13:42:56.432939 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:42:56.432897 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 23 13:43:04.500657 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:04.500621 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 23 13:43:06.433404 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:06.433358 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 23 13:43:14.500428 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:14.500388 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 23 13:43:16.433199 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:16.433160 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 23 13:43:24.500585 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:24.500539 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 23 13:43:25.417134 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:25.417094 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" Apr 23 13:43:34.500357 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:34.500318 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" Apr 23 13:43:53.817479 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.817445 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r"] Apr 23 13:43:53.817997 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.817809 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container" containerID="cri-o://6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841" gracePeriod=30 Apr 23 13:43:53.817997 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.817809 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kube-rbac-proxy" containerID="cri-o://06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a" gracePeriod=30 Apr 23 13:43:53.898197 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.898160 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"] Apr 23 13:43:53.898703 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.898682 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="storage-initializer" Apr 23 13:43:53.898703 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.898703 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="storage-initializer" Apr 23 13:43:53.898835 ip-10-0-136-158 kubenswrapper[2569]: I0423 
13:43:53.898725 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" Apr 23 13:43:53.898835 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.898735 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" Apr 23 13:43:53.898835 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.898745 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" Apr 23 13:43:53.898835 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.898754 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" Apr 23 13:43:53.898835 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.898765 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kube-rbac-proxy" Apr 23 13:43:53.898835 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.898773 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kube-rbac-proxy" Apr 23 13:43:53.899009 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.898870 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kserve-container" Apr 23 13:43:53.899009 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.898886 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="agent" Apr 23 13:43:53.899009 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.898898 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5c6f975-4af9-4a6c-b5c6-abe6fc720e00" containerName="kube-rbac-proxy" Apr 23 13:43:53.902599 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.902572 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" Apr 23 13:43:53.904930 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.904909 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-4f8db-predictor-serving-cert\"" Apr 23 13:43:53.905045 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.904955 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\"" Apr 23 13:43:53.913025 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.912989 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"] Apr 23 13:43:53.957553 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.957516 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98"] Apr 23 13:43:53.957994 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.957942 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container" containerID="cri-o://79b83fcfd71e081057f27c3fb99927c33cba1595fba96bbf8636034eee7d0c76" gracePeriod=30 Apr 23 13:43:53.958161 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.957995 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kube-rbac-proxy" containerID="cri-o://d3f678b5b5ee81e9abe350465a434114fa4581ed0409ad8521dba29817cb32ac" gracePeriod=30 Apr 23 13:43:53.979545 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.979515 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"] Apr 23 13:43:53.982967 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.982938 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" Apr 23 13:43:53.985289 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.985266 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-4f8db-predictor-serving-cert\"" Apr 23 13:43:53.985411 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.985303 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\"" Apr 23 13:43:53.993287 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.993258 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8601b18e-0763-4e85-be49-f777e7b19cb1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" Apr 23 13:43:53.993407 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.993377 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7x55\" (UniqueName: \"kubernetes.io/projected/8601b18e-0763-4e85-be49-f777e7b19cb1-kube-api-access-p7x55\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" Apr 23 13:43:53.993524 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.993420 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8601b18e-0763-4e85-be49-f777e7b19cb1-isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" Apr 23 13:43:53.993524 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.993471 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8601b18e-0763-4e85-be49-f777e7b19cb1-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" Apr 23 13:43:53.995111 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:53.995087 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"] Apr 23 13:43:54.094916 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.094875 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b802602d-f877-40b1-9a89-e26ec27b66ca-isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" Apr 23 13:43:54.095123 ip-10-0-136-158 
kubenswrapper[2569]: I0423 13:43:54.094941 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7x55\" (UniqueName: \"kubernetes.io/projected/8601b18e-0763-4e85-be49-f777e7b19cb1-kube-api-access-p7x55\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" Apr 23 13:43:54.095123 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.094981 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8601b18e-0763-4e85-be49-f777e7b19cb1-isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" Apr 23 13:43:54.095123 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.095023 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8601b18e-0763-4e85-be49-f777e7b19cb1-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" Apr 23 13:43:54.095123 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.095073 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b802602d-f877-40b1-9a89-e26ec27b66ca-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" Apr 23 13:43:54.095352 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.095126 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b802602d-f877-40b1-9a89-e26ec27b66ca-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" Apr 23 13:43:54.095352 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.095151 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c2mr\" (UniqueName: \"kubernetes.io/projected/b802602d-f877-40b1-9a89-e26ec27b66ca-kube-api-access-6c2mr\") pod \"isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" Apr 23 13:43:54.095352 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.095205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8601b18e-0763-4e85-be49-f777e7b19cb1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" Apr 23 13:43:54.095352 ip-10-0-136-158 kubenswrapper[2569]: 
E0423 13:43:54.095241 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-serving-cert: secret "isvc-sklearn-graph-raw-hpa-4f8db-predictor-serving-cert" not found
Apr 23 13:43:54.095352 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:43:54.095315 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8601b18e-0763-4e85-be49-f777e7b19cb1-proxy-tls podName:8601b18e-0763-4e85-be49-f777e7b19cb1 nodeName:}" failed. No retries permitted until 2026-04-23 13:43:54.595293257 +0000 UTC m=+688.750215707 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8601b18e-0763-4e85-be49-f777e7b19cb1-proxy-tls") pod "isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" (UID: "8601b18e-0763-4e85-be49-f777e7b19cb1") : secret "isvc-sklearn-graph-raw-hpa-4f8db-predictor-serving-cert" not found
Apr 23 13:43:54.095634 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.095602 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8601b18e-0763-4e85-be49-f777e7b19cb1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"
Apr 23 13:43:54.096050 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.096027 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8601b18e-0763-4e85-be49-f777e7b19cb1-isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"
Apr 23 13:43:54.103591 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.103559 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7x55\" (UniqueName: \"kubernetes.io/projected/8601b18e-0763-4e85-be49-f777e7b19cb1-kube-api-access-p7x55\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"
Apr 23 13:43:54.196244 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.196202 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b802602d-f877-40b1-9a89-e26ec27b66ca-isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:43:54.196439 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.196273 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b802602d-f877-40b1-9a89-e26ec27b66ca-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:43:54.196439 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.196304 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b802602d-f877-40b1-9a89-e26ec27b66ca-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:43:54.196439 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.196321 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6c2mr\" (UniqueName: \"kubernetes.io/projected/b802602d-f877-40b1-9a89-e26ec27b66ca-kube-api-access-6c2mr\") pod \"isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:43:54.196764 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.196732 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b802602d-f877-40b1-9a89-e26ec27b66ca-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:43:54.196937 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.196913 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b802602d-f877-40b1-9a89-e26ec27b66ca-isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:43:54.198914 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.198890 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b802602d-f877-40b1-9a89-e26ec27b66ca-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:43:54.204471 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.204450 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c2mr\" (UniqueName: \"kubernetes.io/projected/b802602d-f877-40b1-9a89-e26ec27b66ca-kube-api-access-6c2mr\") pod \"isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:43:54.294959 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.294908 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:43:54.495463 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.495417 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.35:8643/healthz\": dial tcp 10.134.0.35:8643: connect: connection refused"
Apr 23 13:43:54.499744 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.499720 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 23 13:43:54.602726 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.602644 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8601b18e-0763-4e85-be49-f777e7b19cb1-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"
Apr 23 13:43:54.605397 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.605373 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8601b18e-0763-4e85-be49-f777e7b19cb1-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"
Apr 23 13:43:54.637249 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.637225 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"]
Apr 23 13:43:54.639378 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:43:54.639350 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb802602d_f877_40b1_9a89_e26ec27b66ca.slice/crio-31212cd4a84a46cacbe182dcf2eaf139b5969c289010bb98fc1028453c007378 WatchSource:0}: Error finding container 31212cd4a84a46cacbe182dcf2eaf139b5969c289010bb98fc1028453c007378: Status 404 returned error can't find the container with id 31212cd4a84a46cacbe182dcf2eaf139b5969c289010bb98fc1028453c007378
Apr 23 13:43:54.641173 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.641158 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:43:54.784711 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.784670 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" event={"ID":"b802602d-f877-40b1-9a89-e26ec27b66ca","Type":"ContainerStarted","Data":"fe8269697a9ec95d7fd967d046e911560227239dbd7d29fed571e19f8dd6529b"}
Apr 23 13:43:54.784711 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.784716 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" event={"ID":"b802602d-f877-40b1-9a89-e26ec27b66ca","Type":"ContainerStarted","Data":"31212cd4a84a46cacbe182dcf2eaf139b5969c289010bb98fc1028453c007378"}
Apr 23 13:43:54.786635 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.786608 2569 generic.go:358] "Generic (PLEG): container finished" podID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerID="06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a" exitCode=2
Apr 23 13:43:54.786751 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.786675 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" event={"ID":"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69","Type":"ContainerDied","Data":"06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a"}
Apr 23 13:43:54.788531 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.788506 2569 generic.go:358] "Generic (PLEG): container finished" podID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerID="d3f678b5b5ee81e9abe350465a434114fa4581ed0409ad8521dba29817cb32ac" exitCode=2
Apr 23 13:43:54.788609 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.788572 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" event={"ID":"a9ba9358-4436-47f4-9437-5acc89fd41c9","Type":"ContainerDied","Data":"d3f678b5b5ee81e9abe350465a434114fa4581ed0409ad8521dba29817cb32ac"}
Apr 23 13:43:54.815583 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.815547 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"
Apr 23 13:43:54.946435 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:54.946403 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"]
Apr 23 13:43:54.948286 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:43:54.948240 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8601b18e_0763_4e85_be49_f777e7b19cb1.slice/crio-705b5adf511b874bd8e4b40a7aa0654deec7264948978f3b794eec4b3a35d0a0 WatchSource:0}: Error finding container 705b5adf511b874bd8e4b40a7aa0654deec7264948978f3b794eec4b3a35d0a0: Status 404 returned error can't find the container with id 705b5adf511b874bd8e4b40a7aa0654deec7264948978f3b794eec4b3a35d0a0
Apr 23 13:43:55.416469 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:55.416425 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 23 13:43:55.794075 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:55.794035 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" event={"ID":"8601b18e-0763-4e85-be49-f777e7b19cb1","Type":"ContainerStarted","Data":"93ea627f0e9ed403bbab7978058e59a55d10b6ef83cb624f173839aa6d66a029"}
Apr 23 13:43:55.794281 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:55.794090 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" event={"ID":"8601b18e-0763-4e85-be49-f777e7b19cb1","Type":"ContainerStarted","Data":"705b5adf511b874bd8e4b40a7aa0654deec7264948978f3b794eec4b3a35d0a0"}
Apr 23 13:43:56.427646 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:56.427603 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused"
Apr 23 13:43:57.805797 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.805756 2569 generic.go:358] "Generic (PLEG): container finished" podID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerID="79b83fcfd71e081057f27c3fb99927c33cba1595fba96bbf8636034eee7d0c76" exitCode=0
Apr 23 13:43:57.806246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.805834 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" event={"ID":"a9ba9358-4436-47f4-9437-5acc89fd41c9","Type":"ContainerDied","Data":"79b83fcfd71e081057f27c3fb99927c33cba1595fba96bbf8636034eee7d0c76"}
Apr 23 13:43:57.806246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.805884 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98" event={"ID":"a9ba9358-4436-47f4-9437-5acc89fd41c9","Type":"ContainerDied","Data":"cd1a6ea879074c512079f05c70fff9951f9c7b205d5f502527db38daa7805716"}
Apr 23 13:43:57.806246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.805898 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd1a6ea879074c512079f05c70fff9951f9c7b205d5f502527db38daa7805716"
Apr 23 13:43:57.817572 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.817547 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98"
Apr 23 13:43:57.929883 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.929785 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9ba9358-4436-47f4-9437-5acc89fd41c9-proxy-tls\") pod \"a9ba9358-4436-47f4-9437-5acc89fd41c9\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") "
Apr 23 13:43:57.929883 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.929841 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htk77\" (UniqueName: \"kubernetes.io/projected/a9ba9358-4436-47f4-9437-5acc89fd41c9-kube-api-access-htk77\") pod \"a9ba9358-4436-47f4-9437-5acc89fd41c9\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") "
Apr 23 13:43:57.930155 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.929909 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9ba9358-4436-47f4-9437-5acc89fd41c9-kserve-provision-location\") pod \"a9ba9358-4436-47f4-9437-5acc89fd41c9\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") "
Apr 23 13:43:57.930155 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.929942 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9ba9358-4436-47f4-9437-5acc89fd41c9-isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config\") pod \"a9ba9358-4436-47f4-9437-5acc89fd41c9\" (UID: \"a9ba9358-4436-47f4-9437-5acc89fd41c9\") "
Apr 23 13:43:57.930309 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.930286 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9ba9358-4436-47f4-9437-5acc89fd41c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a9ba9358-4436-47f4-9437-5acc89fd41c9" (UID: "a9ba9358-4436-47f4-9437-5acc89fd41c9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:43:57.930367 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.930345 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9ba9358-4436-47f4-9437-5acc89fd41c9-isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config") pod "a9ba9358-4436-47f4-9437-5acc89fd41c9" (UID: "a9ba9358-4436-47f4-9437-5acc89fd41c9"). InnerVolumeSpecName "isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:43:57.932235 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.932204 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ba9358-4436-47f4-9437-5acc89fd41c9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a9ba9358-4436-47f4-9437-5acc89fd41c9" (UID: "a9ba9358-4436-47f4-9437-5acc89fd41c9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:43:57.932328 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:57.932251 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ba9358-4436-47f4-9437-5acc89fd41c9-kube-api-access-htk77" (OuterVolumeSpecName: "kube-api-access-htk77") pod "a9ba9358-4436-47f4-9437-5acc89fd41c9" (UID: "a9ba9358-4436-47f4-9437-5acc89fd41c9"). InnerVolumeSpecName "kube-api-access-htk77". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:43:58.031316 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.031257 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9ba9358-4436-47f4-9437-5acc89fd41c9-kserve-provision-location\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:43:58.031316 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.031311 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9ba9358-4436-47f4-9437-5acc89fd41c9-isvc-xgboost-graph-raw-b90a6-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:43:58.031316 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.031325 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9ba9358-4436-47f4-9437-5acc89fd41c9-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:43:58.031551 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.031339 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-htk77\" (UniqueName: \"kubernetes.io/projected/a9ba9358-4436-47f4-9437-5acc89fd41c9-kube-api-access-htk77\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:43:58.356086 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.356038 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r"
Apr 23 13:43:58.434414 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.434372 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-proxy-tls\") pod \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") "
Apr 23 13:43:58.434592 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.434462 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-kserve-provision-location\") pod \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") "
Apr 23 13:43:58.434592 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.434516 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b4n8\" (UniqueName: \"kubernetes.io/projected/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-kube-api-access-5b4n8\") pod \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") "
Apr 23 13:43:58.434592 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.434556 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config\") pod \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\" (UID: \"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69\") "
Apr 23 13:43:58.434933 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.434912 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" (UID: "4f8f8aad-20d9-48c2-ab5d-741e0d08fa69"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:43:58.435087 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.435015 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config") pod "4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" (UID: "4f8f8aad-20d9-48c2-ab5d-741e0d08fa69"). InnerVolumeSpecName "isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:43:58.437402 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.437338 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" (UID: "4f8f8aad-20d9-48c2-ab5d-741e0d08fa69"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:43:58.437402 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.437359 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-kube-api-access-5b4n8" (OuterVolumeSpecName: "kube-api-access-5b4n8") pod "4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" (UID: "4f8f8aad-20d9-48c2-ab5d-741e0d08fa69"). InnerVolumeSpecName "kube-api-access-5b4n8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:43:58.535498 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.535454 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5b4n8\" (UniqueName: \"kubernetes.io/projected/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-kube-api-access-5b4n8\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:43:58.535498 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.535495 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-isvc-sklearn-graph-raw-b90a6-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:43:58.535498 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.535507 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:43:58.535739 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.535518 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69-kserve-provision-location\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:43:58.811002 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.810956 2569 generic.go:358] "Generic (PLEG): container finished" podID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerID="6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841" exitCode=0
Apr 23 13:43:58.811442 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.811049 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" event={"ID":"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69","Type":"ContainerDied","Data":"6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841"}
Apr 23 13:43:58.811442 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.811105 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r" event={"ID":"4f8f8aad-20d9-48c2-ab5d-741e0d08fa69","Type":"ContainerDied","Data":"e82b3626d04f1b631e3f6b19703eb7b51d1cebfe9cda32d8e1fe1df705d39a8a"}
Apr 23 13:43:58.811442 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.811126 2569 scope.go:117] "RemoveContainer" containerID="06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a"
Apr 23 13:43:58.811442 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.811158 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r"
Apr 23 13:43:58.812548 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.812522 2569 generic.go:358] "Generic (PLEG): container finished" podID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerID="93ea627f0e9ed403bbab7978058e59a55d10b6ef83cb624f173839aa6d66a029" exitCode=0
Apr 23 13:43:58.812732 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.812600 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" event={"ID":"8601b18e-0763-4e85-be49-f777e7b19cb1","Type":"ContainerDied","Data":"93ea627f0e9ed403bbab7978058e59a55d10b6ef83cb624f173839aa6d66a029"}
Apr 23 13:43:58.814205 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.814182 2569 generic.go:358] "Generic (PLEG): container finished" podID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerID="fe8269697a9ec95d7fd967d046e911560227239dbd7d29fed571e19f8dd6529b" exitCode=0
Apr 23 13:43:58.814319 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.814202 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" event={"ID":"b802602d-f877-40b1-9a89-e26ec27b66ca","Type":"ContainerDied","Data":"fe8269697a9ec95d7fd967d046e911560227239dbd7d29fed571e19f8dd6529b"}
Apr 23 13:43:58.814383 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.814347 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98"
Apr 23 13:43:58.821672 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.821655 2569 scope.go:117] "RemoveContainer" containerID="6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841"
Apr 23 13:43:58.833780 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.833752 2569 scope.go:117] "RemoveContainer" containerID="b58144dfc5133e4cb9cc0ebab5123caa04389d19926c1e34affd81439a08d52e"
Apr 23 13:43:58.851007 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.850982 2569 scope.go:117] "RemoveContainer" containerID="06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a"
Apr 23 13:43:58.851480 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:43:58.851459 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a\": container with ID starting with 06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a not found: ID does not exist" containerID="06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a"
Apr 23 13:43:58.851578 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.851492 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a"} err="failed to get container status \"06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a\": rpc error: code = NotFound desc = could not find container \"06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a\": container with ID starting with 06b7ec3f4b3d703740f31c2b3b996809d73d6ea14e0e906263c2b72fc3a6ba0a not found: ID does not exist"
Apr 23 13:43:58.851578 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.851518 2569 scope.go:117] "RemoveContainer" containerID="6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841"
Apr 23 13:43:58.851859 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:43:58.851841 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841\": container with ID starting with 6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841 not found: ID does not exist" containerID="6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841"
Apr 23 13:43:58.851921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.851865 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841"} err="failed to get container status \"6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841\": rpc error: code = NotFound desc = could not find container \"6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841\": container with ID starting with 6249095f70af6d3a8139eb4ab28f954a8c3c0e259e57ed5d9d8edcd4453c8841 not found: ID does not exist"
Apr 23 13:43:58.851921 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.851882 2569 scope.go:117] "RemoveContainer" containerID="b58144dfc5133e4cb9cc0ebab5123caa04389d19926c1e34affd81439a08d52e"
Apr 23 13:43:58.852171 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:43:58.852147 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b58144dfc5133e4cb9cc0ebab5123caa04389d19926c1e34affd81439a08d52e\": container with ID starting with b58144dfc5133e4cb9cc0ebab5123caa04389d19926c1e34affd81439a08d52e not found: ID does not exist" containerID="b58144dfc5133e4cb9cc0ebab5123caa04389d19926c1e34affd81439a08d52e"
Apr 23 13:43:58.852250 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.852181 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b58144dfc5133e4cb9cc0ebab5123caa04389d19926c1e34affd81439a08d52e"} err="failed to get container status \"b58144dfc5133e4cb9cc0ebab5123caa04389d19926c1e34affd81439a08d52e\": rpc error: code = NotFound desc = could not find container \"b58144dfc5133e4cb9cc0ebab5123caa04389d19926c1e34affd81439a08d52e\": container with ID starting with b58144dfc5133e4cb9cc0ebab5123caa04389d19926c1e34affd81439a08d52e not found: ID does not exist"
Apr 23 13:43:58.867680 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.867649 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98"]
Apr 23 13:43:58.871831 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.871804 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-b90a6-predictor-6bbb968789-8tb98"]
Apr 23 13:43:58.881718 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.881695 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r"]
Apr 23 13:43:58.890287 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:58.890262 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-b90a6-predictor-67c897bfbd-ffg7r"]
Apr 23 13:43:59.818979 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:59.818940 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" event={"ID":"8601b18e-0763-4e85-be49-f777e7b19cb1","Type":"ContainerStarted","Data":"71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb"}
Apr 23 13:43:59.818979 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:59.818985 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" event={"ID":"8601b18e-0763-4e85-be49-f777e7b19cb1","Type":"ContainerStarted","Data":"0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6"}
Apr 23 13:43:59.819557 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:59.819206 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"
Apr 23 13:43:59.820816 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:59.820796 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" event={"ID":"b802602d-f877-40b1-9a89-e26ec27b66ca","Type":"ContainerStarted","Data":"cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7"}
Apr 23 13:43:59.820925 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:59.820822 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" event={"ID":"b802602d-f877-40b1-9a89-e26ec27b66ca","Type":"ContainerStarted","Data":"4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070"}
Apr 23 13:43:59.821095 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:59.821078 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:43:59.839492 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:59.839444 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podStartSLOduration=6.839431069 podStartE2EDuration="6.839431069s" podCreationTimestamp="2026-04-23 13:43:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:43:59.837435368 +0000 UTC m=+693.992357836" watchObservedRunningTime="2026-04-23 13:43:59.839431069 +0000 UTC m=+693.994353536"
Apr 23 13:43:59.856653 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:43:59.856599 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podStartSLOduration=6.856584733 podStartE2EDuration="6.856584733s" podCreationTimestamp="2026-04-23 13:43:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:43:59.855937654 +0000 UTC m=+694.010860124" watchObservedRunningTime="2026-04-23 13:43:59.856584733 +0000 UTC m=+694.011507200"
Apr 23 13:44:00.424899 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:00.424865 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" path="/var/lib/kubelet/pods/4f8f8aad-20d9-48c2-ab5d-741e0d08fa69/volumes"
Apr 23 13:44:00.425402 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:00.425385 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" path="/var/lib/kubelet/pods/a9ba9358-4436-47f4-9437-5acc89fd41c9/volumes"
Apr 23 13:44:00.826411 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:00.826370 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"
Apr 23 13:44:00.826411 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:00.826412 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:44:00.827351 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:00.827320 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 23 13:44:00.827438 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:00.827320 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 23 13:44:01.829472 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:01.829430 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 23 13:44:01.829854 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:01.829430 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 23 13:44:06.833656 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:06.833628 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"
Apr 23 13:44:06.834083 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:06.833699 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:44:06.834360 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:06.834334 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 23 13:44:06.834472 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:06.834441 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 23 13:44:16.834313 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:16.834212 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 23 13:44:16.834862 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:16.834319 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 23 13:44:26.834604 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:26.834563 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 23 13:44:26.835048 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:26.834563 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 23 13:44:36.834241 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:36.834196 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 23 13:44:36.834241 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:36.834223 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 23 13:44:46.835204 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:46.835162 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 23 13:44:46.835578 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:46.835162 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 23 13:44:56.834589 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:56.834544 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 23 13:44:56.835104 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:44:56.834544 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 23 13:45:06.834996 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:06.834966 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"
Apr 23 13:45:06.835441 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:06.835209 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:45:34.234387 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.234355 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"]
Apr 23 13:45:34.234888 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.234665 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" containerID="cri-o://0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6" gracePeriod=30
Apr 23 13:45:34.234888 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.234716 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kube-rbac-proxy" containerID="cri-o://71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb" gracePeriod=30
Apr 23 13:45:34.316215 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316181 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"]
Apr 23 13:45:34.316585 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316567 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kube-rbac-proxy"
Apr 23 13:45:34.316667 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316588 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kube-rbac-proxy"
Apr 23 13:45:34.316667 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316613 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kube-rbac-proxy"
Apr 23 13:45:34.316667 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316622 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kube-rbac-proxy"
Apr 23 13:45:34.316667 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316648 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="storage-initializer"
Apr 23 13:45:34.316667 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316658 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="storage-initializer"
Apr 23 13:45:34.316907 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316670 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container"
Apr 23 13:45:34.316907 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316678 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container"
Apr 23 13:45:34.316907 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316688 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container"
Apr 23 13:45:34.316907 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316697 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container"
Apr 23 13:45:34.316907 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316709 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="storage-initializer"
Apr 23 13:45:34.316907 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316717 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="storage-initializer"
Apr 23 13:45:34.316907 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316802 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kube-rbac-proxy"
Apr 23 13:45:34.316907 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316815 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9ba9358-4436-47f4-9437-5acc89fd41c9" containerName="kserve-container"
Apr 23 13:45:34.316907 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316827 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kserve-container"
Apr 23 13:45:34.316907 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.316838 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f8f8aad-20d9-48c2-ab5d-741e0d08fa69" containerName="kube-rbac-proxy"
Apr 23 13:45:34.320807 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.320784 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:34.324660 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.324431 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-35dde-predictor-serving-cert\""
Apr 23 13:45:34.325038 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.325013 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-35dde-kube-rbac-proxy-sar-config\""
Apr 23 13:45:34.335312 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.335287 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"]
Apr 23 13:45:34.346238 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.346205 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-35dde-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/00dc171a-3be5-4310-91e3-38e0a4724108-message-dumper-raw-35dde-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-35dde-predictor-5776cbf779-g8nmd\" (UID: \"00dc171a-3be5-4310-91e3-38e0a4724108\") " pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:34.346238 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.346241 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlr4w\" (UniqueName: \"kubernetes.io/projected/00dc171a-3be5-4310-91e3-38e0a4724108-kube-api-access-nlr4w\") pod \"message-dumper-raw-35dde-predictor-5776cbf779-g8nmd\" (UID: \"00dc171a-3be5-4310-91e3-38e0a4724108\") " pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:34.346473 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.346341 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00dc171a-3be5-4310-91e3-38e0a4724108-proxy-tls\") pod \"message-dumper-raw-35dde-predictor-5776cbf779-g8nmd\" (UID: \"00dc171a-3be5-4310-91e3-38e0a4724108\") " pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:34.376172 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.376124 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"]
Apr 23 13:45:34.376643 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.376541 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" containerID="cri-o://4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070" gracePeriod=30
Apr 23 13:45:34.376779 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.376741 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kube-rbac-proxy" containerID="cri-o://cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7" gracePeriod=30
Apr 23 13:45:34.447138 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.447054 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-35dde-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/00dc171a-3be5-4310-91e3-38e0a4724108-message-dumper-raw-35dde-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-35dde-predictor-5776cbf779-g8nmd\" (UID: \"00dc171a-3be5-4310-91e3-38e0a4724108\") " pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:34.447138 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.447145 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlr4w\" (UniqueName: \"kubernetes.io/projected/00dc171a-3be5-4310-91e3-38e0a4724108-kube-api-access-nlr4w\") pod \"message-dumper-raw-35dde-predictor-5776cbf779-g8nmd\" (UID: \"00dc171a-3be5-4310-91e3-38e0a4724108\") " pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:34.447394 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.447190 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00dc171a-3be5-4310-91e3-38e0a4724108-proxy-tls\") pod \"message-dumper-raw-35dde-predictor-5776cbf779-g8nmd\" (UID: \"00dc171a-3be5-4310-91e3-38e0a4724108\") " pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:34.447782 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.447762 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-35dde-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/00dc171a-3be5-4310-91e3-38e0a4724108-message-dumper-raw-35dde-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-35dde-predictor-5776cbf779-g8nmd\" (UID: \"00dc171a-3be5-4310-91e3-38e0a4724108\") " pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:34.449760 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.449743 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00dc171a-3be5-4310-91e3-38e0a4724108-proxy-tls\") pod \"message-dumper-raw-35dde-predictor-5776cbf779-g8nmd\" (UID: \"00dc171a-3be5-4310-91e3-38e0a4724108\") " pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:34.457345 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.457317 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlr4w\" (UniqueName: \"kubernetes.io/projected/00dc171a-3be5-4310-91e3-38e0a4724108-kube-api-access-nlr4w\") pod \"message-dumper-raw-35dde-predictor-5776cbf779-g8nmd\" (UID: \"00dc171a-3be5-4310-91e3-38e0a4724108\") " pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:34.636925 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.636885 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:34.765337 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:34.765313 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"]
Apr 23 13:45:34.766828 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:45:34.766806 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00dc171a_3be5_4310_91e3_38e0a4724108.slice/crio-42ef45a5fe4a42dcb9a836ae8ec05d171e728c9a94150471becfc45a160a2c29 WatchSource:0}: Error finding container 42ef45a5fe4a42dcb9a836ae8ec05d171e728c9a94150471becfc45a160a2c29: Status 404 returned error can't find the container with id 42ef45a5fe4a42dcb9a836ae8ec05d171e728c9a94150471becfc45a160a2c29
Apr 23 13:45:35.152550 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:35.152452 2569 generic.go:358] "Generic (PLEG): container finished" podID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerID="cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7" exitCode=2
Apr 23 13:45:35.152550 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:35.152525 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" event={"ID":"b802602d-f877-40b1-9a89-e26ec27b66ca","Type":"ContainerDied","Data":"cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7"}
Apr 23 13:45:35.153551 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:35.153528 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd" event={"ID":"00dc171a-3be5-4310-91e3-38e0a4724108","Type":"ContainerStarted","Data":"42ef45a5fe4a42dcb9a836ae8ec05d171e728c9a94150471becfc45a160a2c29"}
Apr 23 13:45:35.155177 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:35.155156 2569 generic.go:358] "Generic (PLEG): container finished" podID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerID="71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb" exitCode=2
Apr 23 13:45:35.155249 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:35.155207 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" event={"ID":"8601b18e-0763-4e85-be49-f777e7b19cb1","Type":"ContainerDied","Data":"71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb"}
Apr 23 13:45:36.159911 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:36.159818 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd" event={"ID":"00dc171a-3be5-4310-91e3-38e0a4724108","Type":"ContainerStarted","Data":"f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1"}
Apr 23 13:45:36.159911 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:36.159860 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd" event={"ID":"00dc171a-3be5-4310-91e3-38e0a4724108","Type":"ContainerStarted","Data":"ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa"}
Apr 23 13:45:36.160403 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:36.160087 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:36.160403 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:36.160119 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:36.161989 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:36.161968 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"
Apr 23 13:45:36.183668 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:36.183616 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd" podStartSLOduration=1.142649056 podStartE2EDuration="2.18359842s" podCreationTimestamp="2026-04-23 13:45:34 +0000 UTC" firstStartedPulling="2026-04-23 13:45:34.768564615 +0000 UTC m=+788.923487061" lastFinishedPulling="2026-04-23 13:45:35.809513975 +0000 UTC m=+789.964436425" observedRunningTime="2026-04-23 13:45:36.18264001 +0000 UTC m=+790.337562472" watchObservedRunningTime="2026-04-23 13:45:36.18359842 +0000 UTC m=+790.338520888"
Apr 23 13:45:36.830164 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:36.830114 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused"
Apr 23 13:45:36.830343 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:36.830114 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.36:8643/healthz\": dial tcp 10.134.0.36:8643: connect: connection refused"
Apr 23 13:45:36.834854 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:36.834813 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 23 13:45:36.834854 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:36.834822 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 23 13:45:38.119553 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.119529 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:45:38.168361 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.168321 2569 generic.go:358] "Generic (PLEG): container finished" podID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerID="4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070" exitCode=0
Apr 23 13:45:38.168533 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.168401 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"
Apr 23 13:45:38.168533 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.168400 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" event={"ID":"b802602d-f877-40b1-9a89-e26ec27b66ca","Type":"ContainerDied","Data":"4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070"}
Apr 23 13:45:38.168533 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.168443 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6" event={"ID":"b802602d-f877-40b1-9a89-e26ec27b66ca","Type":"ContainerDied","Data":"31212cd4a84a46cacbe182dcf2eaf139b5969c289010bb98fc1028453c007378"}
Apr 23 13:45:38.168533 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.168465 2569 scope.go:117] "RemoveContainer" containerID="cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7"
Apr 23 13:45:38.174225 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.174205 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c2mr\" (UniqueName: \"kubernetes.io/projected/b802602d-f877-40b1-9a89-e26ec27b66ca-kube-api-access-6c2mr\") pod \"b802602d-f877-40b1-9a89-e26ec27b66ca\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") "
Apr 23 13:45:38.176509 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.176478 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b802602d-f877-40b1-9a89-e26ec27b66ca-kube-api-access-6c2mr" (OuterVolumeSpecName: "kube-api-access-6c2mr") pod "b802602d-f877-40b1-9a89-e26ec27b66ca" (UID: "b802602d-f877-40b1-9a89-e26ec27b66ca"). InnerVolumeSpecName "kube-api-access-6c2mr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:45:38.177547 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.177526 2569 scope.go:117] "RemoveContainer" containerID="4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070"
Apr 23 13:45:38.187619 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.187603 2569 scope.go:117] "RemoveContainer" containerID="fe8269697a9ec95d7fd967d046e911560227239dbd7d29fed571e19f8dd6529b"
Apr 23 13:45:38.194694 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.194675 2569 scope.go:117] "RemoveContainer" containerID="cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7"
Apr 23 13:45:38.194952 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:45:38.194932 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7\": container with ID starting with cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7 not found: ID does not exist" containerID="cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7"
Apr 23 13:45:38.194996 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.194962 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7"} err="failed to get container status \"cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7\": rpc error: code = NotFound desc = could not find container \"cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7\": container with ID starting with cd65c72ac0b371c7095635c2937a58728ebe74d196f95b7affade7d76e0e33c7 not found: ID does not exist"
Apr 23 13:45:38.194996 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.194981 2569 scope.go:117] "RemoveContainer" containerID="4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070"
Apr 23 13:45:38.195210 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:45:38.195192 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070\": container with ID starting with 4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070 not found: ID does not exist" containerID="4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070"
Apr 23 13:45:38.195256 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.195217 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070"} err="failed to get container status \"4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070\": rpc error: code = NotFound desc = could not find container \"4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070\": container with ID starting with 4af33a8bbb2aed69a6a97e66a6f112fe4b35f8f94ec5d94cc745b65e1b19f070 not found: ID does not exist"
Apr 23 13:45:38.195256 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.195234 2569 scope.go:117] "RemoveContainer" containerID="fe8269697a9ec95d7fd967d046e911560227239dbd7d29fed571e19f8dd6529b"
Apr 23 13:45:38.195449 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:45:38.195428 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe8269697a9ec95d7fd967d046e911560227239dbd7d29fed571e19f8dd6529b\": container with ID starting with
fe8269697a9ec95d7fd967d046e911560227239dbd7d29fed571e19f8dd6529b not found: ID does not exist" containerID="fe8269697a9ec95d7fd967d046e911560227239dbd7d29fed571e19f8dd6529b" Apr 23 13:45:38.195494 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.195452 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe8269697a9ec95d7fd967d046e911560227239dbd7d29fed571e19f8dd6529b"} err="failed to get container status \"fe8269697a9ec95d7fd967d046e911560227239dbd7d29fed571e19f8dd6529b\": rpc error: code = NotFound desc = could not find container \"fe8269697a9ec95d7fd967d046e911560227239dbd7d29fed571e19f8dd6529b\": container with ID starting with fe8269697a9ec95d7fd967d046e911560227239dbd7d29fed571e19f8dd6529b not found: ID does not exist" Apr 23 13:45:38.274855 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.274819 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b802602d-f877-40b1-9a89-e26ec27b66ca-isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\") pod \"b802602d-f877-40b1-9a89-e26ec27b66ca\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " Apr 23 13:45:38.275018 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.274867 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b802602d-f877-40b1-9a89-e26ec27b66ca-proxy-tls\") pod \"b802602d-f877-40b1-9a89-e26ec27b66ca\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " Apr 23 13:45:38.275018 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.274903 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b802602d-f877-40b1-9a89-e26ec27b66ca-kserve-provision-location\") pod \"b802602d-f877-40b1-9a89-e26ec27b66ca\" (UID: \"b802602d-f877-40b1-9a89-e26ec27b66ca\") " Apr 23 13:45:38.275150 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.275108 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6c2mr\" (UniqueName: \"kubernetes.io/projected/b802602d-f877-40b1-9a89-e26ec27b66ca-kube-api-access-6c2mr\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:45:38.275337 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.275306 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b802602d-f877-40b1-9a89-e26ec27b66ca-isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config") pod "b802602d-f877-40b1-9a89-e26ec27b66ca" (UID: "b802602d-f877-40b1-9a89-e26ec27b66ca"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:45:38.275404 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.275317 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b802602d-f877-40b1-9a89-e26ec27b66ca-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b802602d-f877-40b1-9a89-e26ec27b66ca" (UID: "b802602d-f877-40b1-9a89-e26ec27b66ca"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:45:38.277166 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.277143 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b802602d-f877-40b1-9a89-e26ec27b66ca-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b802602d-f877-40b1-9a89-e26ec27b66ca" (UID: "b802602d-f877-40b1-9a89-e26ec27b66ca"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:45:38.375780 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.375740 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b802602d-f877-40b1-9a89-e26ec27b66ca-isvc-xgboost-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:45:38.375780 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.375773 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b802602d-f877-40b1-9a89-e26ec27b66ca-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:45:38.375780 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.375783 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b802602d-f877-40b1-9a89-e26ec27b66ca-kserve-provision-location\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:45:38.488896 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.488847 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"] Apr 23 13:45:38.492795 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.492762 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4f8db-predictor-bdfb6b865-b6wz6"] Apr 23 13:45:38.601148 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:45:38.601103 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8601b18e_0763_4e85_be49_f777e7b19cb1.slice/crio-0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8601b18e_0763_4e85_be49_f777e7b19cb1.slice/crio-conmon-0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6.scope\": RecentStats: unable to find data in memory cache]" Apr 23 13:45:38.773828 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.773806 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" Apr 23 13:45:38.777619 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.777595 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8601b18e-0763-4e85-be49-f777e7b19cb1-isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\") pod \"8601b18e-0763-4e85-be49-f777e7b19cb1\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " Apr 23 13:45:38.777752 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.777649 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8601b18e-0763-4e85-be49-f777e7b19cb1-proxy-tls\") pod \"8601b18e-0763-4e85-be49-f777e7b19cb1\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " Apr 23 13:45:38.777752 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.777681 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8601b18e-0763-4e85-be49-f777e7b19cb1-kserve-provision-location\") pod \"8601b18e-0763-4e85-be49-f777e7b19cb1\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " Apr 23 13:45:38.777752 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.777716 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7x55\" (UniqueName: \"kubernetes.io/projected/8601b18e-0763-4e85-be49-f777e7b19cb1-kube-api-access-p7x55\") pod \"8601b18e-0763-4e85-be49-f777e7b19cb1\" (UID: \"8601b18e-0763-4e85-be49-f777e7b19cb1\") " Apr 23 13:45:38.778016 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.777980 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8601b18e-0763-4e85-be49-f777e7b19cb1-isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config") pod "8601b18e-0763-4e85-be49-f777e7b19cb1" (UID: "8601b18e-0763-4e85-be49-f777e7b19cb1"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:45:38.778168 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.778017 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8601b18e-0763-4e85-be49-f777e7b19cb1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8601b18e-0763-4e85-be49-f777e7b19cb1" (UID: "8601b18e-0763-4e85-be49-f777e7b19cb1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:45:38.779999 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.779976 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8601b18e-0763-4e85-be49-f777e7b19cb1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8601b18e-0763-4e85-be49-f777e7b19cb1" (UID: "8601b18e-0763-4e85-be49-f777e7b19cb1"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:45:38.780096 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.779976 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8601b18e-0763-4e85-be49-f777e7b19cb1-kube-api-access-p7x55" (OuterVolumeSpecName: "kube-api-access-p7x55") pod "8601b18e-0763-4e85-be49-f777e7b19cb1" (UID: "8601b18e-0763-4e85-be49-f777e7b19cb1"). InnerVolumeSpecName "kube-api-access-p7x55". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:45:38.878573 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.878482 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8601b18e-0763-4e85-be49-f777e7b19cb1-isvc-sklearn-graph-raw-hpa-4f8db-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:45:38.878573 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.878515 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8601b18e-0763-4e85-be49-f777e7b19cb1-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:45:38.878573 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.878527 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8601b18e-0763-4e85-be49-f777e7b19cb1-kserve-provision-location\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:45:38.878573 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:38.878536 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p7x55\" (UniqueName: \"kubernetes.io/projected/8601b18e-0763-4e85-be49-f777e7b19cb1-kube-api-access-p7x55\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:45:39.174537 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.174444 2569 generic.go:358] "Generic (PLEG): container finished" podID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerID="0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6" exitCode=0 Apr 23 13:45:39.174537 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.174525 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" Apr 23 13:45:39.175019 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.174530 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" event={"ID":"8601b18e-0763-4e85-be49-f777e7b19cb1","Type":"ContainerDied","Data":"0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6"} Apr 23 13:45:39.175019 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.174574 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm" event={"ID":"8601b18e-0763-4e85-be49-f777e7b19cb1","Type":"ContainerDied","Data":"705b5adf511b874bd8e4b40a7aa0654deec7264948978f3b794eec4b3a35d0a0"} Apr 23 13:45:39.175019 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.174595 2569 scope.go:117] "RemoveContainer" containerID="71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb" Apr 23 13:45:39.183120 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.183101 2569 scope.go:117] "RemoveContainer" containerID="0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6" Apr 23 13:45:39.190885 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.190864 2569 scope.go:117] "RemoveContainer" containerID="93ea627f0e9ed403bbab7978058e59a55d10b6ef83cb624f173839aa6d66a029" Apr 23 13:45:39.197269 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.197241 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"] Apr 23 13:45:39.200126 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.200105 2569 scope.go:117] "RemoveContainer" containerID="71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb" Apr 23 13:45:39.200258 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.200239 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4f8db-predictor-75858f568-f2srm"] Apr 23 13:45:39.200433 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:45:39.200400 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb\": container with ID starting with 71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb not found: ID does not exist" containerID="71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb" Apr 23 13:45:39.200529 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.200438 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb"} err="failed to get container status \"71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb\": rpc error: code = NotFound desc = could not find container \"71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb\": container with ID starting with 71e7198efcd506fb04712cf4ac95ca023651b8b060251dc05669cf19c3dadafb not found: ID does not exist" Apr 23 13:45:39.200529 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.200458 2569 scope.go:117] "RemoveContainer" containerID="0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6" Apr 23 13:45:39.200724 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:45:39.200708 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6\": container with ID starting with 0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6 not found: ID does not exist" containerID="0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6" Apr 23 13:45:39.200790 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.200733 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6"} err="failed to get container status \"0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6\": rpc error: code = NotFound desc = could not find container \"0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6\": container with ID starting with 0e405274b90d8e91da3b0c82892fb543e15f51b7a8c86b1ec9ceb27c67bab6d6 not found: ID does not exist" Apr 23 13:45:39.200790 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.200756 2569 scope.go:117] "RemoveContainer" containerID="93ea627f0e9ed403bbab7978058e59a55d10b6ef83cb624f173839aa6d66a029" Apr 23 13:45:39.200992 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:45:39.200970 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ea627f0e9ed403bbab7978058e59a55d10b6ef83cb624f173839aa6d66a029\": container with ID starting with 93ea627f0e9ed403bbab7978058e59a55d10b6ef83cb624f173839aa6d66a029 not found: ID does not exist" containerID="93ea627f0e9ed403bbab7978058e59a55d10b6ef83cb624f173839aa6d66a029" Apr 23 13:45:39.201035 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:39.200998 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ea627f0e9ed403bbab7978058e59a55d10b6ef83cb624f173839aa6d66a029"} err="failed to get container status \"93ea627f0e9ed403bbab7978058e59a55d10b6ef83cb624f173839aa6d66a029\": rpc error: code = NotFound desc = could not find container \"93ea627f0e9ed403bbab7978058e59a55d10b6ef83cb624f173839aa6d66a029\": container with ID starting with 93ea627f0e9ed403bbab7978058e59a55d10b6ef83cb624f173839aa6d66a029 not found: ID does not exist" Apr 23 13:45:40.420660 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:40.420617 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" path="/var/lib/kubelet/pods/8601b18e-0763-4e85-be49-f777e7b19cb1/volumes" Apr 23 13:45:40.421361 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:40.421340 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" path="/var/lib/kubelet/pods/b802602d-f877-40b1-9a89-e26ec27b66ca/volumes" Apr 23 13:45:43.174207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:43.174123 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd" Apr 23 13:45:44.244270 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244232 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl"] Apr 23 13:45:44.244661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244576 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kube-rbac-proxy" Apr 23 13:45:44.244661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244589 2569 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kube-rbac-proxy" Apr 23 13:45:44.244661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244601 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="storage-initializer" Apr 23 13:45:44.244661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244607 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="storage-initializer" Apr 23 13:45:44.244661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244614 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="storage-initializer" Apr 23 13:45:44.244661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244620 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="storage-initializer" Apr 23 13:45:44.244661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244626 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" Apr 23 13:45:44.244661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244632 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" Apr 23 13:45:44.244661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244637 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" Apr 23 13:45:44.244661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244643 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" Apr 23 13:45:44.244661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244655 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kube-rbac-proxy" Apr 23 13:45:44.244661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244660 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kube-rbac-proxy" Apr 23 13:45:44.245109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244708 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kserve-container" Apr 23 13:45:44.245109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244717 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b802602d-f877-40b1-9a89-e26ec27b66ca" containerName="kube-rbac-proxy" Apr 23 13:45:44.245109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244723 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kserve-container" Apr 23 13:45:44.245109 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.244730 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8601b18e-0763-4e85-be49-f777e7b19cb1" containerName="kube-rbac-proxy" Apr 23 13:45:44.248033 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.248015 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.250304 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.250281 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-35dde-kube-rbac-proxy-sar-config\"" Apr 23 13:45:44.250408 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.250345 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-35dde-predictor-serving-cert\"" Apr 23 13:45:44.257409 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.257386 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl"] Apr 23 13:45:44.318583 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.318550 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgn9v\" (UniqueName: \"kubernetes.io/projected/e9d0c878-c391-4fa0-9049-42aecf64aa28-kube-api-access-kgn9v\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.318764 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.318600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-35dde-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e9d0c878-c391-4fa0-9049-42aecf64aa28-isvc-logger-raw-35dde-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.318764 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.318645 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9d0c878-c391-4fa0-9049-42aecf64aa28-proxy-tls\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.318764 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.318693 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9d0c878-c391-4fa0-9049-42aecf64aa28-kserve-provision-location\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.419808 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.419770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgn9v\" (UniqueName: \"kubernetes.io/projected/e9d0c878-c391-4fa0-9049-42aecf64aa28-kube-api-access-kgn9v\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.420001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.419831 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-35dde-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/e9d0c878-c391-4fa0-9049-42aecf64aa28-isvc-logger-raw-35dde-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.420001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.419855 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9d0c878-c391-4fa0-9049-42aecf64aa28-proxy-tls\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.420001 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.419895 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9d0c878-c391-4fa0-9049-42aecf64aa28-kserve-provision-location\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.420196 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:45:44.420022 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-serving-cert: secret "isvc-logger-raw-35dde-predictor-serving-cert" not found Apr 23 13:45:44.420196 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:45:44.420122 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9d0c878-c391-4fa0-9049-42aecf64aa28-proxy-tls podName:e9d0c878-c391-4fa0-9049-42aecf64aa28 nodeName:}" failed. No retries permitted until 2026-04-23 13:45:44.920099619 +0000 UTC m=+799.075022079 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e9d0c878-c391-4fa0-9049-42aecf64aa28-proxy-tls") pod "isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" (UID: "e9d0c878-c391-4fa0-9049-42aecf64aa28") : secret "isvc-logger-raw-35dde-predictor-serving-cert" not found Apr 23 13:45:44.420336 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.420313 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9d0c878-c391-4fa0-9049-42aecf64aa28-kserve-provision-location\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.420614 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.420595 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-35dde-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e9d0c878-c391-4fa0-9049-42aecf64aa28-isvc-logger-raw-35dde-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.428310 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.428278 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgn9v\" (UniqueName: \"kubernetes.io/projected/e9d0c878-c391-4fa0-9049-42aecf64aa28-kube-api-access-kgn9v\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.924120 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.924042 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9d0c878-c391-4fa0-9049-42aecf64aa28-proxy-tls\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:44.926751 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:44.926727 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9d0c878-c391-4fa0-9049-42aecf64aa28-proxy-tls\") pod \"isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:45.160474 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:45.160434 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:45.292164 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:45.292138 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl"] Apr 23 13:45:45.294660 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:45:45.294630 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d0c878_c391_4fa0_9049_42aecf64aa28.slice/crio-eaa86f4fa0a79c8867c9b9f4b526296f8770c2e162a415424a3d9b40e161742a WatchSource:0}: Error finding container eaa86f4fa0a79c8867c9b9f4b526296f8770c2e162a415424a3d9b40e161742a: Status 404 returned error can't find the container with id eaa86f4fa0a79c8867c9b9f4b526296f8770c2e162a415424a3d9b40e161742a Apr 23 13:45:46.203271 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:46.203228 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" event={"ID":"e9d0c878-c391-4fa0-9049-42aecf64aa28","Type":"ContainerStarted","Data":"d6fd91fe2bc6762f257f6019e8969412720a2ab099d17b1740e408252a22f085"} Apr 23 13:45:46.203271 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:46.203275 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" event={"ID":"e9d0c878-c391-4fa0-9049-42aecf64aa28","Type":"ContainerStarted","Data":"eaa86f4fa0a79c8867c9b9f4b526296f8770c2e162a415424a3d9b40e161742a"} Apr 23 13:45:49.215478 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:49.215384 2569 generic.go:358] "Generic (PLEG): container finished" podID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerID="d6fd91fe2bc6762f257f6019e8969412720a2ab099d17b1740e408252a22f085" exitCode=0 Apr 23 13:45:49.215478 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:49.215461 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" event={"ID":"e9d0c878-c391-4fa0-9049-42aecf64aa28","Type":"ContainerDied","Data":"d6fd91fe2bc6762f257f6019e8969412720a2ab099d17b1740e408252a22f085"} Apr 23 13:45:50.220771 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:50.220737 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" event={"ID":"e9d0c878-c391-4fa0-9049-42aecf64aa28","Type":"ContainerStarted","Data":"9af64fa6340a703964d25525ca6a6de37d272415625ba099f15ef9aae402d30c"} Apr 23 13:45:50.220771 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:50.220772 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" event={"ID":"e9d0c878-c391-4fa0-9049-42aecf64aa28","Type":"ContainerStarted","Data":"6ad3cf977b2111333c670ecc392bad61a42b100601855da80fe179dcd35f90f1"} Apr 23 13:45:50.221201 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:50.220782 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" event={"ID":"e9d0c878-c391-4fa0-9049-42aecf64aa28","Type":"ContainerStarted","Data":"7814d97ce6becd4255339fab13d7509d41e8bc4cd5b54d3591a7f26a22601722"} Apr 23 13:45:50.221201 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:50.221091 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" 
Apr 23 13:45:50.221201 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:50.221122 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:50.221201 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:50.221135 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:50.222759 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:50.222731 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 23 13:45:50.223967 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:50.223896 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:45:50.243926 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:50.243877 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podStartSLOduration=6.2438646030000005 podStartE2EDuration="6.243864603s" podCreationTimestamp="2026-04-23 13:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:45:50.24307274 +0000 UTC m=+804.397995204" watchObservedRunningTime="2026-04-23 13:45:50.243864603 +0000 UTC m=+804.398787123" Apr 23 13:45:51.224831 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:51.224787 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 23 13:45:51.225366 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:51.225339 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:45:56.229229 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:56.229194 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:45:56.229845 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:56.229802 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 23 13:45:56.230135 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:45:56.230099 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:46:06.229843 ip-10-0-136-158 
kubenswrapper[2569]: I0423 13:46:06.229803 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 23 13:46:06.230299 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:46:06.230275 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:46:16.230439 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:46:16.230385 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 23 13:46:16.230928 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:46:16.230763 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:46:26.230003 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:46:26.229949 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 23 13:46:26.230539 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:46:26.230517 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:46:36.230151 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:46:36.230094 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 23 13:46:36.230674 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:46:36.230584 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:46:46.229901 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:46:46.229850 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 23 13:46:46.230378 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:46:46.230355 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:46:56.230717 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:46:56.230684 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:46:56.231260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:46:56.230940 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:47:09.316943 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.316905 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-35dde-predictor-5776cbf779-g8nmd_00dc171a-3be5-4310-91e3-38e0a4724108/kserve-container/0.log" Apr 23 13:47:09.482850 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.482815 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl"] Apr 23 13:47:09.483572 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.483533 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" containerID="cri-o://9af64fa6340a703964d25525ca6a6de37d272415625ba099f15ef9aae402d30c" gracePeriod=30 Apr 23 13:47:09.483859 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.483528 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" containerID="cri-o://7814d97ce6becd4255339fab13d7509d41e8bc4cd5b54d3591a7f26a22601722" gracePeriod=30 Apr 23 13:47:09.483859 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.483603 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kube-rbac-proxy" containerID="cri-o://6ad3cf977b2111333c670ecc392bad61a42b100601855da80fe179dcd35f90f1" gracePeriod=30 Apr 23 13:47:09.563425 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.561323 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd"] Apr 23 13:47:09.565935 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.565908 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:09.568803 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.568732 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-7affb-predictor-serving-cert\"" Apr 23 13:47:09.569084 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.568757 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config\"" Apr 23 13:47:09.569748 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.569537 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"] Apr 23 13:47:09.569903 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.569869 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd" podUID="00dc171a-3be5-4310-91e3-38e0a4724108" containerName="kserve-container" containerID="cri-o://ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa" gracePeriod=30 Apr 23 13:47:09.571534 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.570647 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd" podUID="00dc171a-3be5-4310-91e3-38e0a4724108" containerName="kube-rbac-proxy" containerID="cri-o://f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1" gracePeriod=30 Apr 23 13:47:09.573585 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.573560 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd"] Apr 23 13:47:09.611363 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.611293 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ppjc\" (UniqueName: \"kubernetes.io/projected/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-kube-api-access-4ppjc\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:09.611363 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.611365 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:09.611560 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.611394 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-proxy-tls\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:09.611560 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.611475 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:09.712406 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.712368 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ppjc\" (UniqueName: \"kubernetes.io/projected/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-kube-api-access-4ppjc\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:09.712592 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.712429 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:09.712654 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.712598 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-proxy-tls\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:09.712715 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.712660 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:09.712766 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:47:09.712728 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-serving-cert: secret "isvc-sklearn-scale-raw-7affb-predictor-serving-cert" not found Apr 23 13:47:09.712815 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:47:09.712798 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-proxy-tls podName:9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069 nodeName:}" failed. No retries permitted until 2026-04-23 13:47:10.21277702 +0000 UTC m=+884.367699473 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-proxy-tls") pod "isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" (UID: "9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069") : secret "isvc-sklearn-scale-raw-7affb-predictor-serving-cert" not found Apr 23 13:47:09.712815 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.712793 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:09.713304 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.713280 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:09.721582 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.721518 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ppjc\" (UniqueName: \"kubernetes.io/projected/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-kube-api-access-4ppjc\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:09.818085 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.818045 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd" Apr 23 13:47:09.913835 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.913735 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-35dde-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/00dc171a-3be5-4310-91e3-38e0a4724108-message-dumper-raw-35dde-kube-rbac-proxy-sar-config\") pod \"00dc171a-3be5-4310-91e3-38e0a4724108\" (UID: \"00dc171a-3be5-4310-91e3-38e0a4724108\") " Apr 23 13:47:09.913835 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.913780 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00dc171a-3be5-4310-91e3-38e0a4724108-proxy-tls\") pod \"00dc171a-3be5-4310-91e3-38e0a4724108\" (UID: \"00dc171a-3be5-4310-91e3-38e0a4724108\") " Apr 23 13:47:09.913835 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.913800 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlr4w\" (UniqueName: \"kubernetes.io/projected/00dc171a-3be5-4310-91e3-38e0a4724108-kube-api-access-nlr4w\") pod \"00dc171a-3be5-4310-91e3-38e0a4724108\" (UID: \"00dc171a-3be5-4310-91e3-38e0a4724108\") " Apr 23 13:47:09.914294 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.914215 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00dc171a-3be5-4310-91e3-38e0a4724108-message-dumper-raw-35dde-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-35dde-kube-rbac-proxy-sar-config") pod "00dc171a-3be5-4310-91e3-38e0a4724108" (UID: "00dc171a-3be5-4310-91e3-38e0a4724108"). InnerVolumeSpecName "message-dumper-raw-35dde-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:47:09.916148 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.916122 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00dc171a-3be5-4310-91e3-38e0a4724108-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "00dc171a-3be5-4310-91e3-38e0a4724108" (UID: "00dc171a-3be5-4310-91e3-38e0a4724108"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:47:09.916244 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:09.916154 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00dc171a-3be5-4310-91e3-38e0a4724108-kube-api-access-nlr4w" (OuterVolumeSpecName: "kube-api-access-nlr4w") pod "00dc171a-3be5-4310-91e3-38e0a4724108" (UID: "00dc171a-3be5-4310-91e3-38e0a4724108"). InnerVolumeSpecName "kube-api-access-nlr4w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:47:10.014790 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.014755 2569 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-35dde-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/00dc171a-3be5-4310-91e3-38e0a4724108-message-dumper-raw-35dde-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:47:10.014948 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.014795 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00dc171a-3be5-4310-91e3-38e0a4724108-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:47:10.014948 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.014814 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nlr4w\" (UniqueName: \"kubernetes.io/projected/00dc171a-3be5-4310-91e3-38e0a4724108-kube-api-access-nlr4w\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:47:10.216407 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.216285 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-proxy-tls\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:10.216582 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:47:10.216449 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-serving-cert: secret "isvc-sklearn-scale-raw-7affb-predictor-serving-cert" not found Apr 23 13:47:10.216582 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:47:10.216522 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-proxy-tls podName:9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069 nodeName:}" failed. No retries permitted until 2026-04-23 13:47:11.216505676 +0000 UTC m=+885.371428122 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-proxy-tls") pod "isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" (UID: "9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069") : secret "isvc-sklearn-scale-raw-7affb-predictor-serving-cert" not found Apr 23 13:47:10.508207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.508170 2569 generic.go:358] "Generic (PLEG): container finished" podID="00dc171a-3be5-4310-91e3-38e0a4724108" containerID="f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1" exitCode=2 Apr 23 13:47:10.508207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.508200 2569 generic.go:358] "Generic (PLEG): container finished" podID="00dc171a-3be5-4310-91e3-38e0a4724108" containerID="ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa" exitCode=2 Apr 23 13:47:10.508690 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.508242 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd" Apr 23 13:47:10.508690 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.508254 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd" event={"ID":"00dc171a-3be5-4310-91e3-38e0a4724108","Type":"ContainerDied","Data":"f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1"} Apr 23 13:47:10.508690 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.508291 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd" event={"ID":"00dc171a-3be5-4310-91e3-38e0a4724108","Type":"ContainerDied","Data":"ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa"} Apr 23 13:47:10.508690 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.508303 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd" event={"ID":"00dc171a-3be5-4310-91e3-38e0a4724108","Type":"ContainerDied","Data":"42ef45a5fe4a42dcb9a836ae8ec05d171e728c9a94150471becfc45a160a2c29"} Apr 23 13:47:10.508690 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.508322 2569 scope.go:117] "RemoveContainer" containerID="f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1" Apr 23 13:47:10.510942 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.510872 2569 generic.go:358] "Generic (PLEG): container finished" podID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerID="6ad3cf977b2111333c670ecc392bad61a42b100601855da80fe179dcd35f90f1" exitCode=2 Apr 23 13:47:10.511080 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.510958 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" event={"ID":"e9d0c878-c391-4fa0-9049-42aecf64aa28","Type":"ContainerDied","Data":"6ad3cf977b2111333c670ecc392bad61a42b100601855da80fe179dcd35f90f1"} Apr 23 13:47:10.517038 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.517016 2569 scope.go:117] "RemoveContainer" containerID="ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa" Apr 23 13:47:10.525468 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.525437 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"] Apr 23 13:47:10.526033 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.526010 2569 scope.go:117] "RemoveContainer" containerID="f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1" Apr 23 13:47:10.526397 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:47:10.526372 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1\": container with ID starting with f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1 not found: ID does not exist" containerID="f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1" Apr 23 13:47:10.526463 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.526408 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1"} err="failed to get container status \"f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1\": rpc error: code = NotFound desc = could not find container 
\"f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1\": container with ID starting with f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1 not found: ID does not exist" Apr 23 13:47:10.526463 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.526429 2569 scope.go:117] "RemoveContainer" containerID="ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa" Apr 23 13:47:10.526700 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:47:10.526682 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa\": container with ID starting with ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa not found: ID does not exist" containerID="ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa" Apr 23 13:47:10.526745 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.526717 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa"} err="failed to get container status \"ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa\": rpc error: code = NotFound desc = could not find container \"ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa\": container with ID starting with ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa not found: ID does not exist" Apr 23 13:47:10.526745 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.526736 2569 scope.go:117] "RemoveContainer" containerID="f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1" Apr 23 13:47:10.526978 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.526957 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1"} err="failed to get container status \"f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1\": rpc error: code = NotFound desc = could not find container \"f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1\": container with ID starting with f6daa68ee100cde882ef406e4b8d57b25fecc6fbfd882d46748aca3d92eea1d1 not found: ID does not exist" Apr 23 13:47:10.527079 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.526979 2569 scope.go:117] "RemoveContainer" containerID="ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa" Apr 23 13:47:10.527226 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.527200 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa"} err="failed to get container status \"ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa\": rpc error: code = NotFound desc = could not find container \"ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa\": container with ID starting with ce9d0bc7146e470ffd0be2147ef57a81e38c2473a749c365d7cdbd06643784fa not found: ID does not exist" Apr 23 13:47:10.530997 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:10.530974 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-35dde-predictor-5776cbf779-g8nmd"] Apr 23 13:47:11.224324 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:11.224233 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-proxy-tls\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:11.225100 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:11.224991 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": dial tcp 10.134.0.39:8643: connect: connection refused" Apr 23 13:47:11.227000 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:11.226973 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-proxy-tls\") pod \"isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:11.462099 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:11.462035 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:11.591087 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:11.591043 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd"] Apr 23 13:47:11.593347 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:47:11.593319 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b7b19d9_effb_4ea6_bbf6_e3a0b9f4b069.slice/crio-4f02546d06e5e317f73d0a88e5e7953bc701dfe35e84b634b3786682d635987b WatchSource:0}: Error finding container 4f02546d06e5e317f73d0a88e5e7953bc701dfe35e84b634b3786682d635987b: Status 404 returned error can't find the container with id 4f02546d06e5e317f73d0a88e5e7953bc701dfe35e84b634b3786682d635987b Apr 23 13:47:12.421903 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:12.421864 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00dc171a-3be5-4310-91e3-38e0a4724108" path="/var/lib/kubelet/pods/00dc171a-3be5-4310-91e3-38e0a4724108/volumes" Apr 23 13:47:12.520792 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:12.520749 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" event={"ID":"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069","Type":"ContainerStarted","Data":"f6bd1ef99dc44656dada11bce210bd8d66aa88e33b1d970645ff347a4a33b9c5"} Apr 23 13:47:12.520792 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:12.520792 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" event={"ID":"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069","Type":"ContainerStarted","Data":"4f02546d06e5e317f73d0a88e5e7953bc701dfe35e84b634b3786682d635987b"} Apr 23 13:47:14.530456 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:14.530420 2569 generic.go:358] "Generic (PLEG): container finished" podID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerID="7814d97ce6becd4255339fab13d7509d41e8bc4cd5b54d3591a7f26a22601722" exitCode=0 Apr 23 13:47:14.530876 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:14.530532 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" event={"ID":"e9d0c878-c391-4fa0-9049-42aecf64aa28","Type":"ContainerDied","Data":"7814d97ce6becd4255339fab13d7509d41e8bc4cd5b54d3591a7f26a22601722"} Apr 23 13:47:16.225765 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:16.225720 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": dial tcp 10.134.0.39:8643: connect: connection refused" Apr 23 13:47:16.230314 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:16.230268 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 23 13:47:16.230663 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:16.230630 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:47:16.538070 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:16.538021 2569 generic.go:358] "Generic (PLEG): container finished" podID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerID="f6bd1ef99dc44656dada11bce210bd8d66aa88e33b1d970645ff347a4a33b9c5" exitCode=0 Apr 23 13:47:16.538251 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:16.538098 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" event={"ID":"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069","Type":"ContainerDied","Data":"f6bd1ef99dc44656dada11bce210bd8d66aa88e33b1d970645ff347a4a33b9c5"} Apr 23 13:47:17.544003 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:17.543965 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" event={"ID":"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069","Type":"ContainerStarted","Data":"5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf"} Apr 23 13:47:17.544003 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:17.544010 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" event={"ID":"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069","Type":"ContainerStarted","Data":"dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745"} Apr 23 13:47:17.544468 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:17.544247 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:17.562689 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:17.562637 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podStartSLOduration=8.562622293 podStartE2EDuration="8.562622293s" podCreationTimestamp="2026-04-23 13:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:47:17.561415327 +0000 UTC 
m=+891.716337806" watchObservedRunningTime="2026-04-23 13:47:17.562622293 +0000 UTC m=+891.717544761" Apr 23 13:47:18.549256 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:18.549225 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:18.550561 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:18.550531 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:47:19.552222 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:19.552177 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:47:21.225945 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:21.225902 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": dial tcp 10.134.0.39:8643: connect: connection refused" Apr 23 13:47:21.226367 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:21.226032 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:47:24.556460 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:24.556429 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:47:24.557077 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:24.557030 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:47:26.225543 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:26.225498 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": dial tcp 10.134.0.39:8643: connect: connection refused" Apr 23 13:47:26.230027 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:26.229975 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 23 13:47:26.230352 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:26.230328 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 
13:47:26.963995 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:26.963873 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/2.log" Apr 23 13:47:26.992991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:26.966032 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/2.log" Apr 23 13:47:26.992991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:26.967669 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log" Apr 23 13:47:26.992991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:26.972544 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log" Apr 23 13:47:31.225604 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:31.225558 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": dial tcp 10.134.0.39:8643: connect: connection refused" Apr 23 13:47:34.557184 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:34.557141 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:47:36.225218 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:36.225176 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": dial tcp 10.134.0.39:8643: connect: connection refused" Apr 23 13:47:36.229967 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:36.229929 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 23 13:47:36.230158 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:36.230041 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:47:36.230158 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:36.230097 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:47:36.230283 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:36.230171 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:47:39.621661 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.621626 2569 generic.go:358] 
"Generic (PLEG): container finished" podID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerID="9af64fa6340a703964d25525ca6a6de37d272415625ba099f15ef9aae402d30c" exitCode=0 Apr 23 13:47:39.622032 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.621696 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" event={"ID":"e9d0c878-c391-4fa0-9049-42aecf64aa28","Type":"ContainerDied","Data":"9af64fa6340a703964d25525ca6a6de37d272415625ba099f15ef9aae402d30c"} Apr 23 13:47:39.676514 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.676489 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:47:39.760474 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.760427 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgn9v\" (UniqueName: \"kubernetes.io/projected/e9d0c878-c391-4fa0-9049-42aecf64aa28-kube-api-access-kgn9v\") pod \"e9d0c878-c391-4fa0-9049-42aecf64aa28\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " Apr 23 13:47:39.760679 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.760514 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-35dde-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e9d0c878-c391-4fa0-9049-42aecf64aa28-isvc-logger-raw-35dde-kube-rbac-proxy-sar-config\") pod \"e9d0c878-c391-4fa0-9049-42aecf64aa28\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " Apr 23 13:47:39.760679 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.760561 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9d0c878-c391-4fa0-9049-42aecf64aa28-kserve-provision-location\") pod \"e9d0c878-c391-4fa0-9049-42aecf64aa28\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " Apr 23 13:47:39.760679 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.760618 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9d0c878-c391-4fa0-9049-42aecf64aa28-proxy-tls\") pod \"e9d0c878-c391-4fa0-9049-42aecf64aa28\" (UID: \"e9d0c878-c391-4fa0-9049-42aecf64aa28\") " Apr 23 13:47:39.760960 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.760918 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9d0c878-c391-4fa0-9049-42aecf64aa28-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e9d0c878-c391-4fa0-9049-42aecf64aa28" (UID: "e9d0c878-c391-4fa0-9049-42aecf64aa28"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:47:39.760960 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.760937 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d0c878-c391-4fa0-9049-42aecf64aa28-isvc-logger-raw-35dde-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-35dde-kube-rbac-proxy-sar-config") pod "e9d0c878-c391-4fa0-9049-42aecf64aa28" (UID: "e9d0c878-c391-4fa0-9049-42aecf64aa28"). InnerVolumeSpecName "isvc-logger-raw-35dde-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:47:39.762850 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.762824 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d0c878-c391-4fa0-9049-42aecf64aa28-kube-api-access-kgn9v" (OuterVolumeSpecName: "kube-api-access-kgn9v") pod "e9d0c878-c391-4fa0-9049-42aecf64aa28" (UID: "e9d0c878-c391-4fa0-9049-42aecf64aa28"). InnerVolumeSpecName "kube-api-access-kgn9v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:47:39.762850 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.762837 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d0c878-c391-4fa0-9049-42aecf64aa28-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e9d0c878-c391-4fa0-9049-42aecf64aa28" (UID: "e9d0c878-c391-4fa0-9049-42aecf64aa28"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:47:39.861537 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.861486 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-35dde-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e9d0c878-c391-4fa0-9049-42aecf64aa28-isvc-logger-raw-35dde-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:47:39.861537 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.861534 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9d0c878-c391-4fa0-9049-42aecf64aa28-kserve-provision-location\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:47:39.861537 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.861549 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9d0c878-c391-4fa0-9049-42aecf64aa28-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:47:39.861787 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:39.861563 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kgn9v\" (UniqueName: \"kubernetes.io/projected/e9d0c878-c391-4fa0-9049-42aecf64aa28-kube-api-access-kgn9v\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:47:40.628416 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:40.628374 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" event={"ID":"e9d0c878-c391-4fa0-9049-42aecf64aa28","Type":"ContainerDied","Data":"eaa86f4fa0a79c8867c9b9f4b526296f8770c2e162a415424a3d9b40e161742a"} Apr 23 13:47:40.628813 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:40.628433 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl" Apr 23 13:47:40.628813 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:40.628434 2569 scope.go:117] "RemoveContainer" containerID="9af64fa6340a703964d25525ca6a6de37d272415625ba099f15ef9aae402d30c" Apr 23 13:47:40.639251 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:40.639200 2569 scope.go:117] "RemoveContainer" containerID="6ad3cf977b2111333c670ecc392bad61a42b100601855da80fe179dcd35f90f1" Apr 23 13:47:40.646660 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:40.646637 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl"] Apr 23 13:47:40.647522 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:40.647503 2569 scope.go:117] "RemoveContainer" containerID="7814d97ce6becd4255339fab13d7509d41e8bc4cd5b54d3591a7f26a22601722" Apr 23 13:47:40.650366 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:40.650345 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-35dde-predictor-6dbbcf9c6d-xhlcl"] Apr 23 13:47:40.655008 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:40.654990 2569 scope.go:117] "RemoveContainer" containerID="d6fd91fe2bc6762f257f6019e8969412720a2ab099d17b1740e408252a22f085" Apr 23 13:47:42.420962 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:42.420931 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" path="/var/lib/kubelet/pods/e9d0c878-c391-4fa0-9049-42aecf64aa28/volumes" Apr 23 13:47:44.557935 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:44.557890 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:47:54.557370 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:47:54.557325 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:48:04.557884 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:48:04.557824 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:48:14.557130 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:48:14.557085 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:48:24.557647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:48:24.557595 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:48:26.390196 ip-10-0-136-158 
kubenswrapper[2569]: I0423 13:48:26.390161 2569 scope.go:117] "RemoveContainer" containerID="508830a0b6a3c112d223bfb694984cccac275732fc1d5e609cc0f455284a522c" Apr 23 13:48:34.557447 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:48:34.557403 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:48:40.416577 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:48:40.416532 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:48:50.417362 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:48:50.417257 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:49:00.416880 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:00.416831 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:49:10.416907 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:10.416859 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:49:20.416568 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:20.416520 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:49:26.407821 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:26.407782 2569 scope.go:117] "RemoveContainer" containerID="79b83fcfd71e081057f27c3fb99927c33cba1595fba96bbf8636034eee7d0c76" Apr 23 13:49:26.416527 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:26.416503 2569 scope.go:117] "RemoveContainer" containerID="d3f678b5b5ee81e9abe350465a434114fa4581ed0409ad8521dba29817cb32ac" Apr 23 13:49:30.416924 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:30.416866 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:49:40.420495 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:40.420466 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:49:49.722873 ip-10-0-136-158 kubenswrapper[2569]: I0423 
13:49:49.722834 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd"] Apr 23 13:49:49.723513 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.723289 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" containerID="cri-o://dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745" gracePeriod=30 Apr 23 13:49:49.723513 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.723357 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kube-rbac-proxy" containerID="cri-o://5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf" gracePeriod=30 Apr 23 13:49:49.831639 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.831604 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw"] Apr 23 13:49:49.831965 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.831953 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kube-rbac-proxy" Apr 23 13:49:49.832008 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.831967 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kube-rbac-proxy" Apr 23 13:49:49.832008 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.831987 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00dc171a-3be5-4310-91e3-38e0a4724108" containerName="kserve-container" Apr 23 13:49:49.832008 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.831993 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="00dc171a-3be5-4310-91e3-38e0a4724108" containerName="kserve-container" Apr 23 13:49:49.832008 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832003 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="storage-initializer" Apr 23 13:49:49.832008 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832009 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="storage-initializer" Apr 23 13:49:49.832186 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832016 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" Apr 23 13:49:49.832186 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832021 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" Apr 23 13:49:49.832186 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832031 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00dc171a-3be5-4310-91e3-38e0a4724108" containerName="kube-rbac-proxy" Apr 23 13:49:49.832186 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832036 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="00dc171a-3be5-4310-91e3-38e0a4724108" containerName="kube-rbac-proxy" Apr 23 13:49:49.832186 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832043 2569 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" Apr 23 13:49:49.832186 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832048 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" Apr 23 13:49:49.832186 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832119 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="00dc171a-3be5-4310-91e3-38e0a4724108" containerName="kube-rbac-proxy" Apr 23 13:49:49.832186 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832130 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kube-rbac-proxy" Apr 23 13:49:49.832186 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832136 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="kserve-container" Apr 23 13:49:49.832186 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832144 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="00dc171a-3be5-4310-91e3-38e0a4724108" containerName="kserve-container" Apr 23 13:49:49.832186 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.832151 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9d0c878-c391-4fa0-9049-42aecf64aa28" containerName="agent" Apr 23 13:49:49.835408 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.835381 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:49.837768 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.837741 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-2a679f-kube-rbac-proxy-sar-config\"" Apr 23 13:49:49.837915 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.837738 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-2a679f-predictor-serving-cert\"" Apr 23 13:49:49.849628 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.849590 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw"] Apr 23 13:49:49.955683 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.955638 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da669fe3-20f7-405f-8b6a-c75eeba0766a-kserve-provision-location\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:49.955864 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.955689 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da669fe3-20f7-405f-8b6a-c75eeba0766a-proxy-tls\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:49.955864 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.955781 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsmgk\" (UniqueName: 
\"kubernetes.io/projected/da669fe3-20f7-405f-8b6a-c75eeba0766a-kube-api-access-qsmgk\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:49.955864 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:49.955828 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-2a679f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da669fe3-20f7-405f-8b6a-c75eeba0766a-isvc-primary-2a679f-kube-rbac-proxy-sar-config\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:50.056490 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.056448 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsmgk\" (UniqueName: \"kubernetes.io/projected/da669fe3-20f7-405f-8b6a-c75eeba0766a-kube-api-access-qsmgk\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:50.056744 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.056502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-2a679f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da669fe3-20f7-405f-8b6a-c75eeba0766a-isvc-primary-2a679f-kube-rbac-proxy-sar-config\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:50.056744 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.056554 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da669fe3-20f7-405f-8b6a-c75eeba0766a-kserve-provision-location\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:50.056744 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.056578 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da669fe3-20f7-405f-8b6a-c75eeba0766a-proxy-tls\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:50.056744 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:49:50.056666 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-2a679f-predictor-serving-cert: secret "isvc-primary-2a679f-predictor-serving-cert" not found Apr 23 13:49:50.056744 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:49:50.056735 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da669fe3-20f7-405f-8b6a-c75eeba0766a-proxy-tls podName:da669fe3-20f7-405f-8b6a-c75eeba0766a nodeName:}" failed. No retries permitted until 2026-04-23 13:49:50.556715935 +0000 UTC m=+1044.711638382 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/da669fe3-20f7-405f-8b6a-c75eeba0766a-proxy-tls") pod "isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" (UID: "da669fe3-20f7-405f-8b6a-c75eeba0766a") : secret "isvc-primary-2a679f-predictor-serving-cert" not found Apr 23 13:49:50.057117 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.057093 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da669fe3-20f7-405f-8b6a-c75eeba0766a-kserve-provision-location\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:50.057368 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.057346 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-2a679f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da669fe3-20f7-405f-8b6a-c75eeba0766a-isvc-primary-2a679f-kube-rbac-proxy-sar-config\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:50.065508 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.065487 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsmgk\" (UniqueName: \"kubernetes.io/projected/da669fe3-20f7-405f-8b6a-c75eeba0766a-kube-api-access-qsmgk\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:50.088929 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.088899 2569 generic.go:358] "Generic (PLEG): container finished" podID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerID="5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf" exitCode=2 Apr 23 13:49:50.089072 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.088969 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" event={"ID":"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069","Type":"ContainerDied","Data":"5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf"} Apr 23 13:49:50.416636 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.416533 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 23 13:49:50.561569 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.561530 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da669fe3-20f7-405f-8b6a-c75eeba0766a-proxy-tls\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:50.564151 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.564125 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da669fe3-20f7-405f-8b6a-c75eeba0766a-proxy-tls\") pod \"isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " 
pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:50.747867 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.747781 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:50.871650 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.871601 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw"] Apr 23 13:49:50.874135 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:49:50.874102 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda669fe3_20f7_405f_8b6a_c75eeba0766a.slice/crio-0f32e28681b3e52bfd7eda7658c3694fc154b251c06a05090b9db3dada661f5d WatchSource:0}: Error finding container 0f32e28681b3e52bfd7eda7658c3694fc154b251c06a05090b9db3dada661f5d: Status 404 returned error can't find the container with id 0f32e28681b3e52bfd7eda7658c3694fc154b251c06a05090b9db3dada661f5d Apr 23 13:49:50.876100 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:50.876078 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:49:51.094162 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:51.094125 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" event={"ID":"da669fe3-20f7-405f-8b6a-c75eeba0766a","Type":"ContainerStarted","Data":"173c6959d0590e40855511513b80e3211849cb3fc0242dea0427ca9e0fda4836"} Apr 23 13:49:51.094162 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:51.094161 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" event={"ID":"da669fe3-20f7-405f-8b6a-c75eeba0766a","Type":"ContainerStarted","Data":"0f32e28681b3e52bfd7eda7658c3694fc154b251c06a05090b9db3dada661f5d"} Apr 23 13:49:54.553249 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:54.553204 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused" Apr 23 13:49:55.108136 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:55.108097 2569 generic.go:358] "Generic (PLEG): container finished" podID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerID="173c6959d0590e40855511513b80e3211849cb3fc0242dea0427ca9e0fda4836" exitCode=0 Apr 23 13:49:55.108424 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:55.108168 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" event={"ID":"da669fe3-20f7-405f-8b6a-c75eeba0766a","Type":"ContainerDied","Data":"173c6959d0590e40855511513b80e3211849cb3fc0242dea0427ca9e0fda4836"} Apr 23 13:49:56.114127 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:56.114089 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" event={"ID":"da669fe3-20f7-405f-8b6a-c75eeba0766a","Type":"ContainerStarted","Data":"c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27"} Apr 23 13:49:56.114127 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:56.114135 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" event={"ID":"da669fe3-20f7-405f-8b6a-c75eeba0766a","Type":"ContainerStarted","Data":"86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de"} Apr 23 13:49:56.114642 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:56.114394 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:56.114642 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:56.114538 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:49:56.115852 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:56.115826 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 23 13:49:56.133206 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:56.133156 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" podStartSLOduration=7.133138725 podStartE2EDuration="7.133138725s" podCreationTimestamp="2026-04-23 13:49:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:49:56.131839118 +0000 UTC m=+1050.286761585" watchObservedRunningTime="2026-04-23 13:49:56.133138725 +0000 UTC m=+1050.288061193" Apr 23 13:49:57.117249 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:57.117210 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 23 13:49:59.075324 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.075296 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:49:59.125888 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.125796 2569 generic.go:358] "Generic (PLEG): container finished" podID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerID="dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745" exitCode=0 Apr 23 13:49:59.125888 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.125881 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" Apr 23 13:49:59.126119 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.125880 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" event={"ID":"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069","Type":"ContainerDied","Data":"dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745"} Apr 23 13:49:59.126119 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.125995 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd" event={"ID":"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069","Type":"ContainerDied","Data":"4f02546d06e5e317f73d0a88e5e7953bc701dfe35e84b634b3786682d635987b"} Apr 23 13:49:59.126119 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.126018 2569 scope.go:117] "RemoveContainer" containerID="5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf" Apr 23 13:49:59.135461 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.135347 2569 scope.go:117] "RemoveContainer" containerID="dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745" Apr 23 13:49:59.145623 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.145601 2569 scope.go:117] "RemoveContainer" containerID="f6bd1ef99dc44656dada11bce210bd8d66aa88e33b1d970645ff347a4a33b9c5" Apr 23 13:49:59.153574 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.153552 2569 scope.go:117] "RemoveContainer" containerID="5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf" Apr 23 13:49:59.153874 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:49:59.153854 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf\": container with ID starting with 5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf not found: ID does not exist" containerID="5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf" Apr 23 13:49:59.153984 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.153887 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf"} err="failed to get container status \"5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf\": rpc error: code = NotFound desc = could not find container \"5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf\": container with ID starting with 5804dbe876922cf8c958965083ee0689fd59b5902834b66bf7a53fbd811ea8bf not found: ID does not exist" Apr 23 13:49:59.153984 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.153912 2569 scope.go:117] "RemoveContainer" containerID="dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745" Apr 23 13:49:59.154306 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:49:59.154281 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745\": container with ID starting with dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745 not found: ID does not exist" containerID="dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745" Apr 23 13:49:59.154352 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.154315 2569 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745"} err="failed to get container status \"dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745\": rpc error: code = NotFound desc = could not find container \"dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745\": container with ID starting with dea1e2a68b7e4e02bac9039641b51d8887048448ce2c9c835de03ab97f9b2745 not found: ID does not exist" Apr 23 13:49:59.154352 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.154332 2569 scope.go:117] "RemoveContainer" containerID="f6bd1ef99dc44656dada11bce210bd8d66aa88e33b1d970645ff347a4a33b9c5" Apr 23 13:49:59.154565 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:49:59.154547 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6bd1ef99dc44656dada11bce210bd8d66aa88e33b1d970645ff347a4a33b9c5\": container with ID starting with f6bd1ef99dc44656dada11bce210bd8d66aa88e33b1d970645ff347a4a33b9c5 not found: ID does not exist" containerID="f6bd1ef99dc44656dada11bce210bd8d66aa88e33b1d970645ff347a4a33b9c5" Apr 23 13:49:59.154619 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.154573 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6bd1ef99dc44656dada11bce210bd8d66aa88e33b1d970645ff347a4a33b9c5"} err="failed to get container status \"f6bd1ef99dc44656dada11bce210bd8d66aa88e33b1d970645ff347a4a33b9c5\": rpc error: code = NotFound desc = could not find container \"f6bd1ef99dc44656dada11bce210bd8d66aa88e33b1d970645ff347a4a33b9c5\": container with ID starting with f6bd1ef99dc44656dada11bce210bd8d66aa88e33b1d970645ff347a4a33b9c5 not found: ID does not exist" Apr 23 13:49:59.230766 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.230712 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config\") pod \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " Apr 23 13:49:59.230766 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.230775 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-kserve-provision-location\") pod \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " Apr 23 13:49:59.231016 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.230809 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-proxy-tls\") pod \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " Apr 23 13:49:59.231016 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.230861 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ppjc\" (UniqueName: \"kubernetes.io/projected/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-kube-api-access-4ppjc\") pod \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\" (UID: \"9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069\") " Apr 23 13:49:59.231163 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.231146 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" (UID: "9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:49:59.231217 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.231188 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config") pod "9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" (UID: "9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069"). InnerVolumeSpecName "isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:49:59.233242 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.233211 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" (UID: "9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:49:59.233355 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.233216 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-kube-api-access-4ppjc" (OuterVolumeSpecName: "kube-api-access-4ppjc") pod "9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" (UID: "9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069"). InnerVolumeSpecName "kube-api-access-4ppjc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:49:59.332481 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.332438 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-isvc-sklearn-scale-raw-7affb-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:49:59.332481 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.332472 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-kserve-provision-location\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:49:59.332481 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.332484 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:49:59.332724 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.332496 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4ppjc\" (UniqueName: \"kubernetes.io/projected/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069-kube-api-access-4ppjc\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:49:59.448283 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.448252 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd"] Apr 23 13:49:59.452917 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:49:59.452879 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-7affb-predictor-745ccbf6db-s6fkd"] Apr 23 13:50:00.421045 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:50:00.420996 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" path="/var/lib/kubelet/pods/9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069/volumes" Apr 23 13:50:02.121929 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:50:02.121896 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:50:02.122555 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:50:02.122521 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 23 13:50:12.123225 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:50:12.123128 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 23 13:50:22.122532 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:50:22.122491 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 23 13:50:32.122597 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:50:32.122551 2569 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 23 13:50:42.122469 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:50:42.122418 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 23 13:50:52.123341 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:50:52.123296 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 23 13:51:02.124071 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:02.124025 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:51:09.952429 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.952386 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj"] Apr 23 13:51:09.952927 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.952722 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="storage-initializer" Apr 23 13:51:09.952927 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.952734 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="storage-initializer" Apr 23 13:51:09.952927 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.952755 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" Apr 23 13:51:09.952927 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.952760 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" Apr 23 13:51:09.952927 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.952767 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kube-rbac-proxy" Apr 23 13:51:09.952927 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.952772 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kube-rbac-proxy" Apr 23 13:51:09.952927 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.952829 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kube-rbac-proxy" Apr 23 13:51:09.952927 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.952839 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b7b19d9-effb-4ea6-bbf6-e3a0b9f4b069" containerName="kserve-container" Apr 23 13:51:09.956402 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.956382 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:09.958775 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.958744 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-2a679f-predictor-serving-cert\"" Apr 23 13:51:09.958926 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.958787 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-2a679f-kube-rbac-proxy-sar-config\"" Apr 23 13:51:09.958926 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.958847 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 13:51:09.959037 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.958948 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-2a679f-dockercfg-l6vfq\"" Apr 23 13:51:09.959113 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.959075 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-2a679f\"" Apr 23 13:51:09.965007 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:09.964917 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj"] Apr 23 13:51:10.107288 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.107240 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-kserve-provision-location\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.107478 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.107332 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-2a679f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-isvc-secondary-2a679f-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.107478 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.107358 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-cabundle-cert\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.107478 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.107395 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-proxy-tls\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.107478 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.107420 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x4x5\" (UniqueName: \"kubernetes.io/projected/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-kube-api-access-7x4x5\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.208713 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.208607 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-2a679f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-isvc-secondary-2a679f-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.208713 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.208656 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-cabundle-cert\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.208713 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.208696 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-proxy-tls\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.208991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.208720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x4x5\" (UniqueName: \"kubernetes.io/projected/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-kube-api-access-7x4x5\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.208991 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.208749 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-kserve-provision-location\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.209256 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.209220 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-kserve-provision-location\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.209416 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.209392 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-2a679f-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-isvc-secondary-2a679f-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.209486 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.209437 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-cabundle-cert\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.211383 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.211363 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-proxy-tls\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.217009 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.216982 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x4x5\" (UniqueName: \"kubernetes.io/projected/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-kube-api-access-7x4x5\") pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.269431 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.269390 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:10.405207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:10.405178 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj"] Apr 23 13:51:10.406959 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:51:10.406917 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b89125_1e64_490b_9d91_2d7c4d9dcccf.slice/crio-d96093d1fe48916217c9b1c06c3719ba880c41a0c0aa139d30b7ed5820b009bd WatchSource:0}: Error finding container d96093d1fe48916217c9b1c06c3719ba880c41a0c0aa139d30b7ed5820b009bd: Status 404 returned error can't find the container with id d96093d1fe48916217c9b1c06c3719ba880c41a0c0aa139d30b7ed5820b009bd Apr 23 13:51:11.380566 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:11.380527 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" event={"ID":"f4b89125-1e64-490b-9d91-2d7c4d9dcccf","Type":"ContainerStarted","Data":"0b9ada0316e99d4d5c0b8997803eb6ab770af8521f2f0345a05a85299a934769"} Apr 23 13:51:11.380566 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:11.380567 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" event={"ID":"f4b89125-1e64-490b-9d91-2d7c4d9dcccf","Type":"ContainerStarted","Data":"d96093d1fe48916217c9b1c06c3719ba880c41a0c0aa139d30b7ed5820b009bd"} Apr 23 13:51:14.392689 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:14.392657 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj_f4b89125-1e64-490b-9d91-2d7c4d9dcccf/storage-initializer/0.log" Apr 23 13:51:14.393082 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:14.392698 2569 generic.go:358] "Generic (PLEG): container finished" podID="f4b89125-1e64-490b-9d91-2d7c4d9dcccf" containerID="0b9ada0316e99d4d5c0b8997803eb6ab770af8521f2f0345a05a85299a934769" exitCode=1 Apr 23 13:51:14.393082 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:14.392780 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" event={"ID":"f4b89125-1e64-490b-9d91-2d7c4d9dcccf","Type":"ContainerDied","Data":"0b9ada0316e99d4d5c0b8997803eb6ab770af8521f2f0345a05a85299a934769"} Apr 23 13:51:15.398135 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:15.398104 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj_f4b89125-1e64-490b-9d91-2d7c4d9dcccf/storage-initializer/0.log" Apr 23 13:51:15.398669 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:15.398165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" event={"ID":"f4b89125-1e64-490b-9d91-2d7c4d9dcccf","Type":"ContainerStarted","Data":"3dc765f1d94b4bda6dc777877d01c41f59a73c9ea42c38ac973e857c21e3285a"} Apr 23 13:51:21.420909 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:21.420879 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj_f4b89125-1e64-490b-9d91-2d7c4d9dcccf/storage-initializer/1.log" Apr 23 13:51:21.421330 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:21.421200 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj_f4b89125-1e64-490b-9d91-2d7c4d9dcccf/storage-initializer/0.log" Apr 23 13:51:21.421330 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:21.421233 2569 generic.go:358] "Generic (PLEG): container finished" podID="f4b89125-1e64-490b-9d91-2d7c4d9dcccf" containerID="3dc765f1d94b4bda6dc777877d01c41f59a73c9ea42c38ac973e857c21e3285a" exitCode=1 Apr 23 13:51:21.421330 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:21.421288 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" event={"ID":"f4b89125-1e64-490b-9d91-2d7c4d9dcccf","Type":"ContainerDied","Data":"3dc765f1d94b4bda6dc777877d01c41f59a73c9ea42c38ac973e857c21e3285a"} Apr 23 13:51:21.421330 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:21.421328 2569 scope.go:117] "RemoveContainer" containerID="0b9ada0316e99d4d5c0b8997803eb6ab770af8521f2f0345a05a85299a934769" Apr 23 13:51:21.421749 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:21.421731 2569 scope.go:117] "RemoveContainer" containerID="0b9ada0316e99d4d5c0b8997803eb6ab770af8521f2f0345a05a85299a934769" Apr 23 13:51:21.432290 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:51:21.432254 2569 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj_kserve-ci-e2e-test_f4b89125-1e64-490b-9d91-2d7c4d9dcccf_0 in pod sandbox d96093d1fe48916217c9b1c06c3719ba880c41a0c0aa139d30b7ed5820b009bd from index: no such id: '0b9ada0316e99d4d5c0b8997803eb6ab770af8521f2f0345a05a85299a934769'" 
containerID="0b9ada0316e99d4d5c0b8997803eb6ab770af8521f2f0345a05a85299a934769" Apr 23 13:51:21.432366 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:51:21.432312 2569 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj_kserve-ci-e2e-test_f4b89125-1e64-490b-9d91-2d7c4d9dcccf_0 in pod sandbox d96093d1fe48916217c9b1c06c3719ba880c41a0c0aa139d30b7ed5820b009bd from index: no such id: '0b9ada0316e99d4d5c0b8997803eb6ab770af8521f2f0345a05a85299a934769'; Skipping pod \"isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj_kserve-ci-e2e-test(f4b89125-1e64-490b-9d91-2d7c4d9dcccf)\"" logger="UnhandledError" Apr 23 13:51:21.433651 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:51:21.433633 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj_kserve-ci-e2e-test(f4b89125-1e64-490b-9d91-2d7c4d9dcccf)\"" pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" podUID="f4b89125-1e64-490b-9d91-2d7c4d9dcccf" Apr 23 13:51:22.425824 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:22.425796 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj_f4b89125-1e64-490b-9d91-2d7c4d9dcccf/storage-initializer/1.log" Apr 23 13:51:26.035139 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.035094 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw"] Apr 23 13:51:26.035615 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.035526 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kserve-container" containerID="cri-o://86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de" gracePeriod=30 Apr 23 13:51:26.035615 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.035566 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kube-rbac-proxy" containerID="cri-o://c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27" gracePeriod=30 Apr 23 13:51:26.098408 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.098372 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj"] Apr 23 13:51:26.180174 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.179674 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2"] Apr 23 13:51:26.184804 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.184776 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.187816 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.187662 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-223bfc-predictor-serving-cert\"" Apr 23 13:51:26.187816 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.187821 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-223bfc\"" Apr 23 13:51:26.188211 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.188045 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-223bfc-dockercfg-d5kxh\"" Apr 23 13:51:26.188301 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.188282 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-223bfc-kube-rbac-proxy-sar-config\"" Apr 23 13:51:26.195083 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.194722 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2"] Apr 23 13:51:26.252953 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.252920 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blf8p\" (UniqueName: \"kubernetes.io/projected/08252d44-397b-4cac-bf76-e4f024214bbc-kube-api-access-blf8p\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.253162 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.252995 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/08252d44-397b-4cac-bf76-e4f024214bbc-cabundle-cert\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.253162 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.253084 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08252d44-397b-4cac-bf76-e4f024214bbc-kserve-provision-location\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.253162 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.253128 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08252d44-397b-4cac-bf76-e4f024214bbc-proxy-tls\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.253162 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.253156 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-223bfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/08252d44-397b-4cac-bf76-e4f024214bbc-isvc-init-fail-223bfc-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" 
(UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.254609 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.254588 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj_f4b89125-1e64-490b-9d91-2d7c4d9dcccf/storage-initializer/1.log" Apr 23 13:51:26.254699 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.254651 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:26.354246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354149 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-cabundle-cert\") pod \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " Apr 23 13:51:26.354246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354207 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-2a679f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-isvc-secondary-2a679f-kube-rbac-proxy-sar-config\") pod \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " Apr 23 13:51:26.354246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354242 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-kserve-provision-location\") pod \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " Apr 23 13:51:26.354530 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354370 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-proxy-tls\") pod \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " Apr 23 13:51:26.354530 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354488 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x4x5\" (UniqueName: \"kubernetes.io/projected/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-kube-api-access-7x4x5\") pod \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\" (UID: \"f4b89125-1e64-490b-9d91-2d7c4d9dcccf\") " Apr 23 13:51:26.354530 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354517 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f4b89125-1e64-490b-9d91-2d7c4d9dcccf" (UID: "f4b89125-1e64-490b-9d91-2d7c4d9dcccf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:26.354686 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354631 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "f4b89125-1e64-490b-9d91-2d7c4d9dcccf" (UID: "f4b89125-1e64-490b-9d91-2d7c4d9dcccf"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:51:26.354686 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354646 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blf8p\" (UniqueName: \"kubernetes.io/projected/08252d44-397b-4cac-bf76-e4f024214bbc-kube-api-access-blf8p\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.354686 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354653 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-isvc-secondary-2a679f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-2a679f-kube-rbac-proxy-sar-config") pod "f4b89125-1e64-490b-9d91-2d7c4d9dcccf" (UID: "f4b89125-1e64-490b-9d91-2d7c4d9dcccf"). InnerVolumeSpecName "isvc-secondary-2a679f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:51:26.354844 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/08252d44-397b-4cac-bf76-e4f024214bbc-cabundle-cert\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.354844 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354800 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08252d44-397b-4cac-bf76-e4f024214bbc-kserve-provision-location\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.354973 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354844 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08252d44-397b-4cac-bf76-e4f024214bbc-proxy-tls\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.354973 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354878 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-223bfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/08252d44-397b-4cac-bf76-e4f024214bbc-isvc-init-fail-223bfc-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.354973 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354947 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-cabundle-cert\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:26.354973 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354965 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-2a679f-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-isvc-secondary-2a679f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:26.355234 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.354981 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-kserve-provision-location\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:26.355438 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.355409 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/08252d44-397b-4cac-bf76-e4f024214bbc-cabundle-cert\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.355756 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.355730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08252d44-397b-4cac-bf76-e4f024214bbc-kserve-provision-location\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.356879 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.356854 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f4b89125-1e64-490b-9d91-2d7c4d9dcccf" (UID: "f4b89125-1e64-490b-9d91-2d7c4d9dcccf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:51:26.356879 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.356870 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-kube-api-access-7x4x5" (OuterVolumeSpecName: "kube-api-access-7x4x5") pod "f4b89125-1e64-490b-9d91-2d7c4d9dcccf" (UID: "f4b89125-1e64-490b-9d91-2d7c4d9dcccf"). InnerVolumeSpecName "kube-api-access-7x4x5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:51:26.357452 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.357433 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-223bfc-kube-rbac-proxy-sar-config\"" Apr 23 13:51:26.357559 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.357544 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-223bfc-predictor-serving-cert\"" Apr 23 13:51:26.364201 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.364173 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blf8p\" (UniqueName: \"kubernetes.io/projected/08252d44-397b-4cac-bf76-e4f024214bbc-kube-api-access-blf8p\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.365735 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:51:26.365709 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-serving-cert: secret "isvc-init-fail-223bfc-predictor-serving-cert" not found Apr 23 13:51:26.365859 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:51:26.365782 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08252d44-397b-4cac-bf76-e4f024214bbc-proxy-tls podName:08252d44-397b-4cac-bf76-e4f024214bbc nodeName:}" failed. No retries permitted until 2026-04-23 13:51:26.865761061 +0000 UTC m=+1141.020683511 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/08252d44-397b-4cac-bf76-e4f024214bbc-proxy-tls") pod "isvc-init-fail-223bfc-predictor-86594bf96-xczs2" (UID: "08252d44-397b-4cac-bf76-e4f024214bbc") : secret "isvc-init-fail-223bfc-predictor-serving-cert" not found Apr 23 13:51:26.366354 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.366330 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-223bfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/08252d44-397b-4cac-bf76-e4f024214bbc-isvc-init-fail-223bfc-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.441337 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.441308 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj_f4b89125-1e64-490b-9d91-2d7c4d9dcccf/storage-initializer/1.log" Apr 23 13:51:26.441491 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.441443 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" event={"ID":"f4b89125-1e64-490b-9d91-2d7c4d9dcccf","Type":"ContainerDied","Data":"d96093d1fe48916217c9b1c06c3719ba880c41a0c0aa139d30b7ed5820b009bd"} Apr 23 13:51:26.441534 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.441502 2569 scope.go:117] "RemoveContainer" containerID="3dc765f1d94b4bda6dc777877d01c41f59a73c9ea42c38ac973e857c21e3285a" Apr 23 13:51:26.441534 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.441460 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj" Apr 23 13:51:26.443869 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.443835 2569 generic.go:358] "Generic (PLEG): container finished" podID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerID="c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27" exitCode=2 Apr 23 13:51:26.444044 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.443926 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" event={"ID":"da669fe3-20f7-405f-8b6a-c75eeba0766a","Type":"ContainerDied","Data":"c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27"} Apr 23 13:51:26.456352 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.456326 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:26.456352 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.456353 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7x4x5\" (UniqueName: \"kubernetes.io/projected/f4b89125-1e64-490b-9d91-2d7c4d9dcccf-kube-api-access-7x4x5\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:26.476118 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.476073 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj"] Apr 23 13:51:26.481333 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.481300 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-2a679f-predictor-555cd5b5f8-65tqj"] Apr 23 13:51:26.960519 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.960459 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08252d44-397b-4cac-bf76-e4f024214bbc-proxy-tls\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:26.963152 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:26.963130 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08252d44-397b-4cac-bf76-e4f024214bbc-proxy-tls\") pod \"isvc-init-fail-223bfc-predictor-86594bf96-xczs2\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:27.103346 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:27.103308 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-223bfc-dockercfg-d5kxh\"" Apr 23 13:51:27.111491 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:27.111443 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:27.117980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:27.117936 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.41:8643/healthz\": dial tcp 10.134.0.41:8643: connect: connection refused" Apr 23 13:51:27.245194 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:27.245117 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2"] Apr 23 13:51:27.248197 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:51:27.248163 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08252d44_397b_4cac_bf76_e4f024214bbc.slice/crio-e9abc0e6837af5c542f11dbaf53678696060e3477854af11354b8c1c315cbf39 WatchSource:0}: Error finding container e9abc0e6837af5c542f11dbaf53678696060e3477854af11354b8c1c315cbf39: Status 404 returned error can't find the container with id e9abc0e6837af5c542f11dbaf53678696060e3477854af11354b8c1c315cbf39 Apr 23 13:51:27.253011 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:27.252985 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-223bfc\"" Apr 23 13:51:27.449908 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:27.449865 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" event={"ID":"08252d44-397b-4cac-bf76-e4f024214bbc","Type":"ContainerStarted","Data":"b953276618645fa4014e05108da67b626d99dbedab5cf380c02f783cbaaadb79"} Apr 23 13:51:27.449908 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:27.449911 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" event={"ID":"08252d44-397b-4cac-bf76-e4f024214bbc","Type":"ContainerStarted","Data":"e9abc0e6837af5c542f11dbaf53678696060e3477854af11354b8c1c315cbf39"} Apr 23 13:51:28.421522 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:28.421486 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b89125-1e64-490b-9d91-2d7c4d9dcccf" path="/var/lib/kubelet/pods/f4b89125-1e64-490b-9d91-2d7c4d9dcccf/volumes" Apr 23 13:51:30.881292 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:30.881266 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:51:30.995366 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:30.995272 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da669fe3-20f7-405f-8b6a-c75eeba0766a-proxy-tls\") pod \"da669fe3-20f7-405f-8b6a-c75eeba0766a\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " Apr 23 13:51:30.995366 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:30.995334 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da669fe3-20f7-405f-8b6a-c75eeba0766a-kserve-provision-location\") pod \"da669fe3-20f7-405f-8b6a-c75eeba0766a\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " Apr 23 13:51:30.995366 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:30.995356 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsmgk\" (UniqueName: \"kubernetes.io/projected/da669fe3-20f7-405f-8b6a-c75eeba0766a-kube-api-access-qsmgk\") pod \"da669fe3-20f7-405f-8b6a-c75eeba0766a\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " Apr 23 13:51:30.995606 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:30.995436 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-2a679f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da669fe3-20f7-405f-8b6a-c75eeba0766a-isvc-primary-2a679f-kube-rbac-proxy-sar-config\") pod \"da669fe3-20f7-405f-8b6a-c75eeba0766a\" (UID: \"da669fe3-20f7-405f-8b6a-c75eeba0766a\") " Apr 23 13:51:30.995734 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:30.995705 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da669fe3-20f7-405f-8b6a-c75eeba0766a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "da669fe3-20f7-405f-8b6a-c75eeba0766a" (UID: "da669fe3-20f7-405f-8b6a-c75eeba0766a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:30.995805 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:30.995786 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da669fe3-20f7-405f-8b6a-c75eeba0766a-isvc-primary-2a679f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-2a679f-kube-rbac-proxy-sar-config") pod "da669fe3-20f7-405f-8b6a-c75eeba0766a" (UID: "da669fe3-20f7-405f-8b6a-c75eeba0766a"). InnerVolumeSpecName "isvc-primary-2a679f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:51:30.997744 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:30.997717 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da669fe3-20f7-405f-8b6a-c75eeba0766a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "da669fe3-20f7-405f-8b6a-c75eeba0766a" (UID: "da669fe3-20f7-405f-8b6a-c75eeba0766a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:51:30.997744 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:30.997731 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da669fe3-20f7-405f-8b6a-c75eeba0766a-kube-api-access-qsmgk" (OuterVolumeSpecName: "kube-api-access-qsmgk") pod "da669fe3-20f7-405f-8b6a-c75eeba0766a" (UID: "da669fe3-20f7-405f-8b6a-c75eeba0766a"). 
InnerVolumeSpecName "kube-api-access-qsmgk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:51:31.096281 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.096240 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-2a679f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da669fe3-20f7-405f-8b6a-c75eeba0766a-isvc-primary-2a679f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:31.096281 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.096273 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da669fe3-20f7-405f-8b6a-c75eeba0766a-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:31.096281 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.096283 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da669fe3-20f7-405f-8b6a-c75eeba0766a-kserve-provision-location\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:31.096281 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.096292 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qsmgk\" (UniqueName: \"kubernetes.io/projected/da669fe3-20f7-405f-8b6a-c75eeba0766a-kube-api-access-qsmgk\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:31.468325 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.468278 2569 generic.go:358] "Generic (PLEG): container finished" podID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerID="86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de" exitCode=0 Apr 23 13:51:31.468515 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.468361 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" Apr 23 13:51:31.468515 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.468361 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" event={"ID":"da669fe3-20f7-405f-8b6a-c75eeba0766a","Type":"ContainerDied","Data":"86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de"} Apr 23 13:51:31.468515 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.468404 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw" event={"ID":"da669fe3-20f7-405f-8b6a-c75eeba0766a","Type":"ContainerDied","Data":"0f32e28681b3e52bfd7eda7658c3694fc154b251c06a05090b9db3dada661f5d"} Apr 23 13:51:31.468515 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.468420 2569 scope.go:117] "RemoveContainer" containerID="c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27" Apr 23 13:51:31.477475 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.477455 2569 scope.go:117] "RemoveContainer" containerID="86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de" Apr 23 13:51:31.485960 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.485938 2569 scope.go:117] "RemoveContainer" containerID="173c6959d0590e40855511513b80e3211849cb3fc0242dea0427ca9e0fda4836" Apr 23 13:51:31.491029 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.490997 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw"] Apr 23 13:51:31.494042 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.494013 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-2a679f-predictor-f5ddf7b9c-l2jxw"] Apr 23 13:51:31.495180 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.495164 2569 scope.go:117] "RemoveContainer" containerID="c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27" Apr 23 13:51:31.495502 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:51:31.495477 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27\": container with ID starting with c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27 not found: ID does not exist" containerID="c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27" Apr 23 13:51:31.495581 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.495518 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27"} err="failed to get container status \"c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27\": rpc error: code = NotFound desc = could not find container \"c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27\": container with ID starting with c4a7fb0e2f0dd154f02f69a2ef65a8f8650435623214a04014983eeb80319e27 not found: ID does not exist" Apr 23 13:51:31.495581 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.495548 2569 scope.go:117] "RemoveContainer" containerID="86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de" Apr 23 13:51:31.495846 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:51:31.495829 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de\": container with ID starting with 86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de not found: ID does not exist" containerID="86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de" Apr 23 13:51:31.495910 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.495855 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de"} err="failed to get container status \"86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de\": rpc error: code = NotFound desc = could not find container \"86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de\": container with ID starting with 86675f3cd443252ada2431a56c6eb2fe1f53bb8fdb91ef7cbbf7162bcf33c1de not found: ID does not exist" Apr 23 13:51:31.495910 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.495879 2569 scope.go:117] "RemoveContainer" containerID="173c6959d0590e40855511513b80e3211849cb3fc0242dea0427ca9e0fda4836" Apr 23 13:51:31.496127 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:51:31.496110 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173c6959d0590e40855511513b80e3211849cb3fc0242dea0427ca9e0fda4836\": container with ID starting with 173c6959d0590e40855511513b80e3211849cb3fc0242dea0427ca9e0fda4836 not found: ID does not exist" containerID="173c6959d0590e40855511513b80e3211849cb3fc0242dea0427ca9e0fda4836" Apr 23 13:51:31.496204 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:31.496135 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173c6959d0590e40855511513b80e3211849cb3fc0242dea0427ca9e0fda4836"} err="failed to get container status \"173c6959d0590e40855511513b80e3211849cb3fc0242dea0427ca9e0fda4836\": rpc error: code = NotFound desc = could not find container \"173c6959d0590e40855511513b80e3211849cb3fc0242dea0427ca9e0fda4836\": container with ID starting with 173c6959d0590e40855511513b80e3211849cb3fc0242dea0427ca9e0fda4836 not found: ID does not exist" Apr 23 13:51:32.420229 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:32.420194 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" path="/var/lib/kubelet/pods/da669fe3-20f7-405f-8b6a-c75eeba0766a/volumes" Apr 23 13:51:34.481758 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:34.481731 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-223bfc-predictor-86594bf96-xczs2_08252d44-397b-4cac-bf76-e4f024214bbc/storage-initializer/0.log" Apr 23 13:51:34.482222 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:34.481768 2569 generic.go:358] "Generic (PLEG): container finished" podID="08252d44-397b-4cac-bf76-e4f024214bbc" containerID="b953276618645fa4014e05108da67b626d99dbedab5cf380c02f783cbaaadb79" exitCode=1 Apr 23 13:51:34.482222 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:34.481850 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" event={"ID":"08252d44-397b-4cac-bf76-e4f024214bbc","Type":"ContainerDied","Data":"b953276618645fa4014e05108da67b626d99dbedab5cf380c02f783cbaaadb79"} Apr 23 13:51:35.489631 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:35.489600 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-223bfc-predictor-86594bf96-xczs2_08252d44-397b-4cac-bf76-e4f024214bbc/storage-initializer/0.log" Apr 23 13:51:35.490146 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:35.489664 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" event={"ID":"08252d44-397b-4cac-bf76-e4f024214bbc","Type":"ContainerStarted","Data":"f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a"} Apr 23 13:51:36.218284 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.218253 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2"] Apr 23 13:51:36.492992 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.492869 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" podUID="08252d44-397b-4cac-bf76-e4f024214bbc" containerName="storage-initializer" containerID="cri-o://f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a" gracePeriod=30 Apr 23 13:51:36.940477 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.940453 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-223bfc-predictor-86594bf96-xczs2_08252d44-397b-4cac-bf76-e4f024214bbc/storage-initializer/1.log" Apr 23 13:51:36.940832 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.940816 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-223bfc-predictor-86594bf96-xczs2_08252d44-397b-4cac-bf76-e4f024214bbc/storage-initializer/0.log" Apr 23 13:51:36.940896 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.940881 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:36.984356 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984324 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt"] Apr 23 13:51:36.984649 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984637 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08252d44-397b-4cac-bf76-e4f024214bbc" containerName="storage-initializer" Apr 23 13:51:36.984703 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984651 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="08252d44-397b-4cac-bf76-e4f024214bbc" containerName="storage-initializer" Apr 23 13:51:36.984703 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984666 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="storage-initializer" Apr 23 13:51:36.984703 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984672 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="storage-initializer" Apr 23 13:51:36.984703 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984682 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kube-rbac-proxy" Apr 23 13:51:36.984703 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984687 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kube-rbac-proxy" Apr 23 13:51:36.984703 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984697 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kserve-container" Apr 23 13:51:36.984703 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984702 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kserve-container" Apr 23 13:51:36.984703 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984708 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4b89125-1e64-490b-9d91-2d7c4d9dcccf" containerName="storage-initializer" Apr 23 13:51:36.984980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984713 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b89125-1e64-490b-9d91-2d7c4d9dcccf" containerName="storage-initializer" Apr 23 13:51:36.984980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984722 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4b89125-1e64-490b-9d91-2d7c4d9dcccf" containerName="storage-initializer" Apr 23 13:51:36.984980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984728 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b89125-1e64-490b-9d91-2d7c4d9dcccf" containerName="storage-initializer" Apr 23 13:51:36.984980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984775 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4b89125-1e64-490b-9d91-2d7c4d9dcccf" containerName="storage-initializer" Apr 23 13:51:36.984980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984783 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="08252d44-397b-4cac-bf76-e4f024214bbc" containerName="storage-initializer" Apr 23 13:51:36.984980 ip-10-0-136-158 kubenswrapper[2569]: I0423 
13:51:36.984790 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4b89125-1e64-490b-9d91-2d7c4d9dcccf" containerName="storage-initializer" Apr 23 13:51:36.984980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984797 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kube-rbac-proxy" Apr 23 13:51:36.984980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984804 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="da669fe3-20f7-405f-8b6a-c75eeba0766a" containerName="kserve-container" Apr 23 13:51:36.984980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984855 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08252d44-397b-4cac-bf76-e4f024214bbc" containerName="storage-initializer" Apr 23 13:51:36.984980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984861 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="08252d44-397b-4cac-bf76-e4f024214bbc" containerName="storage-initializer" Apr 23 13:51:36.984980 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.984916 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="08252d44-397b-4cac-bf76-e4f024214bbc" containerName="storage-initializer" Apr 23 13:51:36.988051 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.988025 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:36.990733 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.990712 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-blqr2\"" Apr 23 13:51:36.990860 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.990712 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-ed1e5-predictor-serving-cert\"" Apr 23 13:51:36.990860 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.990712 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-ed1e5-kube-rbac-proxy-sar-config\"" Apr 23 13:51:36.996118 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:36.996093 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt"] Apr 23 13:51:37.047285 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047184 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08252d44-397b-4cac-bf76-e4f024214bbc-proxy-tls\") pod \"08252d44-397b-4cac-bf76-e4f024214bbc\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " Apr 23 13:51:37.047285 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047247 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-223bfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/08252d44-397b-4cac-bf76-e4f024214bbc-isvc-init-fail-223bfc-kube-rbac-proxy-sar-config\") pod \"08252d44-397b-4cac-bf76-e4f024214bbc\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " Apr 23 13:51:37.047285 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047269 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/08252d44-397b-4cac-bf76-e4f024214bbc-cabundle-cert\") pod \"08252d44-397b-4cac-bf76-e4f024214bbc\" (UID: 
\"08252d44-397b-4cac-bf76-e4f024214bbc\") " Apr 23 13:51:37.047573 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047402 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08252d44-397b-4cac-bf76-e4f024214bbc-kserve-provision-location\") pod \"08252d44-397b-4cac-bf76-e4f024214bbc\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " Apr 23 13:51:37.047573 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047461 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blf8p\" (UniqueName: \"kubernetes.io/projected/08252d44-397b-4cac-bf76-e4f024214bbc-kube-api-access-blf8p\") pod \"08252d44-397b-4cac-bf76-e4f024214bbc\" (UID: \"08252d44-397b-4cac-bf76-e4f024214bbc\") " Apr 23 13:51:37.047682 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047621 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7ht2\" (UniqueName: \"kubernetes.io/projected/f6142f85-8f3b-40f6-92d3-9758f42d36ba-kube-api-access-c7ht2\") pod \"raw-sklearn-ed1e5-predictor-76464db46d-qpzqt\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") " pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.047682 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047647 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08252d44-397b-4cac-bf76-e4f024214bbc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "08252d44-397b-4cac-bf76-e4f024214bbc" (UID: "08252d44-397b-4cac-bf76-e4f024214bbc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:37.047682 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047654 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08252d44-397b-4cac-bf76-e4f024214bbc-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "08252d44-397b-4cac-bf76-e4f024214bbc" (UID: "08252d44-397b-4cac-bf76-e4f024214bbc"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:51:37.047682 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047669 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6142f85-8f3b-40f6-92d3-9758f42d36ba-kserve-provision-location\") pod \"raw-sklearn-ed1e5-predictor-76464db46d-qpzqt\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") " pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.047682 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047669 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08252d44-397b-4cac-bf76-e4f024214bbc-isvc-init-fail-223bfc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-223bfc-kube-rbac-proxy-sar-config") pod "08252d44-397b-4cac-bf76-e4f024214bbc" (UID: "08252d44-397b-4cac-bf76-e4f024214bbc"). InnerVolumeSpecName "isvc-init-fail-223bfc-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:51:37.047932 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047755 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-ed1e5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f6142f85-8f3b-40f6-92d3-9758f42d36ba-raw-sklearn-ed1e5-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-ed1e5-predictor-76464db46d-qpzqt\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") " pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.047932 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047850 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6142f85-8f3b-40f6-92d3-9758f42d36ba-proxy-tls\") pod \"raw-sklearn-ed1e5-predictor-76464db46d-qpzqt\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") " pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.047932 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047919 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-223bfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/08252d44-397b-4cac-bf76-e4f024214bbc-isvc-init-fail-223bfc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:37.047932 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047932 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/08252d44-397b-4cac-bf76-e4f024214bbc-cabundle-cert\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:37.048090 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.047942 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08252d44-397b-4cac-bf76-e4f024214bbc-kserve-provision-location\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:37.049559 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.049536 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08252d44-397b-4cac-bf76-e4f024214bbc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "08252d44-397b-4cac-bf76-e4f024214bbc" (UID: "08252d44-397b-4cac-bf76-e4f024214bbc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:51:37.049674 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.049652 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08252d44-397b-4cac-bf76-e4f024214bbc-kube-api-access-blf8p" (OuterVolumeSpecName: "kube-api-access-blf8p") pod "08252d44-397b-4cac-bf76-e4f024214bbc" (UID: "08252d44-397b-4cac-bf76-e4f024214bbc"). InnerVolumeSpecName "kube-api-access-blf8p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:51:37.148517 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.148477 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7ht2\" (UniqueName: \"kubernetes.io/projected/f6142f85-8f3b-40f6-92d3-9758f42d36ba-kube-api-access-c7ht2\") pod \"raw-sklearn-ed1e5-predictor-76464db46d-qpzqt\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") " pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.148517 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.148523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6142f85-8f3b-40f6-92d3-9758f42d36ba-kserve-provision-location\") pod \"raw-sklearn-ed1e5-predictor-76464db46d-qpzqt\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") " pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.148736 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.148544 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-ed1e5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f6142f85-8f3b-40f6-92d3-9758f42d36ba-raw-sklearn-ed1e5-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-ed1e5-predictor-76464db46d-qpzqt\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") " pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.148736 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.148582 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6142f85-8f3b-40f6-92d3-9758f42d36ba-proxy-tls\") pod \"raw-sklearn-ed1e5-predictor-76464db46d-qpzqt\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") " pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.148736 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.148626 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-blf8p\" (UniqueName: \"kubernetes.io/projected/08252d44-397b-4cac-bf76-e4f024214bbc-kube-api-access-blf8p\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:37.148736 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.148637 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08252d44-397b-4cac-bf76-e4f024214bbc-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 23 13:51:37.148942 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.148922 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6142f85-8f3b-40f6-92d3-9758f42d36ba-kserve-provision-location\") pod \"raw-sklearn-ed1e5-predictor-76464db46d-qpzqt\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") " pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.149281 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.149261 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-ed1e5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f6142f85-8f3b-40f6-92d3-9758f42d36ba-raw-sklearn-ed1e5-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-ed1e5-predictor-76464db46d-qpzqt\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") " pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.151099 
ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.151084 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6142f85-8f3b-40f6-92d3-9758f42d36ba-proxy-tls\") pod \"raw-sklearn-ed1e5-predictor-76464db46d-qpzqt\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") " pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.155837 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.155818 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7ht2\" (UniqueName: \"kubernetes.io/projected/f6142f85-8f3b-40f6-92d3-9758f42d36ba-kube-api-access-c7ht2\") pod \"raw-sklearn-ed1e5-predictor-76464db46d-qpzqt\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") " pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.299270 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.299174 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:37.426151 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.426126 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt"] Apr 23 13:51:37.427769 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:51:37.427741 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6142f85_8f3b_40f6_92d3_9758f42d36ba.slice/crio-181045a8d337af9ed31be9a24351d70c2af35d5076404c7a84236789151e37ea WatchSource:0}: Error finding container 181045a8d337af9ed31be9a24351d70c2af35d5076404c7a84236789151e37ea: Status 404 returned error can't find the container with id 181045a8d337af9ed31be9a24351d70c2af35d5076404c7a84236789151e37ea Apr 23 13:51:37.499267 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.499243 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-223bfc-predictor-86594bf96-xczs2_08252d44-397b-4cac-bf76-e4f024214bbc/storage-initializer/1.log" Apr 23 13:51:37.499646 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.499628 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-223bfc-predictor-86594bf96-xczs2_08252d44-397b-4cac-bf76-e4f024214bbc/storage-initializer/0.log" Apr 23 13:51:37.499696 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.499672 2569 generic.go:358] "Generic (PLEG): container finished" podID="08252d44-397b-4cac-bf76-e4f024214bbc" containerID="f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a" exitCode=1 Apr 23 13:51:37.499806 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.499784 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" event={"ID":"08252d44-397b-4cac-bf76-e4f024214bbc","Type":"ContainerDied","Data":"f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a"} Apr 23 13:51:37.499889 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.499800 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" Apr 23 13:51:37.499889 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.499824 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2" event={"ID":"08252d44-397b-4cac-bf76-e4f024214bbc","Type":"ContainerDied","Data":"e9abc0e6837af5c542f11dbaf53678696060e3477854af11354b8c1c315cbf39"} Apr 23 13:51:37.499889 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.499849 2569 scope.go:117] "RemoveContainer" containerID="f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a" Apr 23 13:51:37.501311 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.501286 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" event={"ID":"f6142f85-8f3b-40f6-92d3-9758f42d36ba","Type":"ContainerStarted","Data":"d117df0490fc0f964786f35ac6f056507e6b8dd080c42a7d35ec3f07afcccf00"} Apr 23 13:51:37.501417 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.501317 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" event={"ID":"f6142f85-8f3b-40f6-92d3-9758f42d36ba","Type":"ContainerStarted","Data":"181045a8d337af9ed31be9a24351d70c2af35d5076404c7a84236789151e37ea"} Apr 23 13:51:37.508035 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.507802 2569 scope.go:117] "RemoveContainer" containerID="b953276618645fa4014e05108da67b626d99dbedab5cf380c02f783cbaaadb79" Apr 23 13:51:37.515917 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.515901 2569 scope.go:117] "RemoveContainer" containerID="f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a" Apr 23 13:51:37.516221 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:51:37.516203 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a\": container with ID starting with f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a not found: ID does not exist" containerID="f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a" Apr 23 13:51:37.516270 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.516229 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a"} err="failed to get container status \"f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a\": rpc error: code = NotFound desc = could not find container \"f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a\": container with ID starting with f0b5cff33f33cdd5a014ecc51852e30a120e412d128101c2363e459938e5f57a not found: ID does not exist" Apr 23 13:51:37.516270 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.516247 2569 scope.go:117] "RemoveContainer" containerID="b953276618645fa4014e05108da67b626d99dbedab5cf380c02f783cbaaadb79" Apr 23 13:51:37.516469 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:51:37.516454 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b953276618645fa4014e05108da67b626d99dbedab5cf380c02f783cbaaadb79\": container with ID starting with b953276618645fa4014e05108da67b626d99dbedab5cf380c02f783cbaaadb79 not found: ID does not exist" containerID="b953276618645fa4014e05108da67b626d99dbedab5cf380c02f783cbaaadb79" Apr 23 
13:51:37.516510 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.516473 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b953276618645fa4014e05108da67b626d99dbedab5cf380c02f783cbaaadb79"} err="failed to get container status \"b953276618645fa4014e05108da67b626d99dbedab5cf380c02f783cbaaadb79\": rpc error: code = NotFound desc = could not find container \"b953276618645fa4014e05108da67b626d99dbedab5cf380c02f783cbaaadb79\": container with ID starting with b953276618645fa4014e05108da67b626d99dbedab5cf380c02f783cbaaadb79 not found: ID does not exist" Apr 23 13:51:37.552599 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.552489 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2"] Apr 23 13:51:37.554260 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:37.554238 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-223bfc-predictor-86594bf96-xczs2"] Apr 23 13:51:38.422132 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:38.422098 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08252d44-397b-4cac-bf76-e4f024214bbc" path="/var/lib/kubelet/pods/08252d44-397b-4cac-bf76-e4f024214bbc/volumes" Apr 23 13:51:41.517332 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:41.517293 2569 generic.go:358] "Generic (PLEG): container finished" podID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerID="d117df0490fc0f964786f35ac6f056507e6b8dd080c42a7d35ec3f07afcccf00" exitCode=0 Apr 23 13:51:41.517720 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:41.517368 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" event={"ID":"f6142f85-8f3b-40f6-92d3-9758f42d36ba","Type":"ContainerDied","Data":"d117df0490fc0f964786f35ac6f056507e6b8dd080c42a7d35ec3f07afcccf00"} Apr 23 13:51:42.523761 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:42.523725 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" event={"ID":"f6142f85-8f3b-40f6-92d3-9758f42d36ba","Type":"ContainerStarted","Data":"d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490"} Apr 23 13:51:42.523761 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:42.523763 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" event={"ID":"f6142f85-8f3b-40f6-92d3-9758f42d36ba","Type":"ContainerStarted","Data":"fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1"} Apr 23 13:51:42.524308 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:42.524103 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:42.524308 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:42.524226 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:42.525642 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:42.525617 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 23 13:51:42.551006 ip-10-0-136-158 kubenswrapper[2569]: I0423 
13:51:42.550957 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podStartSLOduration=6.550942454 podStartE2EDuration="6.550942454s" podCreationTimestamp="2026-04-23 13:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:51:42.542159384 +0000 UTC m=+1156.697081851" watchObservedRunningTime="2026-04-23 13:51:42.550942454 +0000 UTC m=+1156.705865005" Apr 23 13:51:43.527325 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:43.527272 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 23 13:51:48.532595 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:48.532559 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:51:48.533207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:48.533179 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 23 13:51:58.533487 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:51:58.533438 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 23 13:52:08.533085 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:08.533009 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 23 13:52:18.533653 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:18.533608 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 23 13:52:26.994330 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:26.994220 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/2.log" Apr 23 13:52:27.001847 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:26.995756 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/2.log" Apr 23 13:52:27.001847 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:26.998288 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log" Apr 23 13:52:27.001847 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:26.999896 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log" Apr 23 13:52:28.533869 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:28.533821 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 23 13:52:38.533597 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:38.533546 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 23 13:52:48.534713 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:48.534679 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" Apr 23 13:52:56.464432 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.464381 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt"] Apr 23 13:52:56.464910 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.464830 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container" containerID="cri-o://fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1" gracePeriod=30 Apr 23 13:52:56.465009 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.464984 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kube-rbac-proxy" containerID="cri-o://d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490" gracePeriod=30 Apr 23 13:52:56.596751 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.596714 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2"] Apr 23 13:52:56.600847 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.600820 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.607383 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.607353 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-b69cc-predictor-serving-cert\"" Apr 23 13:52:56.612688 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.612663 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config\"" Apr 23 13:52:56.618387 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.618359 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2"] Apr 23 13:52:56.693257 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.693212 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70a1e324-9b25-4833-8a42-24a835018067-raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.693481 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.693300 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7cxl\" (UniqueName: \"kubernetes.io/projected/70a1e324-9b25-4833-8a42-24a835018067-kube-api-access-p7cxl\") pod \"raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.693481 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.693345 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70a1e324-9b25-4833-8a42-24a835018067-kserve-provision-location\") pod \"raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.693481 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.693373 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70a1e324-9b25-4833-8a42-24a835018067-proxy-tls\") pod \"raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.777459 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.777415 2569 generic.go:358] "Generic (PLEG): container finished" podID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerID="d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490" exitCode=2 Apr 23 13:52:56.777638 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.777465 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" event={"ID":"f6142f85-8f3b-40f6-92d3-9758f42d36ba","Type":"ContainerDied","Data":"d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490"} Apr 23 13:52:56.793819 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.793777 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7cxl\" (UniqueName: \"kubernetes.io/projected/70a1e324-9b25-4833-8a42-24a835018067-kube-api-access-p7cxl\") pod \"raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.793940 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.793836 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70a1e324-9b25-4833-8a42-24a835018067-kserve-provision-location\") pod \"raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.793940 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.793866 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70a1e324-9b25-4833-8a42-24a835018067-proxy-tls\") pod \"raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.793940 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.793931 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70a1e324-9b25-4833-8a42-24a835018067-raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.794361 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.794333 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70a1e324-9b25-4833-8a42-24a835018067-kserve-provision-location\") pod \"raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.794626 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.794604 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70a1e324-9b25-4833-8a42-24a835018067-raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.796521 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.796493 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70a1e324-9b25-4833-8a42-24a835018067-proxy-tls\") pod \"raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.801966 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.801941 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7cxl\" (UniqueName: 
\"kubernetes.io/projected/70a1e324-9b25-4833-8a42-24a835018067-kube-api-access-p7cxl\") pod \"raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:56.912511 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:56.912461 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:52:57.048821 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:57.048793 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2"] Apr 23 13:52:57.051682 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:52:57.051647 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70a1e324_9b25_4833_8a42_24a835018067.slice/crio-1d2d6dc623644e244671221f4f7268ca7c7769d5b56c9239f197981273b1cabd WatchSource:0}: Error finding container 1d2d6dc623644e244671221f4f7268ca7c7769d5b56c9239f197981273b1cabd: Status 404 returned error can't find the container with id 1d2d6dc623644e244671221f4f7268ca7c7769d5b56c9239f197981273b1cabd Apr 23 13:52:57.782937 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:57.782898 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" event={"ID":"70a1e324-9b25-4833-8a42-24a835018067","Type":"ContainerStarted","Data":"d90a7bc4800414a7c65fbf2728715e6a7a373b75ff4f853a0d2eea7552979aa1"} Apr 23 13:52:57.782937 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:57.782943 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" event={"ID":"70a1e324-9b25-4833-8a42-24a835018067","Type":"ContainerStarted","Data":"1d2d6dc623644e244671221f4f7268ca7c7769d5b56c9239f197981273b1cabd"} Apr 23 13:52:58.528231 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:58.528185 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.44:8643/healthz\": dial tcp 10.134.0.44:8643: connect: connection refused" Apr 23 13:52:58.533659 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:52:58.533627 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 23 13:53:00.794135 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:00.794085 2569 generic.go:358] "Generic (PLEG): container finished" podID="70a1e324-9b25-4833-8a42-24a835018067" containerID="d90a7bc4800414a7c65fbf2728715e6a7a373b75ff4f853a0d2eea7552979aa1" exitCode=0 Apr 23 13:53:00.794534 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:00.794158 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" event={"ID":"70a1e324-9b25-4833-8a42-24a835018067","Type":"ContainerDied","Data":"d90a7bc4800414a7c65fbf2728715e6a7a373b75ff4f853a0d2eea7552979aa1"} Apr 23 13:53:01.313524 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.313497 2569 util.go:48] "No 
Apr 23 13:53:01.313524 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.313497 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt"
Apr 23 13:53:01.429922 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.429817 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7ht2\" (UniqueName: \"kubernetes.io/projected/f6142f85-8f3b-40f6-92d3-9758f42d36ba-kube-api-access-c7ht2\") pod \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") "
Apr 23 13:53:01.429922 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.429874 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6142f85-8f3b-40f6-92d3-9758f42d36ba-proxy-tls\") pod \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") "
Apr 23 13:53:01.429922 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.429904 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-ed1e5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f6142f85-8f3b-40f6-92d3-9758f42d36ba-raw-sklearn-ed1e5-kube-rbac-proxy-sar-config\") pod \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") "
Apr 23 13:53:01.430267 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.429947 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6142f85-8f3b-40f6-92d3-9758f42d36ba-kserve-provision-location\") pod \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\" (UID: \"f6142f85-8f3b-40f6-92d3-9758f42d36ba\") "
Apr 23 13:53:01.430338 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.430310 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6142f85-8f3b-40f6-92d3-9758f42d36ba-raw-sklearn-ed1e5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-ed1e5-kube-rbac-proxy-sar-config") pod "f6142f85-8f3b-40f6-92d3-9758f42d36ba" (UID: "f6142f85-8f3b-40f6-92d3-9758f42d36ba"). InnerVolumeSpecName "raw-sklearn-ed1e5-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:53:01.430398 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.430347 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6142f85-8f3b-40f6-92d3-9758f42d36ba-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f6142f85-8f3b-40f6-92d3-9758f42d36ba" (UID: "f6142f85-8f3b-40f6-92d3-9758f42d36ba"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:53:01.432207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.432184 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6142f85-8f3b-40f6-92d3-9758f42d36ba-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f6142f85-8f3b-40f6-92d3-9758f42d36ba" (UID: "f6142f85-8f3b-40f6-92d3-9758f42d36ba"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:53:01.432303 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.432284 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6142f85-8f3b-40f6-92d3-9758f42d36ba-kube-api-access-c7ht2" (OuterVolumeSpecName: "kube-api-access-c7ht2") pod "f6142f85-8f3b-40f6-92d3-9758f42d36ba" (UID: "f6142f85-8f3b-40f6-92d3-9758f42d36ba"). InnerVolumeSpecName "kube-api-access-c7ht2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:53:01.530873 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.530832 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c7ht2\" (UniqueName: \"kubernetes.io/projected/f6142f85-8f3b-40f6-92d3-9758f42d36ba-kube-api-access-c7ht2\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:53:01.530873 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.530866 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6142f85-8f3b-40f6-92d3-9758f42d36ba-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:53:01.530873 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.530877 2569 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-ed1e5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f6142f85-8f3b-40f6-92d3-9758f42d36ba-raw-sklearn-ed1e5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:53:01.531159 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.530887 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6142f85-8f3b-40f6-92d3-9758f42d36ba-kserve-provision-location\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:53:01.799535 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.799499 2569 generic.go:358] "Generic (PLEG): container finished" podID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerID="fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1" exitCode=0
Apr 23 13:53:01.800035 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.799594 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt"
Apr 23 13:53:01.800035 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.799587 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" event={"ID":"f6142f85-8f3b-40f6-92d3-9758f42d36ba","Type":"ContainerDied","Data":"fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1"}
Apr 23 13:53:01.800035 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.799709 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt" event={"ID":"f6142f85-8f3b-40f6-92d3-9758f42d36ba","Type":"ContainerDied","Data":"181045a8d337af9ed31be9a24351d70c2af35d5076404c7a84236789151e37ea"}
Apr 23 13:53:01.800035 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.799730 2569 scope.go:117] "RemoveContainer" containerID="d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490"
Apr 23 13:53:01.801623 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.801595 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" event={"ID":"70a1e324-9b25-4833-8a42-24a835018067","Type":"ContainerStarted","Data":"9fdc61419785957bd40665a738c3a38d0841ac5752c9968fd2463f701c5352f4"}
Apr 23 13:53:01.801738 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.801638 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" event={"ID":"70a1e324-9b25-4833-8a42-24a835018067","Type":"ContainerStarted","Data":"b9360e7c48d9438ca4db481670a06d53098bfc5da52b4dbc866d15bc34c0b395"}
Apr 23 13:53:01.801880 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.801863 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2"
Apr 23 13:53:01.808929 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.808910 2569 scope.go:117] "RemoveContainer" containerID="fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1"
Apr 23 13:53:01.816545 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.816524 2569 scope.go:117] "RemoveContainer" containerID="d117df0490fc0f964786f35ac6f056507e6b8dd080c42a7d35ec3f07afcccf00"
Apr 23 13:53:01.829874 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.825171 2569 scope.go:117] "RemoveContainer" containerID="d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490"
Apr 23 13:53:01.829874 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.827103 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" podStartSLOduration=5.82708611 podStartE2EDuration="5.82708611s" podCreationTimestamp="2026-04-23 13:52:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:53:01.822619954 +0000 UTC m=+1235.977542420" watchObservedRunningTime="2026-04-23 13:53:01.82708611 +0000 UTC m=+1235.982008578"
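The podStartSLOduration=5.82708611 above appears to be simple arithmetic: watchObservedRunningTime (13:53:01.82708611) minus podCreationTimestamp (13:52:56). A sketch of the same computation (assumed interpretation of the two fields, using Go's default time format):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Layout matching the timestamps printed in the log entry above.
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, err := time.Parse(layout, "2026-04-23 13:52:56 +0000 UTC")
        if err != nil {
            panic(err)
        }
        running, err := time.Parse(layout, "2026-04-23 13:53:01.82708611 +0000 UTC")
        if err != nil {
            panic(err)
        }
        fmt.Println(running.Sub(created)) // 5.82708611s
    }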
Apr 23 13:53:01.830221 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:53:01.830196 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490\": container with ID starting with d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490 not found: ID does not exist" containerID="d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490"
Apr 23 13:53:01.830402 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.830255 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490"} err="failed to get container status \"d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490\": rpc error: code = NotFound desc = could not find container \"d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490\": container with ID starting with d73416cffa866902c18ff0dcaa670ae1707f361e8969b91d0913e39295a06490 not found: ID does not exist"
Apr 23 13:53:01.830449 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.830414 2569 scope.go:117] "RemoveContainer" containerID="fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1"
Apr 23 13:53:01.830826 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:53:01.830799 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1\": container with ID starting with fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1 not found: ID does not exist" containerID="fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1"
Apr 23 13:53:01.830894 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.830837 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1"} err="failed to get container status \"fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1\": rpc error: code = NotFound desc = could not find container \"fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1\": container with ID starting with fd3528537fe8cf1654e2b5cc93662ae9582cac79209873c1e917113c39d242e1 not found: ID does not exist"
Apr 23 13:53:01.830894 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.830859 2569 scope.go:117] "RemoveContainer" containerID="d117df0490fc0f964786f35ac6f056507e6b8dd080c42a7d35ec3f07afcccf00"
Apr 23 13:53:01.831128 ip-10-0-136-158 kubenswrapper[2569]: E0423 13:53:01.831106 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d117df0490fc0f964786f35ac6f056507e6b8dd080c42a7d35ec3f07afcccf00\": container with ID starting with d117df0490fc0f964786f35ac6f056507e6b8dd080c42a7d35ec3f07afcccf00 not found: ID does not exist" containerID="d117df0490fc0f964786f35ac6f056507e6b8dd080c42a7d35ec3f07afcccf00"
Apr 23 13:53:01.831192 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.831135 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d117df0490fc0f964786f35ac6f056507e6b8dd080c42a7d35ec3f07afcccf00"} err="failed to get container status \"d117df0490fc0f964786f35ac6f056507e6b8dd080c42a7d35ec3f07afcccf00\": rpc error: code = NotFound desc = could not find container \"d117df0490fc0f964786f35ac6f056507e6b8dd080c42a7d35ec3f07afcccf00\": container with ID starting with d117df0490fc0f964786f35ac6f056507e6b8dd080c42a7d35ec3f07afcccf00 not found: ID does not exist"
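The E-level "NotFound" errors above are benign: the containers were already removed, so the status lookup that follows deletion fails, and the deletor logs the error and moves on. The usual way to make such cleanup idempotent is to treat NotFound as success; a sketch of that pattern against a hypothetical gRPC-style runtime client (the RuntimeClient interface here is an assumption, not the real CRI API):

    package main

    import (
        "context"
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // RuntimeClient is a stand-in for a CRI-style runtime API (assumed shape).
    type RuntimeClient interface {
        RemoveContainer(ctx context.Context, id string) error
    }

    // removeIfPresent treats "already gone" as success, so retries and
    // races with the runtime's own garbage collection stay harmless.
    func removeIfPresent(ctx context.Context, rt RuntimeClient, id string) error {
        err := rt.RemoveContainer(ctx, id)
        if err != nil && status.Code(err) != codes.NotFound {
            return err
        }
        return nil
    }

    // fakeRuntime always reports the container as missing, like the log above.
    type fakeRuntime struct{}

    func (fakeRuntime) RemoveContainer(ctx context.Context, id string) error {
        return status.Error(codes.NotFound, "could not find container "+id)
    }

    func main() {
        err := removeIfPresent(context.Background(), fakeRuntime{}, "d73416cf")
        fmt.Println("error:", err) // error: <nil> -- NotFound was tolerated
    }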
pods=["kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt"] Apr 23 13:53:01.871631 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:01.871603 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-ed1e5-predictor-76464db46d-qpzqt"] Apr 23 13:53:02.420872 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:02.420837 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" path="/var/lib/kubelet/pods/f6142f85-8f3b-40f6-92d3-9758f42d36ba/volumes" Apr 23 13:53:02.806875 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:02.806842 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:53:02.808137 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:02.808107 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 23 13:53:03.809590 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:03.809550 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 23 13:53:08.813741 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:08.813708 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:53:08.814207 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:08.814181 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 23 13:53:18.814872 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:18.814822 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 23 13:53:28.814353 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:28.814312 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 23 13:53:38.814410 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:38.814367 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 23 13:53:48.814483 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:48.814427 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" 
podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 23 13:53:58.818924 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:53:58.818876 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 23 13:54:08.814870 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:08.814835 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" Apr 23 13:54:16.637803 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:16.637767 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2"] Apr 23 13:54:16.638311 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:16.638261 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container" containerID="cri-o://b9360e7c48d9438ca4db481670a06d53098bfc5da52b4dbc866d15bc34c0b395" gracePeriod=30 Apr 23 13:54:16.638311 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:16.638305 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kube-rbac-proxy" containerID="cri-o://9fdc61419785957bd40665a738c3a38d0841ac5752c9968fd2463f701c5352f4" gracePeriod=30 Apr 23 13:54:17.056626 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:17.056594 2569 generic.go:358] "Generic (PLEG): container finished" podID="70a1e324-9b25-4833-8a42-24a835018067" containerID="9fdc61419785957bd40665a738c3a38d0841ac5752c9968fd2463f701c5352f4" exitCode=2 Apr 23 13:54:17.056807 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:17.056677 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" event={"ID":"70a1e324-9b25-4833-8a42-24a835018067","Type":"ContainerDied","Data":"9fdc61419785957bd40665a738c3a38d0841ac5752c9968fd2463f701c5352f4"} Apr 23 13:54:18.810047 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:18.810000 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.45:8643/healthz\": dial tcp 10.134.0.45:8643: connect: connection refused" Apr 23 13:54:18.814347 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:18.814311 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 23 13:54:21.073369 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.073324 2569 generic.go:358] "Generic (PLEG): container finished" podID="70a1e324-9b25-4833-8a42-24a835018067" containerID="b9360e7c48d9438ca4db481670a06d53098bfc5da52b4dbc866d15bc34c0b395" exitCode=0 Apr 23 
Apr 23 13:54:21.073755 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.073394 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" event={"ID":"70a1e324-9b25-4833-8a42-24a835018067","Type":"ContainerDied","Data":"b9360e7c48d9438ca4db481670a06d53098bfc5da52b4dbc866d15bc34c0b395"}
Apr 23 13:54:21.188975 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.188951 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2"
Apr 23 13:54:21.302344 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.301541 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70a1e324-9b25-4833-8a42-24a835018067-kserve-provision-location\") pod \"70a1e324-9b25-4833-8a42-24a835018067\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") "
Apr 23 13:54:21.302344 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.301646 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70a1e324-9b25-4833-8a42-24a835018067-proxy-tls\") pod \"70a1e324-9b25-4833-8a42-24a835018067\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") "
Apr 23 13:54:21.302344 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.301679 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70a1e324-9b25-4833-8a42-24a835018067-raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config\") pod \"70a1e324-9b25-4833-8a42-24a835018067\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") "
Apr 23 13:54:21.302344 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.301770 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7cxl\" (UniqueName: \"kubernetes.io/projected/70a1e324-9b25-4833-8a42-24a835018067-kube-api-access-p7cxl\") pod \"70a1e324-9b25-4833-8a42-24a835018067\" (UID: \"70a1e324-9b25-4833-8a42-24a835018067\") "
Apr 23 13:54:21.305404 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.305203 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a1e324-9b25-4833-8a42-24a835018067-kube-api-access-p7cxl" (OuterVolumeSpecName: "kube-api-access-p7cxl") pod "70a1e324-9b25-4833-8a42-24a835018067" (UID: "70a1e324-9b25-4833-8a42-24a835018067"). InnerVolumeSpecName "kube-api-access-p7cxl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:54:21.306193 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.305642 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70a1e324-9b25-4833-8a42-24a835018067-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "70a1e324-9b25-4833-8a42-24a835018067" (UID: "70a1e324-9b25-4833-8a42-24a835018067"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:54:21.307430 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.307380 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70a1e324-9b25-4833-8a42-24a835018067-raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config") pod "70a1e324-9b25-4833-8a42-24a835018067" (UID: "70a1e324-9b25-4833-8a42-24a835018067"). InnerVolumeSpecName "raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:54:21.308285 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.308253 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a1e324-9b25-4833-8a42-24a835018067-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "70a1e324-9b25-4833-8a42-24a835018067" (UID: "70a1e324-9b25-4833-8a42-24a835018067"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:54:21.403419 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.403377 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70a1e324-9b25-4833-8a42-24a835018067-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:54:21.403419 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.403411 2569 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70a1e324-9b25-4833-8a42-24a835018067-raw-sklearn-runtime-b69cc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:54:21.403419 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.403425 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p7cxl\" (UniqueName: \"kubernetes.io/projected/70a1e324-9b25-4833-8a42-24a835018067-kube-api-access-p7cxl\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:54:21.403662 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:21.403435 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70a1e324-9b25-4833-8a42-24a835018067-kserve-provision-location\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 23 13:54:22.079044 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:22.079008 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2" event={"ID":"70a1e324-9b25-4833-8a42-24a835018067","Type":"ContainerDied","Data":"1d2d6dc623644e244671221f4f7268ca7c7769d5b56c9239f197981273b1cabd"}
Apr 23 13:54:22.079483 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:22.079078 2569 scope.go:117] "RemoveContainer" containerID="9fdc61419785957bd40665a738c3a38d0841ac5752c9968fd2463f701c5352f4"
Apr 23 13:54:22.079483 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:22.079113 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2"
Apr 23 13:54:22.087922 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:22.087905 2569 scope.go:117] "RemoveContainer" containerID="b9360e7c48d9438ca4db481670a06d53098bfc5da52b4dbc866d15bc34c0b395"
Apr 23 13:54:22.095560 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:22.095542 2569 scope.go:117] "RemoveContainer" containerID="d90a7bc4800414a7c65fbf2728715e6a7a373b75ff4f853a0d2eea7552979aa1"
Apr 23 13:54:22.101446 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:22.101423 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2"]
Apr 23 13:54:22.104022 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:22.104001 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-b69cc-predictor-6c984fd4fb-n4hl2"]
Apr 23 13:54:22.421156 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:22.421044 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70a1e324-9b25-4833-8a42-24a835018067" path="/var/lib/kubelet/pods/70a1e324-9b25-4833-8a42-24a835018067/volumes"
Apr 23 13:54:45.388647 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:45.388569 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-kmjjb_f11d0899-4c12-462d-a8c1-ea18032668a9/global-pull-secret-syncer/0.log"
Apr 23 13:54:45.499143 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:45.499109 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zjfsg_5af36bbd-1993-4d2e-ac6c-1f12cb3f5fac/konnectivity-agent/0.log"
Apr 23 13:54:45.575516 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:45.575484 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-158.ec2.internal_9b4440db6557536c217fdb95da13736d/haproxy/0.log"
Apr 23 13:54:48.850123 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:48.850089 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-fbhzv_deda38e5-8a52-4797-a3fa-938eb8704a37/cluster-monitoring-operator/0.log"
Apr 23 13:54:48.978258 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:48.978226 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-jfppg_0ad50b34-0c35-490e-a0b5-ad552f3b4cd5/monitoring-plugin/0.log"
Apr 23 13:54:49.178775 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:49.178700 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xmqx4_c6aa409f-f25a-46d2-83bc-229d8993033a/node-exporter/0.log"
Apr 23 13:54:49.201262 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:49.201228 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xmqx4_c6aa409f-f25a-46d2-83bc-229d8993033a/kube-rbac-proxy/0.log"
Apr 23 13:54:49.223428 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:49.223400 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xmqx4_c6aa409f-f25a-46d2-83bc-229d8993033a/init-textfile/0.log"
Apr 23 13:54:49.509377 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:49.509352 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-5h8g8_ad5caa19-ed79-4a2f-83b0-55eb874b39f5/prometheus-operator/0.log"
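The "Finished parsing log file" sweep starting above is the must-gather run reading container logs under /var/log/pods; the path itself encodes namespace, pod name, pod UID, container name, and restart count. A small sketch that decodes one of those paths (parsePodLogPath is a hypothetical helper, not a kubelet function):

    package main

    import (
        "fmt"
        "path/filepath"
        "strings"
    )

    // parsePodLogPath decodes /var/log/pods/<namespace>_<pod>_<uid>/<container>/<restart>.log.
    func parsePodLogPath(p string) (ns, pod, uid, container, restart string, err error) {
        rel, err := filepath.Rel("/var/log/pods", p)
        if err != nil {
            return
        }
        parts := strings.Split(rel, string(filepath.Separator))
        if len(parts) != 3 {
            err = fmt.Errorf("unexpected layout: %s", p)
            return
        }
        meta := strings.SplitN(parts[0], "_", 3) // namespace_pod_uid
        if len(meta) != 3 {
            err = fmt.Errorf("unexpected pod dir: %s", parts[0])
            return
        }
        ns, pod, uid = meta[0], meta[1], meta[2]
        container = parts[1]
        restart = strings.TrimSuffix(parts[2], ".log")
        return
    }

    func main() {
        // A path taken from one of the entries in this section.
        ns, pod, uid, c, r, err := parsePodLogPath(
            "/var/log/pods/openshift-monitoring_node-exporter-xmqx4_c6aa409f-f25a-46d2-83bc-229d8993033a/node-exporter/0.log")
        if err != nil {
            panic(err)
        }
        fmt.Println(ns, pod, uid, c, r) // openshift-monitoring node-exporter-xmqx4 c6aa409f-... node-exporter 0
    }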
Apr 23 13:54:49.534377 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:49.534340 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-5h8g8_ad5caa19-ed79-4a2f-83b0-55eb874b39f5/kube-rbac-proxy/0.log"
Apr 23 13:54:49.557801 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:49.557767 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-bbxdj_d85dfdc8-89bf-4595-aee1-344627449373/prometheus-operator-admission-webhook/0.log"
Apr 23 13:54:50.927867 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:50.927836 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-ghb6b_3f8dae65-c604-469a-abf3-bc8ac066bcd8/networking-console-plugin/0.log"
Apr 23 13:54:51.360754 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:51.360721 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/2.log"
Apr 23 13:54:51.364908 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:51.364888 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4666z_2af8933e-b7d0-4a15-a43e-c2a76d750555/console-operator/3.log"
Apr 23 13:54:52.151657 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.151630 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-dgfsn_55eb7ea0-9b60-41aa-9e7e-2ccf55ef5388/volume-data-source-validator/0.log"
Apr 23 13:54:52.421238 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421164 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"]
Apr 23 13:54:52.421458 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421447 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container"
Apr 23 13:54:52.421503 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421460 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container"
Apr 23 13:54:52.421503 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421470 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kube-rbac-proxy"
Apr 23 13:54:52.421503 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421477 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kube-rbac-proxy"
Apr 23 13:54:52.421503 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421486 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container"
Apr 23 13:54:52.421503 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421491 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container"
Apr 23 13:54:52.421503 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421504 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="storage-initializer"
Apr 23 13:54:52.421685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421509 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="storage-initializer"
Apr 23 13:54:52.421685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421520 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="storage-initializer"
Apr 23 13:54:52.421685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421525 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="storage-initializer"
Apr 23 13:54:52.421685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421532 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kube-rbac-proxy"
Apr 23 13:54:52.421685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421537 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kube-rbac-proxy"
Apr 23 13:54:52.421685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421591 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kserve-container"
Apr 23 13:54:52.421685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421598 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kserve-container"
Apr 23 13:54:52.421685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421605 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="70a1e324-9b25-4833-8a42-24a835018067" containerName="kube-rbac-proxy"
Apr 23 13:54:52.421685 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.421612 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6142f85-8f3b-40f6-92d3-9758f42d36ba" containerName="kube-rbac-proxy"
Apr 23 13:54:52.424676 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.424661 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.426438 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.426409 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2423c32-f3c7-439d-96af-5bad333316e5-lib-modules\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.426576 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.426459 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a2423c32-f3c7-439d-96af-5bad333316e5-podres\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.426576 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.426511 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2423c32-f3c7-439d-96af-5bad333316e5-sys\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.426576 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.426528 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a2423c32-f3c7-439d-96af-5bad333316e5-proc\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.426689 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.426585 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6fv4\" (UniqueName: \"kubernetes.io/projected/a2423c32-f3c7-439d-96af-5bad333316e5-kube-api-access-g6fv4\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.427511 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.427311 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sl86\"/\"openshift-service-ca.crt\""
Apr 23 13:54:52.428247 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.428225 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6sl86\"/\"default-dockercfg-dr2cm\""
Apr 23 13:54:52.428357 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.428257 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sl86\"/\"kube-root-ca.crt\""
Apr 23 13:54:52.430018 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.429994 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"]
Apr 23 13:54:52.527648 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.527610 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2423c32-f3c7-439d-96af-5bad333316e5-lib-modules\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.527834 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.527660 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a2423c32-f3c7-439d-96af-5bad333316e5-podres\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.527834 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.527739 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2423c32-f3c7-439d-96af-5bad333316e5-sys\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.527834 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.527764 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a2423c32-f3c7-439d-96af-5bad333316e5-proc\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.527834 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.527789 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a2423c32-f3c7-439d-96af-5bad333316e5-podres\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.527834 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.527792 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2423c32-f3c7-439d-96af-5bad333316e5-lib-modules\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.528031 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.527830 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2423c32-f3c7-439d-96af-5bad333316e5-sys\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.528031 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.527794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6fv4\" (UniqueName: \"kubernetes.io/projected/a2423c32-f3c7-439d-96af-5bad333316e5-kube-api-access-g6fv4\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.528031 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.527861 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a2423c32-f3c7-439d-96af-5bad333316e5-proc\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.540568 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.540539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6fv4\" (UniqueName: \"kubernetes.io/projected/a2423c32-f3c7-439d-96af-5bad333316e5-kube-api-access-g6fv4\") pod \"perf-node-gather-daemonset-tkrl9\" (UID: \"a2423c32-f3c7-439d-96af-5bad333316e5\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.735935 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.735821 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:52.863408 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.863349 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"]
Apr 23 13:54:52.863564 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.863547 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h8fxx_33d8f26a-427d-4263-9b87-13337ac3a834/dns/0.log"
Apr 23 13:54:52.866175 ip-10-0-136-158 kubenswrapper[2569]: W0423 13:54:52.866150 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda2423c32_f3c7_439d_96af_5bad333316e5.slice/crio-7d3334ed7c41d64f28f8cdd0447cb68f4f3e6c456bee0aa75973aa282481f98b WatchSource:0}: Error finding container 7d3334ed7c41d64f28f8cdd0447cb68f4f3e6c456bee0aa75973aa282481f98b: Status 404 returned error can't find the container with id 7d3334ed7c41d64f28f8cdd0447cb68f4f3e6c456bee0aa75973aa282481f98b
Apr 23 13:54:52.867811 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.867773 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:54:52.883104 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.883082 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h8fxx_33d8f26a-427d-4263-9b87-13337ac3a834/kube-rbac-proxy/0.log"
Apr 23 13:54:52.970422 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:52.970392 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nzrks_61372c64-9070-4751-b720-a4016030cf02/dns-node-resolver/0.log"
Apr 23 13:54:53.179978 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:53.179936 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9" event={"ID":"a2423c32-f3c7-439d-96af-5bad333316e5","Type":"ContainerStarted","Data":"51b705889401d6d83d2e70a527cd1f7bc67ff7653c30bb632d1dab2302960c3c"}
Apr 23 13:54:53.179978 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:53.179982 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9" event={"ID":"a2423c32-f3c7-439d-96af-5bad333316e5","Type":"ContainerStarted","Data":"7d3334ed7c41d64f28f8cdd0447cb68f4f3e6c456bee0aa75973aa282481f98b"}
Apr 23 13:54:53.180422 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:53.180097 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:54:53.195786 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:53.195730 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9" podStartSLOduration=1.195716615 podStartE2EDuration="1.195716615s" podCreationTimestamp="2026-04-23 13:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:54:53.195147036 +0000 UTC m=+1347.350069514" watchObservedRunningTime="2026-04-23 13:54:53.195716615 +0000 UTC m=+1347.350639082"
Apr 23 13:54:53.443646 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:53.443620 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xwbp9_29bbf816-c174-4330-b3f2-ded908db0f6a/node-ca/0.log"
Apr 23 13:54:54.091053 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:54.091019 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5c4d58d76d-p9647_c3812d73-c709-4f23-aa36-2623bc03faf0/router/0.log"
Apr 23 13:54:54.378439 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:54.378361 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ggnhj_bb994adb-00a6-4601-83e9-80e43ab53049/serve-healthcheck-canary/0.log"
Apr 23 13:54:54.771472 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:54.771440 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-5kw89_c4f65a5c-dbc0-4b33-825f-41c16ff92077/insights-operator/0.log"
Apr 23 13:54:54.771641 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:54.771482 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-5kw89_c4f65a5c-dbc0-4b33-825f-41c16ff92077/insights-operator/1.log"
Apr 23 13:54:54.791504 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:54.791470 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mp5jl_c5ddcfc3-4c39-49b6-93b5-972a1e87f960/kube-rbac-proxy/0.log"
Apr 23 13:54:54.813328 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:54.813300 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mp5jl_c5ddcfc3-4c39-49b6-93b5-972a1e87f960/exporter/0.log"
Apr 23 13:54:54.835846 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:54.835816 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mp5jl_c5ddcfc3-4c39-49b6-93b5-972a1e87f960/extractor/0.log"
Apr 23 13:54:56.832570 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:56.832536 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6b667fdd66-vc7d7_f3c1dc8a-6c5d-4dd7-8ad7-c3e0a2c5e7dc/manager/0.log"
Apr 23 13:54:56.874322 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:56.874294 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-qt4dm_46de531b-554a-4961-893f-3295250ff9f5/server/0.log"
Apr 23 13:54:56.975682 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:56.975659 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-tlks5_2b361ebc-24be-497f-b2d7-4dfd4255a3f9/s3-init/0.log"
Apr 23 13:54:57.001180 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:57.001149 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-h9v5c_0a6f64c3-9ae9-493c-9566-8192c9595401/seaweedfs/0.log"
Apr 23 13:54:59.194356 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:54:59.194324 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-tkrl9"
Apr 23 13:55:00.734827 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:00.734791 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-xq8kc_83f8f286-0a1a-4047-8e3d-83c4b68f2209/kube-storage-version-migrator-operator/1.log"
Apr 23 13:55:00.735681 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:00.735659 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-xq8kc_83f8f286-0a1a-4047-8e3d-83c4b68f2209/kube-storage-version-migrator-operator/0.log"
Apr 23 13:55:02.065786 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:02.065761 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lkhk_e29be9aa-ef19-4770-b277-bce09909acde/kube-multus-additional-cni-plugins/0.log"
Apr 23 13:55:02.086029 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:02.086005 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lkhk_e29be9aa-ef19-4770-b277-bce09909acde/egress-router-binary-copy/0.log"
Apr 23 13:55:02.111610 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:02.111587 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lkhk_e29be9aa-ef19-4770-b277-bce09909acde/cni-plugins/0.log"
Apr 23 13:55:02.133770 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:02.133737 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lkhk_e29be9aa-ef19-4770-b277-bce09909acde/bond-cni-plugin/0.log"
Apr 23 13:55:02.158246 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:02.158224 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lkhk_e29be9aa-ef19-4770-b277-bce09909acde/routeoverride-cni/0.log"
Apr 23 13:55:02.180158 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:02.180121 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lkhk_e29be9aa-ef19-4770-b277-bce09909acde/whereabouts-cni-bincopy/0.log"
Apr 23 13:55:02.204194 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:02.204169 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6lkhk_e29be9aa-ef19-4770-b277-bce09909acde/whereabouts-cni/0.log"
Apr 23 13:55:02.390718 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:02.390649 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cj68p_fe336864-d139-416a-b1cb-afe14a9db883/kube-multus/0.log"
Apr 23 13:55:02.493468 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:02.493437 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dqcwj_dc7a9b0c-42a9-4562-a03a-27dca913446a/network-metrics-daemon/0.log"
Apr 23 13:55:02.511516 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:02.511482 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dqcwj_dc7a9b0c-42a9-4562-a03a-27dca913446a/kube-rbac-proxy/0.log"
Apr 23 13:55:03.832690 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:03.832665 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-controller/0.log"
Apr 23 13:55:03.850985 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:03.850963 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/0.log"
Apr 23 13:55:03.857270 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:03.857244 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovn-acl-logging/1.log"
Apr 23 13:55:03.875198 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:03.875158 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/kube-rbac-proxy-node/0.log"
Apr 23 13:55:03.897896 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:03.897828 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 13:55:03.916659 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:03.916639 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/northd/0.log"
Apr 23 13:55:03.939500 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:03.939480 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/nbdb/0.log"
Apr 23 13:55:03.961845 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:03.961827 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/sbdb/0.log"
Apr 23 13:55:04.048314 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:04.048281 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6dwj_1237c950-1db9-42f8-be43-fc6424f2ae2c/ovnkube-controller/0.log"
Apr 23 13:55:05.056653 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:05.056623 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zw7vm_32951250-c04f-4a66-a62c-e1372b1c84d0/network-check-target-container/0.log"
Apr 23 13:55:05.991995 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:05.991967 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-9fc98_baa916b1-56d7-46e4-9ccb-a3794c262e34/iptables-alerter/0.log"
Apr 23 13:55:06.603610 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:06.603580 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-tttgj_8797a06c-9f6d-4c9f-b8e1-36e99724079b/tuned/0.log"
Apr 23 13:55:08.276832 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:08.276800 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-c228x_dd1d91d8-09b0-43ef-9971-5c19edba64a2/cluster-samples-operator/0.log"
Apr 23 13:55:08.292585 ip-10-0-136-158 kubenswrapper[2569]: I0423 13:55:08.292562 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-c228x_dd1d91d8-09b0-43ef-9971-5c19edba64a2/cluster-samples-operator-watch/0.log"