Apr 24 19:04:01.044645 ip-10-0-138-52 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 19:04:01.044655 ip-10-0-138-52 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 19:04:01.044662 ip-10-0-138-52 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 19:04:01.044888 ip-10-0-138-52 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 19:04:11.102625 ip-10-0-138-52 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 19:04:11.102642 ip-10-0-138-52 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 269ef23ce7ff4a9e8c43a08e5ce94afc --
Apr 24 19:06:36.765414 ip-10-0-138-52 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 19:06:37.206071 ip-10-0-138-52 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:37.206071 ip-10-0-138-52 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 19:06:37.206071 ip-10-0-138-52 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:37.206071 ip-10-0-138-52 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 19:06:37.206071 ip-10-0-138-52 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:37.207389 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.207289 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 19:06:37.212361 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212196 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:37.212361 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212361 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212368 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212373 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212377 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212382 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212388 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212392 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212395 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212399 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212403 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212407 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212410 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212414 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212417 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212422 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212425 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212430 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212433 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212440 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212447 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:37.212541 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212451 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212456 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212461 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212466 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212470 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212475 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212479 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212483 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212488 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212491 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212498 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212502 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212507 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212510 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212516 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212521 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212525 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212530 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212534 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212538 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:37.213368 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212543 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212547 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212551 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212555 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212559 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212564 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212568 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212575 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212581 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212586 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212591 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212595 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212599 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212603 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212607 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212612 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212616 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212620 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212625 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212632 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:37.214151 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212636 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212640 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212644 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212648 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212652 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212656 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212661 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212666 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212670 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212674 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212679 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212683 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212687 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212691 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212695 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212699 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212704 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212721 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212726 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212730 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:37.214637 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212734 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212739 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212743 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212747 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.212751 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213448 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213457 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213462 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213466 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213470 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213478 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213483 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213487 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213492 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213496 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213500 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213505 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213509 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213514 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213518 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:37.215501 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213522 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213526 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213531 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213535 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213539 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213543 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213547 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213552 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213556 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213568 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213573 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213577 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213581 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213585 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213590 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213594 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213601 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213606 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213612 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:37.216325 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213617 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213621 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213625 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213629 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213633 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213637 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213641 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213646 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213650 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213654 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213658 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213663 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213668 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213672 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213678 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213682 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213687 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213691 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213695 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213699 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:37.216810 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213703 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213708 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213712 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213723 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213728 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213732 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213737 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213741 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213745 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213749 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213753 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213758 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213762 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213765 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213770 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213774 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213778 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213782 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213786 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213793 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:37.217421 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213797 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213802 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213805 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213815 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213819 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213823 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213828 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213832 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213836 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213840 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213844 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.213848 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214676 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214691 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214709 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214717 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214732 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214737 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214745 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214752 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214757 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 19:06:37.218057 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214763 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214768 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214774 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214778 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214783 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214788 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214794 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214798 2568 flags.go:64] FLAG: --cloud-config=""
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214803 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214807 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214817 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214821 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214826 2568 flags.go:64] FLAG: --config-dir=""
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214831 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214840 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214846 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214851 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214856 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214861 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214866 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214871 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214876 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214882 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214887 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214894 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 19:06:37.218692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214899 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214903 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214909 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214922 2568 flags.go:64] FLAG: --enable-server="true"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214927 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214937 2568 flags.go:64] FLAG: --event-burst="100"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214942 2568 flags.go:64] FLAG: --event-qps="50"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214946 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214952 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214957 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214963 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214968 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214973 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214978 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214982 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214987 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214992 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.214997 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215001 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24
19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215006 2568 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215011 2568 flags.go:64] FLAG: --feature-gates="" Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215019 2568 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215023 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215029 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215034 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215040 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 24 19:06:37.219450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215046 2568 flags.go:64] FLAG: --help="false" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215050 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215055 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215060 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215066 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215072 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215078 2568 flags.go:64] FLAG: 
--image-gc-high-threshold="85" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215083 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215088 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215093 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215120 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215127 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215132 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215136 2568 flags.go:64] FLAG: --kube-reserved="" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215142 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215147 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215152 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215156 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215161 2568 flags.go:64] FLAG: --lock-file="" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215166 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215171 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:06:37.215176 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215191 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 19:06:37.220128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215196 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215201 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215206 2568 flags.go:64] FLAG: --logging-format="text" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215211 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215218 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215223 2568 flags.go:64] FLAG: --manifest-url="" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215228 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215236 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215241 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215248 2568 flags.go:64] FLAG: --max-pods="110" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215252 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215257 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215262 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 
19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215267 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215273 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215277 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215282 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215297 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215302 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215306 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215319 2568 flags.go:64] FLAG: --pod-cidr="" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215324 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215339 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215344 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 19:06:37.220742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215349 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215354 2568 flags.go:64] FLAG: --port="10250" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215359 2568 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215364 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ffe2e0f575a185df" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215369 2568 flags.go:64] FLAG: --qos-reserved="" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215374 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215379 2568 flags.go:64] FLAG: --register-node="true" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215384 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215388 2568 flags.go:64] FLAG: --register-with-taints="" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215403 2568 flags.go:64] FLAG: --registry-burst="10" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215408 2568 flags.go:64] FLAG: --registry-qps="5" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215412 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215417 2568 flags.go:64] FLAG: --reserved-memory="" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215425 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215430 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215435 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215440 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215445 2568 flags.go:64] FLAG: 
--runonce="false" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215449 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215455 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215460 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215465 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215470 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215475 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215480 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215485 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 19:06:37.221358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215489 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215494 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215498 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215511 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215516 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215520 2568 flags.go:64] FLAG: --system-cgroups="" Apr 24 19:06:37.221989 ip-10-0-138-52 
kubenswrapper[2568]: I0424 19:06:37.215525 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215534 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215539 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215544 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215555 2568 flags.go:64] FLAG: --tls-min-version="" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215559 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215565 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215569 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215574 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215579 2568 flags.go:64] FLAG: --v="2" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215587 2568 flags.go:64] FLAG: --version="false" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215593 2568 flags.go:64] FLAG: --vmodule="" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215600 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.215606 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215765 2568 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215771 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215775 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215780 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 19:06:37.221989 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215784 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215788 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215793 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215799 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215803 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215808 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215812 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215817 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215821 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215826 
2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215830 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215841 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215852 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215857 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215861 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215866 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215870 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215875 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215879 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 19:06:37.222687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215883 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215890 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215895 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215900 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215906 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215911 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215915 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215921 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215926 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215931 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215935 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215939 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215943 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215947 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215952 2568 
feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215956 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215966 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215970 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215974 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 19:06:37.223206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215978 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215982 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215986 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215990 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215994 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.215999 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216003 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216015 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 
19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216019 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216023 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216028 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216033 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216037 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216041 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216046 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216050 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216054 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216058 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216062 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 19:06:37.223687 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216066 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216072 2568 feature_gate.go:349] Setting 
deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216079 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216083 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216087 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216092 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216096 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216116 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216121 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216125 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216131 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216135 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216139 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216143 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 19:06:37.224173 
ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216147 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216151 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216155 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216159 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216164 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216168 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 19:06:37.224173 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216172 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216185 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216190 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216194 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.216198 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.217025 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.223784 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.223800 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223850 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223855 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223858 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223861 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223866 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223870 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223873 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:37.224670 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223876 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223879 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223882 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223885 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223887 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223890 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223893 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223896 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223899 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223901 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223904 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223907 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223909 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223912 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223914 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223917 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223919 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223921 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223925 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:37.225051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223927 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223930 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223933 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223936 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223939 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223942 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223945 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223947 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223950 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223952 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223955 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223957 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223960 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223963 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223965 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223968 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223970 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223973 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223975 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223978 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:37.225544 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223980 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223983 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223985 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223988 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223990 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223993 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223995 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.223998 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224000 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224003 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224005 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224008 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224011 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224014 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224017 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224020 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224024 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224028 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224031 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224033 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:37.226082 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224036 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224038 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224041 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224044 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224046 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224049 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224051 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224053 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224056 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224059 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224061 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224063 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224066 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224068 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224071 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224073 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224076 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224078 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224081 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:37.226612 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224083 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.224088 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224197 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224202 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224205 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224208 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224212 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224216 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224219 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224221 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224224 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224227 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224230 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224232 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224235 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224237 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:37.227083 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224240 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224242 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224245 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224247 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224250 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224253 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224255 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224257 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224260 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224263 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224266 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224268 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224271 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224275 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224279 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224282 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224285 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224288 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224291 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:37.227520 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224293 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224296 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224299 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224302 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224306 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224309 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224312 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224315 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224317 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224320 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224322 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224326 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224329 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224331 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224334 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224337 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224340 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224343 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224346 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224348 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:37.228053 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224351 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224353 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224356 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224359 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224362 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224365 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224368 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224370 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224373 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224375 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224378 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224380 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224383 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224385 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224388 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224390 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224393 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224397 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224399 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:37.228576 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224402 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224404 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224407 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224409 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224412 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224414 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224417 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224419 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224422 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224424 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224427 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224430 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224432 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:37.224435 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.224455 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:06:37.229046 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.225129 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 19:06:37.229479 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.228615 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 19:06:37.229656 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.229642 2568 server.go:1019] "Starting client certificate rotation"
Apr 24 19:06:37.229760 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.229741 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 19:06:37.229796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.229780 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 19:06:37.254635 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.254611 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 19:06:37.256817 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.256789 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 19:06:37.275814 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.275795 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 24 19:06:37.284238 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.284213 2568 log.go:25] "Validated CRI v1 image API"
Apr 24 19:06:37.285384 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.285357 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 19:06:37.289392 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.289367 2568 fs.go:135] Filesystem UUIDs: map[499aa3e8-ab7f-43be-9ca3-e03e3f38d202:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 fb1bd9fa-da9d-42b3-9a77-0363da9b62e6:/dev/nvme0n1p3]
Apr 24 19:06:37.289392 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.289390 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 19:06:37.289954 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.289932 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 19:06:37.295220 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.295095 2568 manager.go:217] Machine: {Timestamp:2026-04-24 19:06:37.29334794 +0000 UTC m=+0.405184277 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100081 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ed15aab7703d300d49273f3235cd9 SystemUUID:ec2ed15a-ab77-03d3-00d4-9273f3235cd9 BootID:269ef23c-e7ff-4a9e-8c43-a08e5ce94afc Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:71:9f:37:83:4d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:71:9f:37:83:4d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6a:39:8e:43:cc:d9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 19:06:37.295220 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.295211 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 19:06:37.295357 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.295296 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 19:06:37.296532 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.296507 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 19:06:37.296675 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.296535 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-52.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 19:06:37.296725 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.296685 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 19:06:37.296725 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.296692 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 19:06:37.296725 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.296705 2568 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 19:06:37.298495 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.298484 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 19:06:37.299662 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.299650 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 24 19:06:37.299770 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.299761 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 19:06:37.302134 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.302123 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 24 19:06:37.302180 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.302139 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 19:06:37.302180 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.302155 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 19:06:37.302180 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.302165 2568 kubelet.go:397] "Adding apiserver pod source" Apr 24 19:06:37.302180 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.302176 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 19:06:37.303304 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.303290 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 19:06:37.303370 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.303318 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 19:06:37.306334 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.306306 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 19:06:37.308175 ip-10-0-138-52 
kubenswrapper[2568]: I0424 19:06:37.308161 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 19:06:37.309400 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.309389 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 19:06:37.309459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.309406 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 19:06:37.309459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.309412 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 19:06:37.309459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.309418 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 19:06:37.309459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.309424 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 19:06:37.309459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.309430 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 19:06:37.309459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.309437 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 19:06:37.309459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.309442 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 19:06:37.309459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.309450 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 19:06:37.309459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.309456 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 19:06:37.309459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.309464 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 19:06:37.309727 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.309473 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 19:06:37.311302 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.311280 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 19:06:37.311302 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.311303 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 19:06:37.313771 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.313730 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 19:06:37.313989 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.313956 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-52.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 19:06:37.315227 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.315214 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 19:06:37.315293 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.315274 2568 server.go:1295] "Started kubelet" Apr 24 19:06:37.315382 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.315357 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 19:06:37.315435 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.315396 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 19:06:37.315514 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.315502 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 19:06:37.315855 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.315831 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-52.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 19:06:37.316230 ip-10-0-138-52 systemd[1]: Started Kubernetes Kubelet. Apr 24 19:06:37.317461 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.317450 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 24 19:06:37.319440 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.319411 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 19:06:37.324048 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.324032 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 19:06:37.324165 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.324080 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 19:06:37.324721 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.324706 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 19:06:37.324721 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.324720 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 19:06:37.324885 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.324805 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 19:06:37.324885 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.324855 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 24 19:06:37.324885 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.324864 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 24 19:06:37.325151 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.325135 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:37.325255 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.324052 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-52.ec2.internal.18a9607831621e4d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-52.ec2.internal,UID:ip-10-0-138-52.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-52.ec2.internal,},FirstTimestamp:2026-04-24 19:06:37.315227213 +0000 UTC m=+0.427063550,LastTimestamp:2026-04-24 19:06:37.315227213 +0000 UTC m=+0.427063550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-52.ec2.internal,}" Apr 24 19:06:37.327386 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.327354 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 19:06:37.327386 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.327386 2568 factory.go:55] Registering systemd factory Apr 24 19:06:37.327386 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.327398 2568 factory.go:223] Registration of the systemd container factory successfully Apr 24 19:06:37.327709 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.327686 2568 factory.go:153] Registering CRI-O factory Apr 24 19:06:37.328300 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.328281 2568 factory.go:223] Registration of the crio container factory successfully Apr 24 19:06:37.328385 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:06:37.328321 2568 factory.go:103] Registering Raw factory Apr 24 19:06:37.328385 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.328338 2568 manager.go:1196] Started watching for new ooms in manager Apr 24 19:06:37.328790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.328771 2568 manager.go:319] Starting recovery of all containers Apr 24 19:06:37.328924 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.328906 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 19:06:37.333091 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.333058 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 19:06:37.333224 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.333200 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-52.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 19:06:37.340198 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.340015 2568 manager.go:324] Recovery completed Apr 24 19:06:37.341509 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.341483 2568 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 24 19:06:37.342537 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.342519 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8xgxh" Apr 24 19:06:37.344429 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.344416 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:37.346914 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.346900 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:37.346972 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.346928 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:37.346972 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.346941 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:37.347464 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.347451 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 19:06:37.347544 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.347464 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 19:06:37.347544 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.347483 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 24 19:06:37.349256 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.349168 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-52.ec2.internal.18a9607833459e1b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-52.ec2.internal,UID:ip-10-0-138-52.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-52.ec2.internal status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-52.ec2.internal,},FirstTimestamp:2026-04-24 19:06:37.346913819 +0000 UTC m=+0.458750156,LastTimestamp:2026-04-24 19:06:37.346913819 +0000 UTC m=+0.458750156,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-52.ec2.internal,}" Apr 24 19:06:37.349842 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.349828 2568 policy_none.go:49] "None policy: Start" Apr 24 19:06:37.349899 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.349848 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 19:06:37.349899 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.349862 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 24 19:06:37.350258 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.350243 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8xgxh" Apr 24 19:06:37.405893 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.405872 2568 manager.go:341] "Starting Device Plugin manager" Apr 24 19:06:37.408550 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.405918 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 19:06:37.408550 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.405931 2568 server.go:85] "Starting device plugin registration server" Apr 24 19:06:37.408550 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.406197 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 19:06:37.408550 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.406207 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 19:06:37.408550 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.406341 2568 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 24 19:06:37.408550 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.406408 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 19:06:37.408550 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.406415 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 19:06:37.408550 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.408486 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 19:06:37.408550 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.408522 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:37.452648 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.452606 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 19:06:37.453864 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.453849 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 19:06:37.453949 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.453878 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 19:06:37.453949 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.453902 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 19:06:37.453949 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.453911 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 19:06:37.454079 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.453955 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 19:06:37.457971 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.457912 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:37.506399 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.506357 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:37.507557 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.507542 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:37.507627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.507577 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:37.507627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.507592 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:37.507627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.507618 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.517052 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.517028 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.517138 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.517058 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-52.ec2.internal\": node \"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:37.532221 
ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.532194 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:37.554961 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.554930 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-52.ec2.internal"] Apr 24 19:06:37.555118 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.555011 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:37.555954 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.555940 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:37.556055 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.555967 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:37.556055 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.555977 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:37.558290 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.558273 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:37.558445 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.558428 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.558505 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.558466 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:37.558992 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.558976 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:37.558992 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.558986 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:37.559143 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.559007 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:37.559143 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.559009 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:37.559143 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.559018 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:37.559143 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.559024 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:37.561319 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.561299 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.561426 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.561330 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:37.562076 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.562061 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:37.562169 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.562087 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:37.562169 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.562113 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:37.578782 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.578759 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-52.ec2.internal\" not found" node="ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.584551 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.584528 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-52.ec2.internal\" not found" node="ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.626759 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.626725 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2457aa0bf4a936b2d49c728267a6998-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal\" (UID: \"d2457aa0bf4a936b2d49c728267a6998\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.626759 ip-10-0-138-52 
kubenswrapper[2568]: I0424 19:06:37.626760 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a2e4dc57eaf3ecec9842fb4e7d99fd2d-config\") pod \"kube-apiserver-proxy-ip-10-0-138-52.ec2.internal\" (UID: \"a2e4dc57eaf3ecec9842fb4e7d99fd2d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.626929 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.626778 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d2457aa0bf4a936b2d49c728267a6998-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal\" (UID: \"d2457aa0bf4a936b2d49c728267a6998\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.632958 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.632936 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:37.727681 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.727614 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d2457aa0bf4a936b2d49c728267a6998-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal\" (UID: \"d2457aa0bf4a936b2d49c728267a6998\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.727681 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.727647 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2457aa0bf4a936b2d49c728267a6998-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal\" (UID: \"d2457aa0bf4a936b2d49c728267a6998\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.727681 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.727667 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a2e4dc57eaf3ecec9842fb4e7d99fd2d-config\") pod \"kube-apiserver-proxy-ip-10-0-138-52.ec2.internal\" (UID: \"a2e4dc57eaf3ecec9842fb4e7d99fd2d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.727819 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.727720 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a2e4dc57eaf3ecec9842fb4e7d99fd2d-config\") pod \"kube-apiserver-proxy-ip-10-0-138-52.ec2.internal\" (UID: \"a2e4dc57eaf3ecec9842fb4e7d99fd2d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.727819 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.727721 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2457aa0bf4a936b2d49c728267a6998-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal\" (UID: \"d2457aa0bf4a936b2d49c728267a6998\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.727819 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.727727 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d2457aa0bf4a936b2d49c728267a6998-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal\" (UID: \"d2457aa0bf4a936b2d49c728267a6998\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.733703 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.733678 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:37.834548 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.834508 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:37.880710 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.880685 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.886190 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:37.886172 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-52.ec2.internal" Apr 24 19:06:37.935033 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:37.935003 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:38.035560 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:38.035485 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:38.136135 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:38.136086 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:38.198268 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.198235 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:38.229232 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.229195 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 19:06:38.229822 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.229340 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: 
k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 19:06:38.229822 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.229388 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 19:06:38.236433 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:38.236407 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:38.324827 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.324804 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 19:06:38.336647 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:38.336621 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:38.341242 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.341210 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 19:06:38.352367 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.352322 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 19:01:37 +0000 UTC" deadline="2027-12-21 19:39:53.187939514 +0000 UTC" Apr 24 19:06:38.352367 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.352363 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14544h33m14.835580126s" Apr 24 19:06:38.365458 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.365431 2568 csr.go:274] "Certificate 
signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ql97t" Apr 24 19:06:38.375435 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.375404 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ql97t" Apr 24 19:06:38.437616 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:38.437587 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:38.457039 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:38.457005 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2e4dc57eaf3ecec9842fb4e7d99fd2d.slice/crio-67bbb7686c318bb649f7fe0c0d650e2570dd1aeb95d95bdd62323f0ce1114e48 WatchSource:0}: Error finding container 67bbb7686c318bb649f7fe0c0d650e2570dd1aeb95d95bdd62323f0ce1114e48: Status 404 returned error can't find the container with id 67bbb7686c318bb649f7fe0c0d650e2570dd1aeb95d95bdd62323f0ce1114e48 Apr 24 19:06:38.457442 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:38.457417 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2457aa0bf4a936b2d49c728267a6998.slice/crio-1b74b8e4493e0ea49b2131b734bf23a272b77e507da3bc5f8d0f39b3ee9d5f7d WatchSource:0}: Error finding container 1b74b8e4493e0ea49b2131b734bf23a272b77e507da3bc5f8d0f39b3ee9d5f7d: Status 404 returned error can't find the container with id 1b74b8e4493e0ea49b2131b734bf23a272b77e507da3bc5f8d0f39b3ee9d5f7d Apr 24 19:06:38.463358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.463341 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:06:38.508278 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.508247 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 
24 19:06:38.538003 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:38.537972 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-52.ec2.internal\" not found" Apr 24 19:06:38.575012 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.574921 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:38.624769 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.624738 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" Apr 24 19:06:38.655367 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.655334 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 19:06:38.656352 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.656339 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-52.ec2.internal" Apr 24 19:06:38.669976 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:38.669955 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 19:06:39.303151 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.303120 2568 apiserver.go:52] "Watching apiserver" Apr 24 19:06:39.311017 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.310989 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 19:06:39.311394 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.311372 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-x2h2w","kube-system/kube-apiserver-proxy-ip-10-0-138-52.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv","openshift-image-registry/node-ca-29rc7","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal","openshift-ovn-kubernetes/ovnkube-node-2thj7","kube-system/global-pull-secret-syncer-dns5q","kube-system/konnectivity-agent-67l2j","openshift-cluster-node-tuning-operator/tuned-9vthc","openshift-multus/multus-additional-cni-plugins-gbmct","openshift-multus/multus-w4nmn","openshift-multus/network-metrics-daemon-nghhh","openshift-network-diagnostics/network-check-target-nlzd4"] Apr 24 19:06:39.314539 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.314518 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.316677 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.316655 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.318728 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.318711 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-29rc7" Apr 24 19:06:39.320052 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.320036 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 19:06:39.320164 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.320055 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 19:06:39.320164 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.320036 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 19:06:39.320991 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.320973 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.323296 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.323280 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-67l2j" Apr 24 19:06:39.323390 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.323367 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 19:06:39.325599 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.325580 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.327913 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kpm9h\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.328131 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.328277 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.328344 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.328283 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.328460 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.328468 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-62ck5\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.328546 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.328629 2568 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.328672 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-nzf4q\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.328704 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.328771 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.328814 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q9lm4\"" Apr 24 19:06:39.329796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.329623 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 19:06:39.330593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.329813 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 19:06:39.330593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.329828 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 19:06:39.330593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.329937 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:06:39.330593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.330089 2568 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 19:06:39.330593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.330201 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 19:06:39.330868 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.330757 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-lf2vs\"" Apr 24 19:06:39.330960 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.330938 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 19:06:39.331041 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.331025 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 19:06:39.331419 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.331399 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-x2h2w" Apr 24 19:06:39.331594 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.331575 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-clc5q\"" Apr 24 19:06:39.333697 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.333672 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.333854 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.333829 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:39.333929 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:39.333904 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:06:39.335691 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.335664 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bc00a2b3-d877-4828-bc79-de040ea70887-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.335790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.335705 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8jb6\" (UniqueName: \"kubernetes.io/projected/0b140811-2690-4060-a32d-14cb088e3605-kube-api-access-d8jb6\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.335790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.335756 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-var-lib-kubelet\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.335905 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.335802 
2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc00a2b3-d877-4828-bc79-de040ea70887-cni-binary-copy\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.335905 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.335847 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-sys-fs\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.335905 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.335879 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/254ea4ca-f9d7-452a-9868-bdf3ef96512c-host\") pod \"node-ca-29rc7\" (UID: \"254ea4ca-f9d7-452a-9868-bdf3ef96512c\") " pod="openshift-image-registry/node-ca-29rc7" Apr 24 19:06:39.336042 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.335903 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-device-dir\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.336042 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.335944 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a96d8dd3-8216-48b7-a304-75026c92aa95-agent-certs\") pod \"konnectivity-agent-67l2j\" (UID: \"a96d8dd3-8216-48b7-a304-75026c92aa95\") " 
pod="kube-system/konnectivity-agent-67l2j" Apr 24 19:06:39.336042 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.335976 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-sys\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.336042 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336019 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bc00a2b3-d877-4828-bc79-de040ea70887-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.336253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336053 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-run-ovn\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.336253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336074 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-ovnkube-script-lib\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.336253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336115 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjww5\" (UniqueName: 
\"kubernetes.io/projected/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-kube-api-access-xjww5\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.336253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336135 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:39.336253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336156 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-sysctl-conf\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.336253 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:39.336207 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:06:39.336253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336198 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-socket-dir\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.336253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336249 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-registration-dir\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.336593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336276 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-systemd-units\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.336593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336282 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 19:06:39.336593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336298 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-run-systemd\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.336593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336335 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-sysconfig\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.336593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336371 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-lib-modules\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.336593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336396 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lk8x\" (UniqueName: \"kubernetes.io/projected/bc00a2b3-d877-4828-bc79-de040ea70887-kube-api-access-2lk8x\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.336593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336433 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.336593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336475 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-run-netns\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.336593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336508 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-etc-openvswitch\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.336593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336535 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/254ea4ca-f9d7-452a-9868-bdf3ef96512c-serviceca\") pod \"node-ca-29rc7\" (UID: \"254ea4ca-f9d7-452a-9868-bdf3ef96512c\") " pod="openshift-image-registry/node-ca-29rc7" Apr 24 19:06:39.336593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336558 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-cni-netd\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.336593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336582 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-ovn-node-metrics-cert\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336607 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-tmp\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336651 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc00a2b3-d877-4828-bc79-de040ea70887-system-cni-dir\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336689 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc00a2b3-d877-4828-bc79-de040ea70887-os-release\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336719 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-node-log\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336743 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a96d8dd3-8216-48b7-a304-75026c92aa95-konnectivity-ca\") pod \"konnectivity-agent-67l2j\" (UID: \"a96d8dd3-8216-48b7-a304-75026c92aa95\") " 
pod="kube-system/konnectivity-agent-67l2j" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336766 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-modprobe-d\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336815 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-tuned\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc00a2b3-d877-4828-bc79-de040ea70887-cnibin\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336884 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p4rd\" (UniqueName: \"kubernetes.io/projected/254ea4ca-f9d7-452a-9868-bdf3ef96512c-kube-api-access-9p4rd\") pod \"node-ca-29rc7\" (UID: \"254ea4ca-f9d7-452a-9868-bdf3ef96512c\") " pod="openshift-image-registry/node-ca-29rc7" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336909 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-var-lib-openvswitch\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336932 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-run-openvswitch\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336956 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-cni-bin\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.336981 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-slash\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337004 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-ovnkube-config\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337039 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-env-overrides\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.337063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337067 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-run\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.337816 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337090 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-log-socket\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.337816 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337128 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-kubernetes\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.337816 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337145 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-host\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.337816 ip-10-0-138-52 kubenswrapper[2568]: 
I0424 19:06:39.337159 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qcvf\" (UniqueName: \"kubernetes.io/projected/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-kube-api-access-9qcvf\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.337816 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337177 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc00a2b3-d877-4828-bc79-de040ea70887-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.337816 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337199 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-kubelet\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.337816 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337242 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-sysctl-d\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.337816 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337267 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-etc-selinux\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: 
\"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.337816 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337290 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-run-ovn-kubernetes\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.337816 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337315 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.337816 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.337400 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-systemd\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.338444 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.338378 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:39.338444 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:39.338431 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146" Apr 24 19:06:39.339111 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.339080 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 19:06:39.339608 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.339586 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 19:06:39.339707 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.339589 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:06:39.339707 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.339612 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pvkkf\"" Apr 24 19:06:39.339870 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.339830 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-2kjfz\"" Apr 24 19:06:39.376869 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.376819 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:01:38 +0000 UTC" deadline="2027-10-31 23:13:40.512418037 +0000 UTC" Apr 24 19:06:39.376869 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.376868 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13324h7m1.135554211s" Apr 24 19:06:39.426560 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.426532 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 19:06:39.437783 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.437745 
2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-var-lib-kubelet\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.437967 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.437790 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-run-multus-certs\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.437967 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.437845 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc00a2b3-d877-4828-bc79-de040ea70887-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.437967 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.437873 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-kubelet\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.437967 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.437897 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-sysctl-d\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.437967 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.437925 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-os-release\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.437967 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.437951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-etc-selinux\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.438281 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.437980 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-kubelet\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.438344 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.438140 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-run-ovn-kubernetes\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.438534 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.438514 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc00a2b3-d877-4828-bc79-de040ea70887-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " 
pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.440923 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.440893 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-run-ovn-kubernetes\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.441484 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441331 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-sysctl-d\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.441484 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441328 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.441484 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441350 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.441484 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441373 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-etc-selinux\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.441484 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441422 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-systemd\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.441484 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441454 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-multus-socket-dir-parent\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.441484 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441472 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-etc-kubernetes\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.441813 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441493 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-944x9\" (UniqueName: \"kubernetes.io/projected/76172245-47dc-4f2f-90c9-d345a816e233-kube-api-access-944x9\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:39.441813 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441515 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-systemd\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.441813 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441543 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-kubelet-config\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:39.441813 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441579 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bc00a2b3-d877-4828-bc79-de040ea70887-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.441813 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441671 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8jb6\" (UniqueName: \"kubernetes.io/projected/0b140811-2690-4060-a32d-14cb088e3605-kube-api-access-d8jb6\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.441813 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441702 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-var-lib-kubelet\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.441813 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441765 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-var-lib-kubelet\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441849 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441882 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc00a2b3-d877-4828-bc79-de040ea70887-cni-binary-copy\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441900 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-sys-fs\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441924 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/254ea4ca-f9d7-452a-9868-bdf3ef96512c-host\") pod 
\"node-ca-29rc7\" (UID: \"254ea4ca-f9d7-452a-9868-bdf3ef96512c\") " pod="openshift-image-registry/node-ca-29rc7" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441950 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c668390-7023-447b-92b4-e95d0c65f6cd-host-slash\") pod \"iptables-alerter-x2h2w\" (UID: \"4c668390-7023-447b-92b4-e95d0c65f6cd\") " pod="openshift-network-operator/iptables-alerter-x2h2w" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441966 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-sys-fs\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.441976 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-device-dir\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442001 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a96d8dd3-8216-48b7-a304-75026c92aa95-agent-certs\") pod \"konnectivity-agent-67l2j\" (UID: \"a96d8dd3-8216-48b7-a304-75026c92aa95\") " pod="kube-system/konnectivity-agent-67l2j" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442024 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-sys\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442035 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-device-dir\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442037 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/254ea4ca-f9d7-452a-9868-bdf3ef96512c-host\") pod \"node-ca-29rc7\" (UID: \"254ea4ca-f9d7-452a-9868-bdf3ef96512c\") " pod="openshift-image-registry/node-ca-29rc7" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442049 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-run-k8s-cni-cncf-io\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442076 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442121 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bc00a2b3-d877-4828-bc79-de040ea70887-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442136 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-sys\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442149 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-run-ovn\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.442203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442189 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-run-ovn\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442188 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-ovnkube-script-lib\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442220 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xjww5\" (UniqueName: \"kubernetes.io/projected/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-kube-api-access-xjww5\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442244 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-sysctl-conf\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442257 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bc00a2b3-d877-4828-bc79-de040ea70887-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442266 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-socket-dir\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442292 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-registration-dir\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.442922 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442318 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-systemd-units\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442343 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-run-systemd\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442368 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-sysconfig\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442389 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-lib-modules\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442407 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442445 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-sysctl-conf\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442465 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc00a2b3-d877-4828-bc79-de040ea70887-cni-binary-copy\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442498 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-run-systemd\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442530 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-socket-dir\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442535 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-sysconfig\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.442922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442549 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-registration-dir\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442417 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2dhx\" (UniqueName: \"kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx\") pod \"network-check-target-nlzd4\" (UID: \"22f62d88-7d18-4fc4-a8b1-44efd0814325\") " pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442592 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-systemd-units\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lk8x\" (UniqueName: \"kubernetes.io/projected/bc00a2b3-d877-4828-bc79-de040ea70887-kube-api-access-2lk8x\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:06:39.442639 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442648 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-lib-modules\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442665 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-run-netns\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442689 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-etc-openvswitch\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442697 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b140811-2690-4060-a32d-14cb088e3605-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.443632 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442717 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqvjp\" (UniqueName: \"kubernetes.io/projected/4c668390-7023-447b-92b4-e95d0c65f6cd-kube-api-access-xqvjp\") pod \"iptables-alerter-x2h2w\" (UID: \"4c668390-7023-447b-92b4-e95d0c65f6cd\") " pod="openshift-network-operator/iptables-alerter-x2h2w" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442723 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-ovnkube-script-lib\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442729 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bc00a2b3-d877-4828-bc79-de040ea70887-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442736 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-run-netns\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442747 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9ncj\" (UniqueName: \"kubernetes.io/projected/65e67b5d-323b-47d6-ac53-b3da03f832e6-kube-api-access-p9ncj\") pod \"multus-w4nmn\" (UID: 
\"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442772 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-etc-openvswitch\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442790 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/254ea4ca-f9d7-452a-9868-bdf3ef96512c-serviceca\") pod \"node-ca-29rc7\" (UID: \"254ea4ca-f9d7-452a-9868-bdf3ef96512c\") " pod="openshift-image-registry/node-ca-29rc7" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442812 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-cni-netd\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.443632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442851 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-ovn-node-metrics-cert\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442875 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-tmp\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442899 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc00a2b3-d877-4828-bc79-de040ea70887-system-cni-dir\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442924 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc00a2b3-d877-4828-bc79-de040ea70887-os-release\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442949 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-node-log\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442973 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a96d8dd3-8216-48b7-a304-75026c92aa95-konnectivity-ca\") pod \"konnectivity-agent-67l2j\" (UID: \"a96d8dd3-8216-48b7-a304-75026c92aa95\") " pod="kube-system/konnectivity-agent-67l2j" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442998 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-modprobe-d\") pod \"tuned-9vthc\" (UID: 
\"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.442925 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-cni-netd\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443020 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-tuned\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443046 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c668390-7023-447b-92b4-e95d0c65f6cd-iptables-alerter-script\") pod \"iptables-alerter-x2h2w\" (UID: \"4c668390-7023-447b-92b4-e95d0c65f6cd\") " pod="openshift-network-operator/iptables-alerter-x2h2w" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443071 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-var-lib-cni-multus\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443080 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/bc00a2b3-d877-4828-bc79-de040ea70887-system-cni-dir\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443111 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc00a2b3-d877-4828-bc79-de040ea70887-cnibin\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443137 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9p4rd\" (UniqueName: \"kubernetes.io/projected/254ea4ca-f9d7-452a-9868-bdf3ef96512c-kube-api-access-9p4rd\") pod \"node-ca-29rc7\" (UID: \"254ea4ca-f9d7-452a-9868-bdf3ef96512c\") " pod="openshift-image-registry/node-ca-29rc7" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443168 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc00a2b3-d877-4828-bc79-de040ea70887-os-release\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443186 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-var-lib-openvswitch\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443215 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-run-openvswitch\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.444409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443241 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-cni-bin\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443286 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc00a2b3-d877-4828-bc79-de040ea70887-cnibin\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443292 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-system-cni-dir\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/65e67b5d-323b-47d6-ac53-b3da03f832e6-cni-binary-copy\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443245 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/254ea4ca-f9d7-452a-9868-bdf3ef96512c-serviceca\") pod \"node-ca-29rc7\" (UID: \"254ea4ca-f9d7-452a-9868-bdf3ef96512c\") " pod="openshift-image-registry/node-ca-29rc7" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443400 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-modprobe-d\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443490 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-run-openvswitch\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443512 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-node-log\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443530 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-var-lib-openvswitch\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443541 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-cni-bin\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443538 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-var-lib-cni-bin\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443576 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-hostroot\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443625 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-multus-conf-dir\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443677 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-cnibin\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443695 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a96d8dd3-8216-48b7-a304-75026c92aa95-konnectivity-ca\") pod \"konnectivity-agent-67l2j\" (UID: \"a96d8dd3-8216-48b7-a304-75026c92aa95\") " pod="kube-system/konnectivity-agent-67l2j" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443702 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-run-netns\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443750 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/65e67b5d-323b-47d6-ac53-b3da03f832e6-multus-daemon-config\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443777 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-dbus\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:39.445202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443818 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-slash\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443843 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-ovnkube-config\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443868 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-env-overrides\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443890 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-run\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443901 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-host-slash\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443915 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-log-socket\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443941 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-kubernetes\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443962 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-host\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.443986 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qcvf\" (UniqueName: \"kubernetes.io/projected/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-kube-api-access-9qcvf\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.444010 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-multus-cni-dir\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.444128 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-run\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.444156 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-kubernetes\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.444183 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-host\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.444201 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-log-socket\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.444640 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-ovnkube-config\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.445951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.444790 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-env-overrides\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.446548 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.446041 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-tmp\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.446548 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.446095 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-etc-tuned\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.446548 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.446343 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-ovn-node-metrics-cert\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.446548 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.446437 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a96d8dd3-8216-48b7-a304-75026c92aa95-agent-certs\") pod \"konnectivity-agent-67l2j\" (UID: \"a96d8dd3-8216-48b7-a304-75026c92aa95\") " pod="kube-system/konnectivity-agent-67l2j" Apr 24 19:06:39.451753 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.451725 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lk8x\" (UniqueName: \"kubernetes.io/projected/bc00a2b3-d877-4828-bc79-de040ea70887-kube-api-access-2lk8x\") pod \"multus-additional-cni-plugins-gbmct\" (UID: \"bc00a2b3-d877-4828-bc79-de040ea70887\") " pod="openshift-multus/multus-additional-cni-plugins-gbmct" Apr 24 19:06:39.451935 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.451910 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8jb6\" (UniqueName: 
\"kubernetes.io/projected/0b140811-2690-4060-a32d-14cb088e3605-kube-api-access-d8jb6\") pod \"aws-ebs-csi-driver-node-gkmfv\" (UID: \"0b140811-2690-4060-a32d-14cb088e3605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" Apr 24 19:06:39.452782 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.452755 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p4rd\" (UniqueName: \"kubernetes.io/projected/254ea4ca-f9d7-452a-9868-bdf3ef96512c-kube-api-access-9p4rd\") pod \"node-ca-29rc7\" (UID: \"254ea4ca-f9d7-452a-9868-bdf3ef96512c\") " pod="openshift-image-registry/node-ca-29rc7" Apr 24 19:06:39.452895 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.452796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjww5\" (UniqueName: \"kubernetes.io/projected/4a6d24c7-d9ec-4b20-98cd-af5850b0074f-kube-api-access-xjww5\") pod \"ovnkube-node-2thj7\" (UID: \"4a6d24c7-d9ec-4b20-98cd-af5850b0074f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:06:39.453502 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.453465 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qcvf\" (UniqueName: \"kubernetes.io/projected/2cf109ce-0195-4971-a0e3-86ad92c8ed1f-kube-api-access-9qcvf\") pod \"tuned-9vthc\" (UID: \"2cf109ce-0195-4971-a0e3-86ad92c8ed1f\") " pod="openshift-cluster-node-tuning-operator/tuned-9vthc" Apr 24 19:06:39.458196 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.458140 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-52.ec2.internal" event={"ID":"a2e4dc57eaf3ecec9842fb4e7d99fd2d","Type":"ContainerStarted","Data":"67bbb7686c318bb649f7fe0c0d650e2570dd1aeb95d95bdd62323f0ce1114e48"} Apr 24 19:06:39.459346 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.459307 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" event={"ID":"d2457aa0bf4a936b2d49c728267a6998","Type":"ContainerStarted","Data":"1b74b8e4493e0ea49b2131b734bf23a272b77e507da3bc5f8d0f39b3ee9d5f7d"} Apr 24 19:06:39.545365 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545327 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-multus-socket-dir-parent\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.545365 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-etc-kubernetes\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn" Apr 24 19:06:39.545597 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545395 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-944x9\" (UniqueName: \"kubernetes.io/projected/76172245-47dc-4f2f-90c9-d345a816e233-kube-api-access-944x9\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:39.545597 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545419 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-kubelet-config\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:39.545597 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545445 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q"
Apr 24 19:06:39.545597 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545452 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-multus-socket-dir-parent\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.545597 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545471 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c668390-7023-447b-92b4-e95d0c65f6cd-host-slash\") pod \"iptables-alerter-x2h2w\" (UID: \"4c668390-7023-447b-92b4-e95d0c65f6cd\") " pod="openshift-network-operator/iptables-alerter-x2h2w"
Apr 24 19:06:39.545597 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545498 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-run-k8s-cni-cncf-io\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.545597 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545508 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-kubelet-config\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q"
Apr 24 19:06:39.545597 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh"
Apr 24 19:06:39.545597 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545557 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dhx\" (UniqueName: \"kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx\") pod \"network-check-target-nlzd4\" (UID: \"22f62d88-7d18-4fc4-a8b1-44efd0814325\") " pod="openshift-network-diagnostics/network-check-target-nlzd4"
Apr 24 19:06:39.545597 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545564 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c668390-7023-447b-92b4-e95d0c65f6cd-host-slash\") pod \"iptables-alerter-x2h2w\" (UID: \"4c668390-7023-447b-92b4-e95d0c65f6cd\") " pod="openshift-network-operator/iptables-alerter-x2h2w"
Apr 24 19:06:39.545597 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545588 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqvjp\" (UniqueName: \"kubernetes.io/projected/4c668390-7023-447b-92b4-e95d0c65f6cd-kube-api-access-xqvjp\") pod \"iptables-alerter-x2h2w\" (UID: \"4c668390-7023-447b-92b4-e95d0c65f6cd\") " pod="openshift-network-operator/iptables-alerter-x2h2w"
Apr 24 19:06:39.545597 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:39.545594 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545634 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9ncj\" (UniqueName: \"kubernetes.io/projected/65e67b5d-323b-47d6-ac53-b3da03f832e6-kube-api-access-p9ncj\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:39.545676 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:39.545696 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret podName:e583cac8-fcbc-4fa3-a2f4-d8b1fad99146 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:40.045647608 +0000 UTC m=+3.157483939 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret") pod "global-pull-secret-syncer-dns5q" (UID: "e583cac8-fcbc-4fa3-a2f4-d8b1fad99146") : object "kube-system"/"original-pull-secret" not registered
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:39.545721 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs podName:76172245-47dc-4f2f-90c9-d345a816e233 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:40.045708295 +0000 UTC m=+3.157544629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs") pod "network-metrics-daemon-nghhh" (UID: "76172245-47dc-4f2f-90c9-d345a816e233") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545602 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-etc-kubernetes\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545756 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c668390-7023-447b-92b4-e95d0c65f6cd-iptables-alerter-script\") pod \"iptables-alerter-x2h2w\" (UID: \"4c668390-7023-447b-92b4-e95d0c65f6cd\") " pod="openshift-network-operator/iptables-alerter-x2h2w"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-var-lib-cni-multus\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545812 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-system-cni-dir\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545811 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-run-k8s-cni-cncf-io\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545838 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/65e67b5d-323b-47d6-ac53-b3da03f832e6-cni-binary-copy\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545849 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-var-lib-cni-multus\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545871 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-var-lib-cni-bin\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545888 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-hostroot\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545902 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-multus-conf-dir\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545919 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-cnibin\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545933 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-run-netns\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545948 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/65e67b5d-323b-47d6-ac53-b3da03f832e6-multus-daemon-config\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545962 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-dbus\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.545990 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-multus-cni-dir\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546030 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-run-netns\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546034 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-multus-conf-dir\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546051 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-system-cni-dir\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546079 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-var-lib-cni-bin\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546090 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-var-lib-kubelet\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546091 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-cnibin\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-run-multus-certs\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546159 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-os-release\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546177 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-multus-cni-dir\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546205 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-dbus\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546244 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-os-release\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546245 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-hostroot\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546273 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-run-multus-certs\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546290 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/65e67b5d-323b-47d6-ac53-b3da03f832e6-host-var-lib-kubelet\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546462 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/65e67b5d-323b-47d6-ac53-b3da03f832e6-cni-binary-copy\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.546881 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546507 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/65e67b5d-323b-47d6-ac53-b3da03f832e6-multus-daemon-config\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.547759 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.546594 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c668390-7023-447b-92b4-e95d0c65f6cd-iptables-alerter-script\") pod \"iptables-alerter-x2h2w\" (UID: \"4c668390-7023-447b-92b4-e95d0c65f6cd\") " pod="openshift-network-operator/iptables-alerter-x2h2w"
Apr 24 19:06:39.552858 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.552834 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 19:06:39.554842 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:39.554788 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:06:39.554842 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:39.554813 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:06:39.554842 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:39.554827 2568 projected.go:194] Error preparing data for projected volume kube-api-access-x2dhx for pod openshift-network-diagnostics/network-check-target-nlzd4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:39.555202 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:39.554941 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx podName:22f62d88-7d18-4fc4-a8b1-44efd0814325 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:40.05492028 +0000 UTC m=+3.166756604 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-x2dhx" (UniqueName: "kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx") pod "network-check-target-nlzd4" (UID: "22f62d88-7d18-4fc4-a8b1-44efd0814325") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:39.557875 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.557848 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqvjp\" (UniqueName: \"kubernetes.io/projected/4c668390-7023-447b-92b4-e95d0c65f6cd-kube-api-access-xqvjp\") pod \"iptables-alerter-x2h2w\" (UID: \"4c668390-7023-447b-92b4-e95d0c65f6cd\") " pod="openshift-network-operator/iptables-alerter-x2h2w"
Apr 24 19:06:39.558157 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.558136 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9ncj\" (UniqueName: \"kubernetes.io/projected/65e67b5d-323b-47d6-ac53-b3da03f832e6-kube-api-access-p9ncj\") pod \"multus-w4nmn\" (UID: \"65e67b5d-323b-47d6-ac53-b3da03f832e6\") " pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:39.558514 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.558492 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-944x9\" (UniqueName: \"kubernetes.io/projected/76172245-47dc-4f2f-90c9-d345a816e233-kube-api-access-944x9\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh"
Apr 24 19:06:39.625584 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.625553 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gbmct"
Apr 24 19:06:39.632386 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.632362 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv"
Apr 24 19:06:39.642024 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.641995 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-29rc7"
Apr 24 19:06:39.646795 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.646767 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7"
Apr 24 19:06:39.654501 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.654475 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-67l2j"
Apr 24 19:06:39.660517 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.660495 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9vthc"
Apr 24 19:06:39.667153 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.667131 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-x2h2w"
Apr 24 19:06:39.672809 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:39.672786 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w4nmn"
Apr 24 19:06:40.048632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.048597 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q"
Apr 24 19:06:40.048632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.048644 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh"
Apr 24 19:06:40.048890 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:40.048761 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 19:06:40.048890 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:40.048854 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret podName:e583cac8-fcbc-4fa3-a2f4-d8b1fad99146 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:41.048834372 +0000 UTC m=+4.160670706 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret") pod "global-pull-secret-syncer-dns5q" (UID: "e583cac8-fcbc-4fa3-a2f4-d8b1fad99146") : object "kube-system"/"original-pull-secret" not registered
Apr 24 19:06:40.048890 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:40.048881 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:40.049050 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:40.048940 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs podName:76172245-47dc-4f2f-90c9-d345a816e233 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:41.048922873 +0000 UTC m=+4.160759234 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs") pod "network-metrics-daemon-nghhh" (UID: "76172245-47dc-4f2f-90c9-d345a816e233") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:40.121461 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:40.121432 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cf109ce_0195_4971_a0e3_86ad92c8ed1f.slice/crio-6dc9b822d4b4b1e34bc7c5d0fa7f1f5dd934e5b00b1d1b521b0cd19a90517a6e WatchSource:0}: Error finding container 6dc9b822d4b4b1e34bc7c5d0fa7f1f5dd934e5b00b1d1b521b0cd19a90517a6e: Status 404 returned error can't find the container with id 6dc9b822d4b4b1e34bc7c5d0fa7f1f5dd934e5b00b1d1b521b0cd19a90517a6e
Apr 24 19:06:40.122616 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:40.122592 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65e67b5d_323b_47d6_ac53_b3da03f832e6.slice/crio-9f50485a0f0f1197acad79bec6fa23f5802aee855b7872d3b56eb86a615525b8 WatchSource:0}: Error finding container 9f50485a0f0f1197acad79bec6fa23f5802aee855b7872d3b56eb86a615525b8: Status 404 returned error can't find the container with id 9f50485a0f0f1197acad79bec6fa23f5802aee855b7872d3b56eb86a615525b8
Apr 24 19:06:40.123745 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:40.123723 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b140811_2690_4060_a32d_14cb088e3605.slice/crio-b5cfb2793976019a8b3b04fbd1894ba4b7282147b0827cec249d3b8e5f086a3f WatchSource:0}: Error finding container b5cfb2793976019a8b3b04fbd1894ba4b7282147b0827cec249d3b8e5f086a3f: Status 404 returned error can't find the container with id b5cfb2793976019a8b3b04fbd1894ba4b7282147b0827cec249d3b8e5f086a3f
Apr 24 19:06:40.127595 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:40.127564 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod254ea4ca_f9d7_452a_9868_bdf3ef96512c.slice/crio-4e5ce185493a357688a5195c68473c457748cd044653167bc878cd587e7e4cc4 WatchSource:0}: Error finding container 4e5ce185493a357688a5195c68473c457748cd044653167bc878cd587e7e4cc4: Status 404 returned error can't find the container with id 4e5ce185493a357688a5195c68473c457748cd044653167bc878cd587e7e4cc4
Apr 24 19:06:40.129281 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:40.129252 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda96d8dd3_8216_48b7_a304_75026c92aa95.slice/crio-b2e200edc413b847a688c8319431ba1f1d902fb8c709b391acac3b01718dccd2 WatchSource:0}: Error finding container b2e200edc413b847a688c8319431ba1f1d902fb8c709b391acac3b01718dccd2: Status 404 returned error can't find the container with id b2e200edc413b847a688c8319431ba1f1d902fb8c709b391acac3b01718dccd2
Apr 24 19:06:40.129938 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:40.129914 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a6d24c7_d9ec_4b20_98cd_af5850b0074f.slice/crio-80f8bf3f12aadcb776922fc1fb0708e0b347b877049ae6c8c9718273b60df120 WatchSource:0}: Error finding container 80f8bf3f12aadcb776922fc1fb0708e0b347b877049ae6c8c9718273b60df120: Status 404 returned error can't find the container with id 80f8bf3f12aadcb776922fc1fb0708e0b347b877049ae6c8c9718273b60df120
Apr 24 19:06:40.130682 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:40.130550 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc00a2b3_d877_4828_bc79_de040ea70887.slice/crio-d0c3947892fd217f46b6b9b9a273054437e316f7eb90f16e75cb1750bf145883 WatchSource:0}: Error finding container d0c3947892fd217f46b6b9b9a273054437e316f7eb90f16e75cb1750bf145883: Status 404 returned error can't find the container with id d0c3947892fd217f46b6b9b9a273054437e316f7eb90f16e75cb1750bf145883
Apr 24 19:06:40.131604 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:06:40.131568 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c668390_7023_447b_92b4_e95d0c65f6cd.slice/crio-da6b8650fcf6c47a9be0454f13ae492ae80253fe365389ebfaae500b431e653c WatchSource:0}: Error finding container da6b8650fcf6c47a9be0454f13ae492ae80253fe365389ebfaae500b431e653c: Status 404 returned error can't find the container with id da6b8650fcf6c47a9be0454f13ae492ae80253fe365389ebfaae500b431e653c
Apr 24 19:06:40.149910 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.149879 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dhx\" (UniqueName: \"kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx\") pod \"network-check-target-nlzd4\" (UID: \"22f62d88-7d18-4fc4-a8b1-44efd0814325\") " pod="openshift-network-diagnostics/network-check-target-nlzd4"
Apr 24 19:06:40.150052 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:40.150028 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:06:40.150052 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:40.150047 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:06:40.150150 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:40.150056 2568 projected.go:194] Error preparing data for projected volume kube-api-access-x2dhx for pod openshift-network-diagnostics/network-check-target-nlzd4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:40.150150 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:40.150115 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx podName:22f62d88-7d18-4fc4-a8b1-44efd0814325 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:41.150088058 +0000 UTC m=+4.261924381 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-x2dhx" (UniqueName: "kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx") pod "network-check-target-nlzd4" (UID: "22f62d88-7d18-4fc4-a8b1-44efd0814325") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:40.377661 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.377450 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:01:38 +0000 UTC" deadline="2027-10-21 22:56:53.917519692 +0000 UTC"
Apr 24 19:06:40.377661 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.377655 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13083h50m13.539868778s"
Apr 24 19:06:40.454254 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.454162 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q"
Apr 24 19:06:40.454414 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:40.454281 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146"
Apr 24 19:06:40.462473 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.462437 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" event={"ID":"0b140811-2690-4060-a32d-14cb088e3605","Type":"ContainerStarted","Data":"b5cfb2793976019a8b3b04fbd1894ba4b7282147b0827cec249d3b8e5f086a3f"}
Apr 24 19:06:40.464398 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.464369 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w4nmn" event={"ID":"65e67b5d-323b-47d6-ac53-b3da03f832e6","Type":"ContainerStarted","Data":"9f50485a0f0f1197acad79bec6fa23f5802aee855b7872d3b56eb86a615525b8"}
Apr 24 19:06:40.466250 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.466222 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9vthc" event={"ID":"2cf109ce-0195-4971-a0e3-86ad92c8ed1f","Type":"ContainerStarted","Data":"6dc9b822d4b4b1e34bc7c5d0fa7f1f5dd934e5b00b1d1b521b0cd19a90517a6e"}
Apr 24 19:06:40.468666 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.468635 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-x2h2w" event={"ID":"4c668390-7023-447b-92b4-e95d0c65f6cd","Type":"ContainerStarted","Data":"da6b8650fcf6c47a9be0454f13ae492ae80253fe365389ebfaae500b431e653c"}
Apr 24 19:06:40.469988 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.469961 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gbmct" event={"ID":"bc00a2b3-d877-4828-bc79-de040ea70887","Type":"ContainerStarted","Data":"d0c3947892fd217f46b6b9b9a273054437e316f7eb90f16e75cb1750bf145883"}
Apr 24 19:06:40.472164 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.472096 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" event={"ID":"4a6d24c7-d9ec-4b20-98cd-af5850b0074f","Type":"ContainerStarted","Data":"80f8bf3f12aadcb776922fc1fb0708e0b347b877049ae6c8c9718273b60df120"}
Apr 24 19:06:40.473754 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.473729 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-29rc7" event={"ID":"254ea4ca-f9d7-452a-9868-bdf3ef96512c","Type":"ContainerStarted","Data":"4e5ce185493a357688a5195c68473c457748cd044653167bc878cd587e7e4cc4"}
Apr 24 19:06:40.479270 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.478615 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-52.ec2.internal" event={"ID":"a2e4dc57eaf3ecec9842fb4e7d99fd2d","Type":"ContainerStarted","Data":"35b11cc5fa450003ad7e3c6f15e58e5c13a818f25cc776c21239abeed6c856d5"}
Apr 24 19:06:40.481711 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:40.481681 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-67l2j" event={"ID":"a96d8dd3-8216-48b7-a304-75026c92aa95","Type":"ContainerStarted","Data":"b2e200edc413b847a688c8319431ba1f1d902fb8c709b391acac3b01718dccd2"}
Apr 24 19:06:41.058506 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:41.057545 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q"
Apr 24 19:06:41.058506 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:41.057599 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh"
Apr 24 19:06:41.058506 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:41.057790 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:41.058506 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:41.057855 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs podName:76172245-47dc-4f2f-90c9-d345a816e233 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:43.057834841 +0000 UTC m=+6.169671168 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs") pod "network-metrics-daemon-nghhh" (UID: "76172245-47dc-4f2f-90c9-d345a816e233") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:41.058506 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:41.058324 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:41.058506 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:41.058451 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret podName:e583cac8-fcbc-4fa3-a2f4-d8b1fad99146 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:43.058432748 +0000 UTC m=+6.170269077 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret") pod "global-pull-secret-syncer-dns5q" (UID: "e583cac8-fcbc-4fa3-a2f4-d8b1fad99146") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:41.158533 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:41.158483 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dhx\" (UniqueName: \"kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx\") pod \"network-check-target-nlzd4\" (UID: \"22f62d88-7d18-4fc4-a8b1-44efd0814325\") " pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:41.158756 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:41.158741 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:41.158832 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:41.158763 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:41.158832 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:41.158775 2568 projected.go:194] Error preparing data for projected volume kube-api-access-x2dhx for pod openshift-network-diagnostics/network-check-target-nlzd4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:41.158932 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:41.158863 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx podName:22f62d88-7d18-4fc4-a8b1-44efd0814325 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:43.158820427 +0000 UTC m=+6.270656757 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x2dhx" (UniqueName: "kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx") pod "network-check-target-nlzd4" (UID: "22f62d88-7d18-4fc4-a8b1-44efd0814325") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:41.408533 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:41.408500 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:41.457053 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:41.456427 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:41.457053 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:41.456431 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:41.457053 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:41.456596 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:06:41.458856 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:41.457521 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:06:41.502157 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:41.502120 2568 generic.go:358] "Generic (PLEG): container finished" podID="d2457aa0bf4a936b2d49c728267a6998" containerID="2b49e57454b9bd61a10c90b59c7c7a9d12d802092d2edd2d3512d20c931ed2b0" exitCode=0 Apr 24 19:06:41.502339 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:41.502263 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" event={"ID":"d2457aa0bf4a936b2d49c728267a6998","Type":"ContainerDied","Data":"2b49e57454b9bd61a10c90b59c7c7a9d12d802092d2edd2d3512d20c931ed2b0"} Apr 24 19:06:41.520194 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:41.517487 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-52.ec2.internal" podStartSLOduration=3.517467979 podStartE2EDuration="3.517467979s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:06:40.494437792 +0000 UTC m=+3.606274139" watchObservedRunningTime="2026-04-24 19:06:41.517467979 +0000 UTC m=+4.629304327" Apr 24 19:06:42.454965 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:42.454930 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:42.455467 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:42.455064 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146" Apr 24 19:06:42.509912 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:42.509876 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" event={"ID":"d2457aa0bf4a936b2d49c728267a6998","Type":"ContainerStarted","Data":"114bc63d3b8387ea7413d407ec1f53d258bd3a70becd72fd45b307e13cedf190"} Apr 24 19:06:43.077249 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:43.076354 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:43.077249 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:43.076440 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:43.077249 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:43.076605 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:43.077249 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:43.076710 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs podName:76172245-47dc-4f2f-90c9-d345a816e233 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:47.076690242 +0000 UTC m=+10.188526570 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs") pod "network-metrics-daemon-nghhh" (UID: "76172245-47dc-4f2f-90c9-d345a816e233") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:43.077249 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:43.077166 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:43.077249 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:43.077215 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret podName:e583cac8-fcbc-4fa3-a2f4-d8b1fad99146 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:47.077200298 +0000 UTC m=+10.189036629 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret") pod "global-pull-secret-syncer-dns5q" (UID: "e583cac8-fcbc-4fa3-a2f4-d8b1fad99146") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:43.177546 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:43.177482 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dhx\" (UniqueName: \"kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx\") pod \"network-check-target-nlzd4\" (UID: \"22f62d88-7d18-4fc4-a8b1-44efd0814325\") " pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:43.177730 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:43.177624 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:43.177730 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:43.177645 2568 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:43.177730 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:43.177660 2568 projected.go:194] Error preparing data for projected volume kube-api-access-x2dhx for pod openshift-network-diagnostics/network-check-target-nlzd4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:43.177730 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:43.177720 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx podName:22f62d88-7d18-4fc4-a8b1-44efd0814325 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:47.17770239 +0000 UTC m=+10.289538720 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-x2dhx" (UniqueName: "kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx") pod "network-check-target-nlzd4" (UID: "22f62d88-7d18-4fc4-a8b1-44efd0814325") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:43.454837 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:43.454802 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:43.454837 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:43.454822 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:43.455395 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:43.454921 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:06:43.455395 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:43.455067 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:06:44.454233 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:44.454196 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:44.454438 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:44.454394 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146" Apr 24 19:06:45.454533 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:45.454500 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:45.455126 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:45.454500 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:45.455126 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:45.454650 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:06:45.455126 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:45.454688 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:06:46.454772 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:46.454736 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:46.455300 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:46.454873 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146" Apr 24 19:06:47.111452 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:47.111404 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:47.111643 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:47.111461 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:47.111643 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:47.111636 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:47.111765 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:47.111703 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs podName:76172245-47dc-4f2f-90c9-d345a816e233 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:55.111684442 +0000 UTC m=+18.223520772 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs") pod "network-metrics-daemon-nghhh" (UID: "76172245-47dc-4f2f-90c9-d345a816e233") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:47.112037 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:47.111941 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:47.112037 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:47.112013 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret podName:e583cac8-fcbc-4fa3-a2f4-d8b1fad99146 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:55.111995055 +0000 UTC m=+18.223831400 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret") pod "global-pull-secret-syncer-dns5q" (UID: "e583cac8-fcbc-4fa3-a2f4-d8b1fad99146") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:47.212200 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:47.212159 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dhx\" (UniqueName: \"kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx\") pod \"network-check-target-nlzd4\" (UID: \"22f62d88-7d18-4fc4-a8b1-44efd0814325\") " pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:47.212395 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:47.212364 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:47.212395 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:47.212385 2568 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:47.212513 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:47.212397 2568 projected.go:194] Error preparing data for projected volume kube-api-access-x2dhx for pod openshift-network-diagnostics/network-check-target-nlzd4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:47.212513 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:47.212457 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx podName:22f62d88-7d18-4fc4-a8b1-44efd0814325 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:55.212438455 +0000 UTC m=+18.324274802 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-x2dhx" (UniqueName: "kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx") pod "network-check-target-nlzd4" (UID: "22f62d88-7d18-4fc4-a8b1-44efd0814325") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:47.456209 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:47.455543 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:47.456209 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:47.455668 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:06:47.456209 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:47.456013 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:47.456209 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:47.456117 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:06:48.454960 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:48.454925 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:48.455161 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:48.455049 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146" Apr 24 19:06:49.454852 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:49.454813 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:49.455357 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:49.454813 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:49.455357 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:49.454968 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:06:49.455357 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:49.455024 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:06:50.454824 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:50.454783 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:50.454998 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:50.454921 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146" Apr 24 19:06:51.454789 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:51.454702 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:51.454789 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:51.454736 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:51.455027 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:51.454842 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:06:51.455027 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:51.454933 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:06:52.454119 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:52.454069 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:52.454280 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:52.454225 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146" Apr 24 19:06:53.455120 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:53.455076 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:53.455576 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:53.455077 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:53.455627 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:53.455608 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:06:53.457980 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:53.455925 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:06:54.454866 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:54.454829 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:54.455191 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:54.454959 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146" Apr 24 19:06:55.172374 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:55.172333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:55.172648 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:55.172392 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:55.172648 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:55.172508 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:55.172648 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:55.172515 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:55.172648 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:55.172588 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret podName:e583cac8-fcbc-4fa3-a2f4-d8b1fad99146 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.1725671 +0000 UTC m=+34.284403444 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret") pod "global-pull-secret-syncer-dns5q" (UID: "e583cac8-fcbc-4fa3-a2f4-d8b1fad99146") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:55.172648 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:55.172649 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs podName:76172245-47dc-4f2f-90c9-d345a816e233 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.172631108 +0000 UTC m=+34.284467432 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs") pod "network-metrics-daemon-nghhh" (UID: "76172245-47dc-4f2f-90c9-d345a816e233") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:55.273478 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:55.273426 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dhx\" (UniqueName: \"kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx\") pod \"network-check-target-nlzd4\" (UID: \"22f62d88-7d18-4fc4-a8b1-44efd0814325\") " pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:55.273672 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:55.273638 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:55.273672 ip-10-0-138-52 
kubenswrapper[2568]: E0424 19:06:55.273669 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:55.273864 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:55.273683 2568 projected.go:194] Error preparing data for projected volume kube-api-access-x2dhx for pod openshift-network-diagnostics/network-check-target-nlzd4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:55.273864 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:55.273745 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx podName:22f62d88-7d18-4fc4-a8b1-44efd0814325 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.273726567 +0000 UTC m=+34.385562907 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-x2dhx" (UniqueName: "kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx") pod "network-check-target-nlzd4" (UID: "22f62d88-7d18-4fc4-a8b1-44efd0814325") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:55.454718 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:55.454630 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:55.454880 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:55.454791 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:06:55.454880 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:55.454834 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:55.455002 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:55.454907 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:06:56.454284 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:56.454247 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:56.454676 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:56.454354 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146" Apr 24 19:06:57.454903 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.454738 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:57.455487 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.454799 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:57.455487 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:57.454964 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:06:57.455487 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:57.455056 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:06:57.537435 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.537398 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-67l2j" event={"ID":"a96d8dd3-8216-48b7-a304-75026c92aa95","Type":"ContainerStarted","Data":"1be3c9ed7285dacd9b07cf50d88e0297d14456bce1a7dd2e239d7a4f1fb8b1ab"} Apr 24 19:06:57.538903 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.538873 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" event={"ID":"0b140811-2690-4060-a32d-14cb088e3605","Type":"ContainerStarted","Data":"32e41dc15b4af27173b33e6fe30422767551a0d9d0415dc2ab490e5f2c1828e4"} Apr 24 19:06:57.540334 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.540301 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w4nmn" 
event={"ID":"65e67b5d-323b-47d6-ac53-b3da03f832e6","Type":"ContainerStarted","Data":"e34f002b6033a34940957b247509fde30878a5befdac5d122ae8d0e90e8ad9ca"} Apr 24 19:06:57.541799 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.541770 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9vthc" event={"ID":"2cf109ce-0195-4971-a0e3-86ad92c8ed1f","Type":"ContainerStarted","Data":"90169ec2ff0022f6114ce9714f797f81b91a4d0c5de55d956571dfd5c7de7a18"} Apr 24 19:06:57.543212 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.543175 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gbmct" event={"ID":"bc00a2b3-d877-4828-bc79-de040ea70887","Type":"ContainerStarted","Data":"24e900b019e178c0dc703ae55dbfabb6519b6bd1c484fd0ba72e2924b0769bae"} Apr 24 19:06:57.545006 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.544989 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log" Apr 24 19:06:57.545343 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.545317 2568 generic.go:358] "Generic (PLEG): container finished" podID="4a6d24c7-d9ec-4b20-98cd-af5850b0074f" containerID="bd3e74333ad385f6d1dc076bcea1f085aef412a600ba818c527848b8b2be6655" exitCode=1 Apr 24 19:06:57.545451 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.545338 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" event={"ID":"4a6d24c7-d9ec-4b20-98cd-af5850b0074f","Type":"ContainerStarted","Data":"9023ebc77db068f3f4823c6e0da6368ff7cd5564678f8c8f69d0e2d5303f47fc"} Apr 24 19:06:57.545451 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.545373 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" 
event={"ID":"4a6d24c7-d9ec-4b20-98cd-af5850b0074f","Type":"ContainerDied","Data":"bd3e74333ad385f6d1dc076bcea1f085aef412a600ba818c527848b8b2be6655"} Apr 24 19:06:57.545451 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.545388 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" event={"ID":"4a6d24c7-d9ec-4b20-98cd-af5850b0074f","Type":"ContainerStarted","Data":"bc824fe19f1fb483767d4e1764d84bdb0eb85da8692a14756b4fb710e9f0766f"} Apr 24 19:06:57.546768 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.546739 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-29rc7" event={"ID":"254ea4ca-f9d7-452a-9868-bdf3ef96512c","Type":"ContainerStarted","Data":"6417d5c11ae05b2a6308c445c61bca00c47a73f6cbc916977121355109c7f914"} Apr 24 19:06:57.551290 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.551239 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-52.ec2.internal" podStartSLOduration=19.551223644 podStartE2EDuration="19.551223644s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:06:42.523692214 +0000 UTC m=+5.635528561" watchObservedRunningTime="2026-04-24 19:06:57.551223644 +0000 UTC m=+20.663059990" Apr 24 19:06:57.551654 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.551618 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-67l2j" podStartSLOduration=3.7778374169999998 podStartE2EDuration="20.551607361s" podCreationTimestamp="2026-04-24 19:06:37 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.131276696 +0000 UTC m=+3.243113020" lastFinishedPulling="2026-04-24 19:06:56.90504663 +0000 UTC m=+20.016882964" observedRunningTime="2026-04-24 19:06:57.551177406 +0000 UTC 
m=+20.663013752" watchObservedRunningTime="2026-04-24 19:06:57.551607361 +0000 UTC m=+20.663443768" Apr 24 19:06:57.565617 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.565552 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-29rc7" podStartSLOduration=3.790538958 podStartE2EDuration="20.565532278s" podCreationTimestamp="2026-04-24 19:06:37 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.129977003 +0000 UTC m=+3.241813330" lastFinishedPulling="2026-04-24 19:06:56.904970326 +0000 UTC m=+20.016806650" observedRunningTime="2026-04-24 19:06:57.564885183 +0000 UTC m=+20.676721529" watchObservedRunningTime="2026-04-24 19:06:57.565532278 +0000 UTC m=+20.677368625" Apr 24 19:06:57.578838 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.578793 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9vthc" podStartSLOduration=3.775703645 podStartE2EDuration="20.578778076s" podCreationTimestamp="2026-04-24 19:06:37 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.12443379 +0000 UTC m=+3.236270131" lastFinishedPulling="2026-04-24 19:06:56.927508238 +0000 UTC m=+20.039344562" observedRunningTime="2026-04-24 19:06:57.578688417 +0000 UTC m=+20.690524761" watchObservedRunningTime="2026-04-24 19:06:57.578778076 +0000 UTC m=+20.690614419" Apr 24 19:06:57.595748 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:57.595684 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-w4nmn" podStartSLOduration=3.756283158 podStartE2EDuration="20.59566554s" podCreationTimestamp="2026-04-24 19:06:37 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.124807897 +0000 UTC m=+3.236644235" lastFinishedPulling="2026-04-24 19:06:56.964190282 +0000 UTC m=+20.076026617" observedRunningTime="2026-04-24 19:06:57.595303476 +0000 UTC m=+20.707139847" watchObservedRunningTime="2026-04-24 19:06:57.59566554 +0000 UTC 
m=+20.707501890" Apr 24 19:06:58.227998 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.227834 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 19:06:58.419476 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.419360 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T19:06:58.227990547Z","UUID":"d6b58e61-b00b-4c4a-a791-3660d358c252","Handler":null,"Name":"","Endpoint":""} Apr 24 19:06:58.422015 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.421990 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 19:06:58.422172 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.422022 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 19:06:58.454933 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.454905 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:06:58.455424 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:58.455008 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146" Apr 24 19:06:58.550562 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.550504 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" event={"ID":"0b140811-2690-4060-a32d-14cb088e3605","Type":"ContainerStarted","Data":"66e942d45bf73a31298198c76876561abcd208b6079431d0c65d3932667781af"} Apr 24 19:06:58.553471 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.553436 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-x2h2w" event={"ID":"4c668390-7023-447b-92b4-e95d0c65f6cd","Type":"ContainerStarted","Data":"4c351e81b9c83d4af06d14aa99d8d4d2b117e90ce8ef332cb74f97333244de3a"} Apr 24 19:06:58.554584 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.554560 2568 generic.go:358] "Generic (PLEG): container finished" podID="bc00a2b3-d877-4828-bc79-de040ea70887" containerID="24e900b019e178c0dc703ae55dbfabb6519b6bd1c484fd0ba72e2924b0769bae" exitCode=0 Apr 24 19:06:58.554711 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.554638 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gbmct" event={"ID":"bc00a2b3-d877-4828-bc79-de040ea70887","Type":"ContainerDied","Data":"24e900b019e178c0dc703ae55dbfabb6519b6bd1c484fd0ba72e2924b0769bae"} Apr 24 19:06:58.557047 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.557030 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log" Apr 24 19:06:58.557523 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.557488 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" event={"ID":"4a6d24c7-d9ec-4b20-98cd-af5850b0074f","Type":"ContainerStarted","Data":"e01a4cf788a063fea15b73297cb98d2bf42eec292d7afa1cee87968d89cfe134"} 
Apr 24 19:06:58.557632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.557533 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" event={"ID":"4a6d24c7-d9ec-4b20-98cd-af5850b0074f","Type":"ContainerStarted","Data":"80daaa9dadaf8e8d254e862e5b5a68adac2f71d45db7f1f1961dd121b7f20df4"} Apr 24 19:06:58.557632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.557548 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" event={"ID":"4a6d24c7-d9ec-4b20-98cd-af5850b0074f","Type":"ContainerStarted","Data":"d0564b027605e2bcea13749f4eced0481917c1f7f60bbcd9c0135f0fd40802c7"} Apr 24 19:06:58.567199 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.567152 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-x2h2w" podStartSLOduration=4.794181969 podStartE2EDuration="21.567139462s" podCreationTimestamp="2026-04-24 19:06:37 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.154552513 +0000 UTC m=+3.266388836" lastFinishedPulling="2026-04-24 19:06:56.927509991 +0000 UTC m=+20.039346329" observedRunningTime="2026-04-24 19:06:58.566645273 +0000 UTC m=+21.678481619" watchObservedRunningTime="2026-04-24 19:06:58.567139462 +0000 UTC m=+21.678975807" Apr 24 19:06:58.788705 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.788613 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-67l2j" Apr 24 19:06:58.789355 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:58.789338 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-67l2j" Apr 24 19:06:59.454864 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:59.454822 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:06:59.454864 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:59.454863 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:06:59.455443 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:59.454961 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:06:59.455443 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:06:59.455116 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:06:59.562114 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:59.562002 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" event={"ID":"0b140811-2690-4060-a32d-14cb088e3605","Type":"ContainerStarted","Data":"8927400d5ea3bc5e238568c552d32022bfeb7eac32c84468eee164b3706fcd40"} Apr 24 19:06:59.579545 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:06:59.579492 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gkmfv" podStartSLOduration=3.449854952 podStartE2EDuration="22.579477683s" podCreationTimestamp="2026-04-24 19:06:37 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.125842518 +0000 UTC m=+3.237678845" lastFinishedPulling="2026-04-24 19:06:59.255465251 +0000 UTC m=+22.367301576" observedRunningTime="2026-04-24 19:06:59.577833902 +0000 UTC m=+22.689670249" watchObservedRunningTime="2026-04-24 19:06:59.579477683 +0000 UTC m=+22.691314028" Apr 24 19:07:00.455025 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:00.454830 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:07:00.455256 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:00.455132 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146" Apr 24 19:07:00.566951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:00.566914 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log" Apr 24 19:07:00.567335 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:00.567297 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" event={"ID":"4a6d24c7-d9ec-4b20-98cd-af5850b0074f","Type":"ContainerStarted","Data":"6b742d299341b02fb69e5fa97ca66b2679e0b36e55cc29f500d5f728abbd2580"} Apr 24 19:07:00.567448 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:00.567385 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 19:07:01.454639 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:01.454606 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:07:01.454639 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:01.454650 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:07:01.454908 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:01.454748 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:07:01.454967 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:01.454911 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:07:02.454488 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:02.454455 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:07:02.454985 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:02.454567 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146" Apr 24 19:07:03.244330 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.244135 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-67l2j" Apr 24 19:07:03.244488 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.244411 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 19:07:03.244673 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.244653 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-67l2j" Apr 24 19:07:03.454703 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.454677 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:07:03.454703 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.454701 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:07:03.455381 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:03.454771 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233" Apr 24 19:07:03.455381 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:03.454904 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325" Apr 24 19:07:03.577146 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.577050 2568 generic.go:358] "Generic (PLEG): container finished" podID="bc00a2b3-d877-4828-bc79-de040ea70887" containerID="b81742f9e92c134ec097f8dfafcde9806dce294dba3df32f2b7c80abf350e214" exitCode=0 Apr 24 19:07:03.577294 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.577142 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gbmct" event={"ID":"bc00a2b3-d877-4828-bc79-de040ea70887","Type":"ContainerDied","Data":"b81742f9e92c134ec097f8dfafcde9806dce294dba3df32f2b7c80abf350e214"} Apr 24 19:07:03.580510 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.580491 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log" Apr 24 19:07:03.580886 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.580862 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" event={"ID":"4a6d24c7-d9ec-4b20-98cd-af5850b0074f","Type":"ContainerStarted","Data":"61d6d36cbc65fd2a304ef14123397ec0b48c50690e75970f89d35fbb06bc80d5"} Apr 24 19:07:03.581172 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.581149 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:07:03.581278 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.581186 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" Apr 24 19:07:03.581335 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.581295 2568 scope.go:117] "RemoveContainer" containerID="bd3e74333ad385f6d1dc076bcea1f085aef412a600ba818c527848b8b2be6655" Apr 24 19:07:03.597770 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:07:03.597644 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7"
Apr 24 19:07:03.597865 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:03.597790 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7"
Apr 24 19:07:04.454911 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.454876 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q"
Apr 24 19:07:04.455388 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:04.455030 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146"
Apr 24 19:07:04.512560 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.512488 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dns5q"]
Apr 24 19:07:04.513185 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.513141 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nghhh"]
Apr 24 19:07:04.513309 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.513292 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh"
Apr 24 19:07:04.513450 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:04.513415 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233"
Apr 24 19:07:04.513695 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.513669 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nlzd4"]
Apr 24 19:07:04.513796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.513782 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4"
Apr 24 19:07:04.513880 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:04.513863 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325"
Apr 24 19:07:04.584339 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.584302 2568 generic.go:358] "Generic (PLEG): container finished" podID="bc00a2b3-d877-4828-bc79-de040ea70887" containerID="7211b5e35cbf245286ac4a1b5bb7b1f30db854bdf2dd20d157c833981ba994f6" exitCode=0
Apr 24 19:07:04.584491 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.584361 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gbmct" event={"ID":"bc00a2b3-d877-4828-bc79-de040ea70887","Type":"ContainerDied","Data":"7211b5e35cbf245286ac4a1b5bb7b1f30db854bdf2dd20d157c833981ba994f6"}
Apr 24 19:07:04.587926 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.587906 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log"
Apr 24 19:07:04.588334 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.588313 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" event={"ID":"4a6d24c7-d9ec-4b20-98cd-af5850b0074f","Type":"ContainerStarted","Data":"27c0acca9c1c930eb04d60960da140f3d42668ee81628f58844d7d7a8c8f9746"}
Apr 24 19:07:04.588418 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.588333 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q"
Apr 24 19:07:04.588456 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.588423 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 19:07:04.588456 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:04.588427 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146"
Apr 24 19:07:04.636193 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:04.636138 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7" podStartSLOduration=10.762666269 podStartE2EDuration="27.636122459s" podCreationTimestamp="2026-04-24 19:06:37 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.154390804 +0000 UTC m=+3.266227141" lastFinishedPulling="2026-04-24 19:06:57.027846992 +0000 UTC m=+20.139683331" observedRunningTime="2026-04-24 19:07:04.635854905 +0000 UTC m=+27.747691264" watchObservedRunningTime="2026-04-24 19:07:04.636122459 +0000 UTC m=+27.747958799"
Apr 24 19:07:05.592290 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:05.592197 2568 generic.go:358] "Generic (PLEG): container finished" podID="bc00a2b3-d877-4828-bc79-de040ea70887" containerID="59936bb8a743bc327f98254a6ba83fe81b91b6335a773312bd0a14f31fe3a809" exitCode=0
Apr 24 19:07:05.592290 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:05.592279 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gbmct" event={"ID":"bc00a2b3-d877-4828-bc79-de040ea70887","Type":"ContainerDied","Data":"59936bb8a743bc327f98254a6ba83fe81b91b6335a773312bd0a14f31fe3a809"}
Apr 24 19:07:05.592794 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:05.592456 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 19:07:06.455002 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:06.454784 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q"
Apr 24 19:07:06.455177 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:06.454837 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh"
Apr 24 19:07:06.455177 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:06.455123 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146"
Apr 24 19:07:06.455177 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:06.454860 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4"
Apr 24 19:07:06.455357 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:06.455216 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233"
Apr 24 19:07:06.455357 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:06.455299 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325"
Apr 24 19:07:08.441114 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:08.441060 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7"
Apr 24 19:07:08.441707 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:08.441311 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 19:07:08.453892 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:08.453842 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2thj7"
Apr 24 19:07:08.454111 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:08.454073 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q"
Apr 24 19:07:08.454225 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:08.454113 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh"
Apr 24 19:07:08.454225 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:08.454078 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4"
Apr 24 19:07:08.454225 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:08.454209 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dns5q" podUID="e583cac8-fcbc-4fa3-a2f4-d8b1fad99146"
Apr 24 19:07:08.454385 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:08.454307 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nlzd4" podUID="22f62d88-7d18-4fc4-a8b1-44efd0814325"
Apr 24 19:07:08.454451 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:08.454406 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nghhh" podUID="76172245-47dc-4f2f-90c9-d345a816e233"
Apr 24 19:07:10.255120 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.255075 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-52.ec2.internal" event="NodeReady"
Apr 24 19:07:10.255807 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.255239 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 19:07:10.291384 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.291345 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5db89954c9-qmt95"]
Apr 24 19:07:10.294172 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.293882 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:07:10.295566 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.295469 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-bzksz"]
Apr 24 19:07:10.297050 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.296798 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-czn8s\""
Apr 24 19:07:10.297253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.297144 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 19:07:10.297923 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.297764 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 19:07:10.297923 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.297839 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 19:07:10.298083 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.297928 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-smbpl"]
Apr 24 19:07:10.298164 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.298115 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-bzksz"
Apr 24 19:07:10.299696 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.299667 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p"]
Apr 24 19:07:10.299795 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.299777 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-smbpl"
Apr 24 19:07:10.302004 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.301983 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5dd48cf8b4-xldzq"]
Apr 24 19:07:10.303303 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.303277 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 19:07:10.303753 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.303735 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 19:07:10.306253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.306232 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 19:07:10.307019 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.306999 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-c7256\""
Apr 24 19:07:10.307136 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.307016 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-z5nx6\""
Apr 24 19:07:10.307136 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.307030 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 19:07:10.307136 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.307044 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c"]
Apr 24 19:07:10.307293 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.307195 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p"
Apr 24 19:07:10.307351 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.307330 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5dd48cf8b4-xldzq"
Apr 24 19:07:10.307539 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.307520 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 19:07:10.307539 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.307533 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 19:07:10.310037 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.310001 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-zsl4c"]
Apr 24 19:07:10.310204 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.310183 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c"
Apr 24 19:07:10.312916 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.312890 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw"]
Apr 24 19:07:10.314638 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.313297 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c"
Apr 24 19:07:10.314638 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.313350 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 19:07:10.315823 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.315031 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 19:07:10.315823 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.315393 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nnzs7\""
Apr 24 19:07:10.315823 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.315586 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 19:07:10.315823 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.315396 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 19:07:10.316072 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.315845 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 19:07:10.316072 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.315994 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 24 19:07:10.316072 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.316007 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 19:07:10.316072 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.316055 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 19:07:10.316578 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.316273 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 19:07:10.316578 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.316289 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 19:07:10.316578 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.316562 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 19:07:10.317315 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.317274 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6gmp9"]
Apr 24 19:07:10.317543 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.317522 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw"
Apr 24 19:07:10.321753 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.321724 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-5cfhz\""
Apr 24 19:07:10.323470 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.323448 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6"]
Apr 24 19:07:10.323594 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.323569 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gmp9"
Apr 24 19:07:10.324160 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.324140 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 19:07:10.324253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.324218 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-2tzpd\""
Apr 24 19:07:10.325507 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.325489 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-smbpl"]
Apr 24 19:07:10.325627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.325612 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5db89954c9-qmt95"]
Apr 24 19:07:10.325769 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.325751 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6"
Apr 24 19:07:10.326449 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.326429 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 19:07:10.326541 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.326505 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 19:07:10.327051 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.327031 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p"]
Apr 24 19:07:10.329741 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.329720 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 19:07:10.333607 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.333567 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 19:07:10.333708 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.333677 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 19:07:10.334508 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.334488 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-zsl4c"]
Apr 24 19:07:10.343200 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.343176 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 19:07:10.343200 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.343195 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 19:07:10.343367 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.343180 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 19:07:10.343660 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.343641 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-rjhj9\""
Apr 24 19:07:10.358285 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.358258 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 19:07:10.358998 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.358630 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 19:07:10.358998 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.358703 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 19:07:10.358998 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.358807 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 19:07:10.358998 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.358913 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 19:07:10.359331 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.359195 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 19:07:10.359550 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.359503 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw"]
Apr 24 19:07:10.360377 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.360355 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6gmp9"]
Apr 24 19:07:10.367026 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.366030 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-lk99m\""
Apr 24 19:07:10.367026 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.366344 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-czt27\""
Apr 24 19:07:10.368200 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.367674 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 19:07:10.368200 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.367828 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 19:07:10.368200 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.368053 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-6dffk\""
Apr 24 19:07:10.370632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.368880 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 19:07:10.371388 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.371303 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5dd48cf8b4-xldzq"]
Apr 24 19:07:10.372621 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.372600 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sfgb9"]
Apr 24 19:07:10.376810 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.376790 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sfgb9"
Apr 24 19:07:10.377538 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.377511 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6"]
Apr 24 19:07:10.377642 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.377552 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-bzksz"]
Apr 24 19:07:10.378293 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.378255 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c"]
Apr 24 19:07:10.379704 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.379682 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 19:07:10.380208 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.380149 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 19:07:10.380302 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.380230 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sfgb9"]
Apr 24 19:07:10.380363 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.380320 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 19:07:10.380455 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.380436 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l8pmm\""
Apr 24 19:07:10.383618 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.383582 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 19:07:10.387026 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387003 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/5d21b7cf-8c3d-459b-a502-f049a7353d9f-snapshots\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz"
Apr 24 19:07:10.387156 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387047 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpkz2\" (UniqueName: \"kubernetes.io/projected/10fdb32e-ef53-491d-922c-4d9e4f2531f0-kube-api-access-xpkz2\") pod \"volume-data-source-validator-7c6cbb6c87-smbpl\" (UID: \"10fdb32e-ef53-491d-922c-4d9e4f2531f0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-smbpl"
Apr 24 19:07:10.387156 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387076 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p"
Apr 24 19:07:10.387274 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387173 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-certificates\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:07:10.387274 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387201 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj5m8\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-kube-api-access-gj5m8\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:07:10.387387 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387299 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-bound-sa-token\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:07:10.387387 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387342 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d21b7cf-8c3d-459b-a502-f049a7353d9f-tmp\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz"
Apr 24 19:07:10.387479 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387384 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d21b7cf-8c3d-459b-a502-f049a7353d9f-service-ca-bundle\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz"
Apr 24 19:07:10.387479 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387413 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d21b7cf-8c3d-459b-a502-f049a7353d9f-serving-cert\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz"
Apr 24 19:07:10.387479 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387439 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7503386e-4776-4896-bd44-4ef455ac6b98-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p"
Apr 24 19:07:10.387479 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387466 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmstd\" (UniqueName: \"kubernetes.io/projected/7503386e-4776-4896-bd44-4ef455ac6b98-kube-api-access-hmstd\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p"
Apr 24 19:07:10.387633 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387557 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-ca-trust-extracted\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:07:10.387633 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387589 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-image-registry-private-configuration\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:07:10.387734 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387634 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:07:10.387734 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387670 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-trusted-ca\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:07:10.387734 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387702 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d21b7cf-8c3d-459b-a502-f049a7353d9f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz"
Apr 24 19:07:10.387734 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387729 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxvrd\" (UniqueName: \"kubernetes.io/projected/5d21b7cf-8c3d-459b-a502-f049a7353d9f-kube-api-access-cxvrd\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz"
Apr 24 19:07:10.387903 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.387760 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-installation-pull-secrets\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:07:10.414972 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.414938 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qxfsm"]
Apr 24 19:07:10.416956 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.416922 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:10.419529 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.419503 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rh56f\"" Apr 24 19:07:10.419759 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.419738 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 19:07:10.421423 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.421402 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 19:07:10.421542 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.421405 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 19:07:10.421603 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.421550 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 19:07:10.431639 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.431612 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qxfsm"] Apr 24 19:07:10.455025 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.454991 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:07:10.455225 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.455037 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:07:10.455225 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.454991 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:07:10.457750 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.457721 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 19:07:10.457972 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.457946 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tjm4v\"" Apr 24 19:07:10.458282 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.458262 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 19:07:10.458379 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.458330 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h4gdd\"" Apr 24 19:07:10.488570 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.488534 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-ca-trust-extracted\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.488742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.488583 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3d37867-8a80-4198-9320-281682c54121-trusted-ca\") pod \"console-operator-9d4b6777b-zsl4c\" (UID: \"f3d37867-8a80-4198-9320-281682c54121\") " pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.488742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.488640 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rf5k8\" (UniqueName: \"kubernetes.io/projected/f3d37867-8a80-4198-9320-281682c54121-kube-api-access-rf5k8\") pod \"console-operator-9d4b6777b-zsl4c\" (UID: \"f3d37867-8a80-4198-9320-281682c54121\") " pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.488742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.488686 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3d37867-8a80-4198-9320-281682c54121-serving-cert\") pod \"console-operator-9d4b6777b-zsl4c\" (UID: \"f3d37867-8a80-4198-9320-281682c54121\") " pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.488742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.488720 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-installation-pull-secrets\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.488910 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.488745 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2rfkw\" (UID: \"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" Apr 24 19:07:10.488910 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.488795 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a18605e-85a6-4562-acf8-4bef99990528-config\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-2njp6\" (UID: \"3a18605e-85a6-4562-acf8-4bef99990528\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" Apr 24 19:07:10.488910 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.488852 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.488910 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.488902 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpkz2\" (UniqueName: \"kubernetes.io/projected/10fdb32e-ef53-491d-922c-4d9e4f2531f0-kube-api-access-xpkz2\") pod \"volume-data-source-validator-7c6cbb6c87-smbpl\" (UID: \"10fdb32e-ef53-491d-922c-4d9e4f2531f0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-smbpl" Apr 24 19:07:10.489149 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.488936 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-certificates\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.489149 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.488962 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmkhj\" (UniqueName: \"kubernetes.io/projected/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-kube-api-access-jmkhj\") pod \"cluster-samples-operator-6dc5bdb6b4-2rfkw\" (UID: \"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" Apr 24 19:07:10.489149 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.488997 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-ca-trust-extracted\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.489149 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489045 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert\") pod \"ingress-canary-sfgb9\" (UID: \"94eee2e4-7d5d-49be-ab39-13cd92cf877f\") " pod="openshift-ingress-canary/ingress-canary-sfgb9" Apr 24 19:07:10.489149 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489095 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d37867-8a80-4198-9320-281682c54121-config\") pod \"console-operator-9d4b6777b-zsl4c\" (UID: \"f3d37867-8a80-4198-9320-281682c54121\") " pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.489149 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489154 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d21b7cf-8c3d-459b-a502-f049a7353d9f-tmp\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.489466 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489183 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5d21b7cf-8c3d-459b-a502-f049a7353d9f-service-ca-bundle\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.489466 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489219 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz84l\" (UniqueName: \"kubernetes.io/projected/134e19f5-38b3-4160-8673-d35beeb0ed89-kube-api-access-cz84l\") pod \"network-check-source-8894fc9bd-6gmp9\" (UID: \"134e19f5-38b3-4160-8673-d35beeb0ed89\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gmp9" Apr 24 19:07:10.489466 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489249 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d21b7cf-8c3d-459b-a502-f049a7353d9f-serving-cert\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.489466 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489283 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feca8f75-d766-4f31-aa59-ba4ae692f026-config\") pod \"service-ca-operator-d6fc45fc5-49p5c\" (UID: \"feca8f75-d766-4f31-aa59-ba4ae692f026\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" Apr 24 19:07:10.489466 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489312 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rnjn\" (UniqueName: \"kubernetes.io/projected/feca8f75-d766-4f31-aa59-ba4ae692f026-kube-api-access-5rnjn\") pod \"service-ca-operator-d6fc45fc5-49p5c\" (UID: \"feca8f75-d766-4f31-aa59-ba4ae692f026\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" Apr 24 19:07:10.489466 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489341 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.489466 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489367 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-trusted-ca\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.489466 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489415 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a18605e-85a6-4562-acf8-4bef99990528-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2njp6\" (UID: \"3a18605e-85a6-4562-acf8-4bef99990528\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" Apr 24 19:07:10.489466 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489440 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/feca8f75-d766-4f31-aa59-ba4ae692f026-serving-cert\") pod \"service-ca-operator-d6fc45fc5-49p5c\" (UID: \"feca8f75-d766-4f31-aa59-ba4ae692f026\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489465 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-certificates\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d21b7cf-8c3d-459b-a502-f049a7353d9f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489527 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxvrd\" (UniqueName: \"kubernetes.io/projected/5d21b7cf-8c3d-459b-a502-f049a7353d9f-kube-api-access-cxvrd\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489561 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-stats-auth\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489609 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-default-certificate\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: 
\"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489663 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-672rj\" (UniqueName: \"kubernetes.io/projected/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-kube-api-access-672rj\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.489669 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.489684 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5db89954c9-qmt95: secret "image-registry-tls" not found Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489706 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489767 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d21b7cf-8c3d-459b-a502-f049a7353d9f-service-ca-bundle\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.489782 2568 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls podName:cb5283ff-9431-48cb-8ecc-ff6cc3c65c54 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:10.989763944 +0000 UTC m=+34.101600282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls") pod "image-registry-5db89954c9-qmt95" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54") : secret "image-registry-tls" not found Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489827 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bckbd\" (UniqueName: \"kubernetes.io/projected/3a18605e-85a6-4562-acf8-4bef99990528-kube-api-access-bckbd\") pod \"kube-storage-version-migrator-operator-6769c5d45-2njp6\" (UID: \"3a18605e-85a6-4562-acf8-4bef99990528\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489858 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/5d21b7cf-8c3d-459b-a502-f049a7353d9f-snapshots\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.489887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489888 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" Apr 24 19:07:10.490515 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489917 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gj5m8\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-kube-api-access-gj5m8\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.490515 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489945 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-bound-sa-token\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.490515 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.489973 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmstd\" (UniqueName: \"kubernetes.io/projected/7503386e-4776-4896-bd44-4ef455ac6b98-kube-api-access-hmstd\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" Apr 24 19:07:10.490515 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.490002 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7503386e-4776-4896-bd44-4ef455ac6b98-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" Apr 24 19:07:10.490515 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.490033 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-image-registry-private-configuration\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.490515 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.490061 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw5xt\" (UniqueName: \"kubernetes.io/projected/94eee2e4-7d5d-49be-ab39-13cd92cf877f-kube-api-access-hw5xt\") pod \"ingress-canary-sfgb9\" (UID: \"94eee2e4-7d5d-49be-ab39-13cd92cf877f\") " pod="openshift-ingress-canary/ingress-canary-sfgb9" Apr 24 19:07:10.490515 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.490222 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 19:07:10.490515 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.490264 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls podName:7503386e-4776-4896-bd44-4ef455ac6b98 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:10.990250802 +0000 UTC m=+34.102087130 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9lv7p" (UID: "7503386e-4776-4896-bd44-4ef455ac6b98") : secret "cluster-monitoring-operator-tls" not found Apr 24 19:07:10.491075 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.491052 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7503386e-4776-4896-bd44-4ef455ac6b98-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" Apr 24 19:07:10.491276 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.491213 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d21b7cf-8c3d-459b-a502-f049a7353d9f-tmp\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.491542 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.491519 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d21b7cf-8c3d-459b-a502-f049a7353d9f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.491625 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.491607 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/5d21b7cf-8c3d-459b-a502-f049a7353d9f-snapshots\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " 
pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.491768 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.491748 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-trusted-ca\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.494019 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.493993 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d21b7cf-8c3d-459b-a502-f049a7353d9f-serving-cert\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.494134 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.494016 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-installation-pull-secrets\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.494134 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.493999 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-image-registry-private-configuration\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.499536 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.499480 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxvrd\" (UniqueName: 
\"kubernetes.io/projected/5d21b7cf-8c3d-459b-a502-f049a7353d9f-kube-api-access-cxvrd\") pod \"insights-operator-585dfdc468-bzksz\" (UID: \"5d21b7cf-8c3d-459b-a502-f049a7353d9f\") " pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.500242 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.500215 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpkz2\" (UniqueName: \"kubernetes.io/projected/10fdb32e-ef53-491d-922c-4d9e4f2531f0-kube-api-access-xpkz2\") pod \"volume-data-source-validator-7c6cbb6c87-smbpl\" (UID: \"10fdb32e-ef53-491d-922c-4d9e4f2531f0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-smbpl" Apr 24 19:07:10.500343 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.500330 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmstd\" (UniqueName: \"kubernetes.io/projected/7503386e-4776-4896-bd44-4ef455ac6b98-kube-api-access-hmstd\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" Apr 24 19:07:10.500790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.500767 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-bound-sa-token\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.500872 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.500823 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj5m8\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-kube-api-access-gj5m8\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " 
pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.590506 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.590422 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-672rj\" (UniqueName: \"kubernetes.io/projected/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-kube-api-access-672rj\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.590506 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.590462 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.590506 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.590488 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bckbd\" (UniqueName: \"kubernetes.io/projected/3a18605e-85a6-4562-acf8-4bef99990528-kube-api-access-bckbd\") pod \"kube-storage-version-migrator-operator-6769c5d45-2njp6\" (UID: \"3a18605e-85a6-4562-acf8-4bef99990528\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" Apr 24 19:07:10.590784 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.590522 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85baa476-8a9a-44b6-83c0-0050c6c28921-config-volume\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:10.590784 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.590589 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-hw5xt\" (UniqueName: \"kubernetes.io/projected/94eee2e4-7d5d-49be-ab39-13cd92cf877f-kube-api-access-hw5xt\") pod \"ingress-canary-sfgb9\" (UID: \"94eee2e4-7d5d-49be-ab39-13cd92cf877f\") " pod="openshift-ingress-canary/ingress-canary-sfgb9" Apr 24 19:07:10.590784 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.590619 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3d37867-8a80-4198-9320-281682c54121-trusted-ca\") pod \"console-operator-9d4b6777b-zsl4c\" (UID: \"f3d37867-8a80-4198-9320-281682c54121\") " pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.590784 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.590646 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle podName:b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.090618524 +0000 UTC m=+34.202454865 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle") pod "router-default-5dd48cf8b4-xldzq" (UID: "b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e") : configmap references non-existent config key: service-ca.crt Apr 24 19:07:10.590784 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.590774 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rf5k8\" (UniqueName: \"kubernetes.io/projected/f3d37867-8a80-4198-9320-281682c54121-kube-api-access-rf5k8\") pod \"console-operator-9d4b6777b-zsl4c\" (UID: \"f3d37867-8a80-4198-9320-281682c54121\") " pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.591032 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.590810 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3d37867-8a80-4198-9320-281682c54121-serving-cert\") pod \"console-operator-9d4b6777b-zsl4c\" (UID: \"f3d37867-8a80-4198-9320-281682c54121\") " pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.591032 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.590841 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2rfkw\" (UID: \"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" Apr 24 19:07:10.591032 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.590868 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a18605e-85a6-4562-acf8-4bef99990528-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2njp6\" (UID: 
\"3a18605e-85a6-4562-acf8-4bef99990528\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" Apr 24 19:07:10.591032 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.590928 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.591032 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.591003 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 19:07:10.591366 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.591048 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs podName:b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.091032937 +0000 UTC m=+34.202869275 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs") pod "router-default-5dd48cf8b4-xldzq" (UID: "b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e") : secret "router-metrics-certs-default" not found Apr 24 19:07:10.591366 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591049 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vd4p\" (UniqueName: \"kubernetes.io/projected/85baa476-8a9a-44b6-83c0-0050c6c28921-kube-api-access-8vd4p\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:10.591366 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmkhj\" (UniqueName: \"kubernetes.io/projected/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-kube-api-access-jmkhj\") pod \"cluster-samples-operator-6dc5bdb6b4-2rfkw\" (UID: \"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" Apr 24 19:07:10.591366 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591172 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert\") pod \"ingress-canary-sfgb9\" (UID: \"94eee2e4-7d5d-49be-ab39-13cd92cf877f\") " pod="openshift-ingress-canary/ingress-canary-sfgb9" Apr 24 19:07:10.591366 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591209 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d37867-8a80-4198-9320-281682c54121-config\") pod \"console-operator-9d4b6777b-zsl4c\" (UID: \"f3d37867-8a80-4198-9320-281682c54121\") " pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.591366 
ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.591281 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:10.591366 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.591339 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 19:07:10.591366 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.591347 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert podName:94eee2e4-7d5d-49be-ab39-13cd92cf877f nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.091333966 +0000 UTC m=+34.203170299 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert") pod "ingress-canary-sfgb9" (UID: "94eee2e4-7d5d-49be-ab39-13cd92cf877f") : secret "canary-serving-cert" not found Apr 24 19:07:10.591708 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591390 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz84l\" (UniqueName: \"kubernetes.io/projected/134e19f5-38b3-4160-8673-d35beeb0ed89-kube-api-access-cz84l\") pod \"network-check-source-8894fc9bd-6gmp9\" (UID: \"134e19f5-38b3-4160-8673-d35beeb0ed89\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gmp9" Apr 24 19:07:10.591708 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591430 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feca8f75-d766-4f31-aa59-ba4ae692f026-config\") pod \"service-ca-operator-d6fc45fc5-49p5c\" (UID: \"feca8f75-d766-4f31-aa59-ba4ae692f026\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" Apr 24 19:07:10.591708 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.591460 2568 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls podName:8e6a9b64-e53c-4f21-b0b9-61491d1bef6b nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.091445513 +0000 UTC m=+34.203281838 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2rfkw" (UID: "8e6a9b64-e53c-4f21-b0b9-61491d1bef6b") : secret "samples-operator-tls" not found Apr 24 19:07:10.591708 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591680 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rnjn\" (UniqueName: \"kubernetes.io/projected/feca8f75-d766-4f31-aa59-ba4ae692f026-kube-api-access-5rnjn\") pod \"service-ca-operator-d6fc45fc5-49p5c\" (UID: \"feca8f75-d766-4f31-aa59-ba4ae692f026\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" Apr 24 19:07:10.591902 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591731 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a18605e-85a6-4562-acf8-4bef99990528-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2njp6\" (UID: \"3a18605e-85a6-4562-acf8-4bef99990528\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" Apr 24 19:07:10.591902 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591756 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/feca8f75-d766-4f31-aa59-ba4ae692f026-serving-cert\") pod \"service-ca-operator-d6fc45fc5-49p5c\" (UID: \"feca8f75-d766-4f31-aa59-ba4ae692f026\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" Apr 24 
19:07:10.591902 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591786 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-stats-auth\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.591902 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591813 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:10.591902 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591839 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85baa476-8a9a-44b6-83c0-0050c6c28921-tmp-dir\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:10.591902 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.591869 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-default-certificate\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.592202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.592028 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a18605e-85a6-4562-acf8-4bef99990528-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2njp6\" (UID: \"3a18605e-85a6-4562-acf8-4bef99990528\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" Apr 24 19:07:10.592202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.592069 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3d37867-8a80-4198-9320-281682c54121-config\") pod \"console-operator-9d4b6777b-zsl4c\" (UID: \"f3d37867-8a80-4198-9320-281682c54121\") " pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.592202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.592148 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feca8f75-d766-4f31-aa59-ba4ae692f026-config\") pod \"service-ca-operator-d6fc45fc5-49p5c\" (UID: \"feca8f75-d766-4f31-aa59-ba4ae692f026\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" Apr 24 19:07:10.592356 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.592230 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3d37867-8a80-4198-9320-281682c54121-trusted-ca\") pod \"console-operator-9d4b6777b-zsl4c\" (UID: \"f3d37867-8a80-4198-9320-281682c54121\") " pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.594903 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.594696 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/feca8f75-d766-4f31-aa59-ba4ae692f026-serving-cert\") pod \"service-ca-operator-d6fc45fc5-49p5c\" (UID: \"feca8f75-d766-4f31-aa59-ba4ae692f026\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" Apr 24 19:07:10.595028 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.594945 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f3d37867-8a80-4198-9320-281682c54121-serving-cert\") pod \"console-operator-9d4b6777b-zsl4c\" (UID: \"f3d37867-8a80-4198-9320-281682c54121\") " pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.595092 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.595077 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a18605e-85a6-4562-acf8-4bef99990528-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2njp6\" (UID: \"3a18605e-85a6-4562-acf8-4bef99990528\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" Apr 24 19:07:10.595171 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.595131 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-default-certificate\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.595171 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.595127 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-stats-auth\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.600331 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.600308 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-672rj\" (UniqueName: \"kubernetes.io/projected/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-kube-api-access-672rj\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:10.604035 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.603983 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmkhj\" (UniqueName: \"kubernetes.io/projected/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-kube-api-access-jmkhj\") pod \"cluster-samples-operator-6dc5bdb6b4-2rfkw\" (UID: \"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" Apr 24 19:07:10.604838 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.604809 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rnjn\" (UniqueName: \"kubernetes.io/projected/feca8f75-d766-4f31-aa59-ba4ae692f026-kube-api-access-5rnjn\") pod \"service-ca-operator-d6fc45fc5-49p5c\" (UID: \"feca8f75-d766-4f31-aa59-ba4ae692f026\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" Apr 24 19:07:10.606227 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.606182 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz84l\" (UniqueName: \"kubernetes.io/projected/134e19f5-38b3-4160-8673-d35beeb0ed89-kube-api-access-cz84l\") pod \"network-check-source-8894fc9bd-6gmp9\" (UID: \"134e19f5-38b3-4160-8673-d35beeb0ed89\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gmp9" Apr 24 19:07:10.606515 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.606491 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bckbd\" (UniqueName: \"kubernetes.io/projected/3a18605e-85a6-4562-acf8-4bef99990528-kube-api-access-bckbd\") pod \"kube-storage-version-migrator-operator-6769c5d45-2njp6\" (UID: \"3a18605e-85a6-4562-acf8-4bef99990528\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" Apr 24 19:07:10.606935 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.606915 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rf5k8\" (UniqueName: \"kubernetes.io/projected/f3d37867-8a80-4198-9320-281682c54121-kube-api-access-rf5k8\") pod \"console-operator-9d4b6777b-zsl4c\" (UID: \"f3d37867-8a80-4198-9320-281682c54121\") " pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.607596 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.607579 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw5xt\" (UniqueName: \"kubernetes.io/projected/94eee2e4-7d5d-49be-ab39-13cd92cf877f-kube-api-access-hw5xt\") pod \"ingress-canary-sfgb9\" (UID: \"94eee2e4-7d5d-49be-ab39-13cd92cf877f\") " pod="openshift-ingress-canary/ingress-canary-sfgb9" Apr 24 19:07:10.618055 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.618030 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-bzksz" Apr 24 19:07:10.625886 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.625863 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-smbpl" Apr 24 19:07:10.649936 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.649901 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" Apr 24 19:07:10.657807 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.657781 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:10.677652 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.677621 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gmp9" Apr 24 19:07:10.686385 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.686362 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" Apr 24 19:07:10.693314 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.693287 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vd4p\" (UniqueName: \"kubernetes.io/projected/85baa476-8a9a-44b6-83c0-0050c6c28921-kube-api-access-8vd4p\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:10.693472 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.693454 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:10.693534 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.693481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85baa476-8a9a-44b6-83c0-0050c6c28921-tmp-dir\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:10.693589 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.693531 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85baa476-8a9a-44b6-83c0-0050c6c28921-config-volume\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:10.693646 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.693610 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:10.693700 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.693692 2568 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls podName:85baa476-8a9a-44b6-83c0-0050c6c28921 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.193670277 +0000 UTC m=+34.305506608 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls") pod "dns-default-qxfsm" (UID: "85baa476-8a9a-44b6-83c0-0050c6c28921") : secret "dns-default-metrics-tls" not found Apr 24 19:07:10.693874 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.693853 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85baa476-8a9a-44b6-83c0-0050c6c28921-tmp-dir\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:10.695015 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.694997 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85baa476-8a9a-44b6-83c0-0050c6c28921-config-volume\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:10.714833 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.714808 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vd4p\" (UniqueName: \"kubernetes.io/projected/85baa476-8a9a-44b6-83c0-0050c6c28921-kube-api-access-8vd4p\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:10.802749 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.802716 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-r676m"] Apr 24 19:07:10.806562 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.806539 2568 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-dns/node-resolver-r676m" Apr 24 19:07:10.809278 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.809257 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-fpmcl\"" Apr 24 19:07:10.895432 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.895396 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c4b8f1da-e016-4496-a12c-19572f7ba9ad-hosts-file\") pod \"node-resolver-r676m\" (UID: \"c4b8f1da-e016-4496-a12c-19572f7ba9ad\") " pod="openshift-dns/node-resolver-r676m" Apr 24 19:07:10.895635 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.895463 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c4b8f1da-e016-4496-a12c-19572f7ba9ad-tmp-dir\") pod \"node-resolver-r676m\" (UID: \"c4b8f1da-e016-4496-a12c-19572f7ba9ad\") " pod="openshift-dns/node-resolver-r676m" Apr 24 19:07:10.895689 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.895657 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr4qv\" (UniqueName: \"kubernetes.io/projected/c4b8f1da-e016-4496-a12c-19572f7ba9ad-kube-api-access-qr4qv\") pod \"node-resolver-r676m\" (UID: \"c4b8f1da-e016-4496-a12c-19572f7ba9ad\") " pod="openshift-dns/node-resolver-r676m" Apr 24 19:07:10.996649 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.996611 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qr4qv\" (UniqueName: \"kubernetes.io/projected/c4b8f1da-e016-4496-a12c-19572f7ba9ad-kube-api-access-qr4qv\") pod \"node-resolver-r676m\" (UID: \"c4b8f1da-e016-4496-a12c-19572f7ba9ad\") " pod="openshift-dns/node-resolver-r676m" Apr 24 19:07:10.996854 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:07:10.996665 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:10.996854 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.996711 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c4b8f1da-e016-4496-a12c-19572f7ba9ad-hosts-file\") pod \"node-resolver-r676m\" (UID: \"c4b8f1da-e016-4496-a12c-19572f7ba9ad\") " pod="openshift-dns/node-resolver-r676m" Apr 24 19:07:10.996854 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.996744 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c4b8f1da-e016-4496-a12c-19572f7ba9ad-tmp-dir\") pod \"node-resolver-r676m\" (UID: \"c4b8f1da-e016-4496-a12c-19572f7ba9ad\") " pod="openshift-dns/node-resolver-r676m" Apr 24 19:07:10.996854 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.996777 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" Apr 24 19:07:10.997127 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.996856 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:07:10.997127 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.996874 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-5db89954c9-qmt95: secret "image-registry-tls" not found Apr 24 19:07:10.997127 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.996897 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 19:07:10.997127 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.996921 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c4b8f1da-e016-4496-a12c-19572f7ba9ad-hosts-file\") pod \"node-resolver-r676m\" (UID: \"c4b8f1da-e016-4496-a12c-19572f7ba9ad\") " pod="openshift-dns/node-resolver-r676m" Apr 24 19:07:10.997127 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.996937 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls podName:cb5283ff-9431-48cb-8ecc-ff6cc3c65c54 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.996917727 +0000 UTC m=+35.108754066 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls") pod "image-registry-5db89954c9-qmt95" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54") : secret "image-registry-tls" not found Apr 24 19:07:10.997127 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:10.996989 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls podName:7503386e-4776-4896-bd44-4ef455ac6b98 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.996970146 +0000 UTC m=+35.108806480 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9lv7p" (UID: "7503386e-4776-4896-bd44-4ef455ac6b98") : secret "cluster-monitoring-operator-tls" not found Apr 24 19:07:10.997392 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:10.997157 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c4b8f1da-e016-4496-a12c-19572f7ba9ad-tmp-dir\") pod \"node-resolver-r676m\" (UID: \"c4b8f1da-e016-4496-a12c-19572f7ba9ad\") " pod="openshift-dns/node-resolver-r676m" Apr 24 19:07:11.008489 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.008454 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr4qv\" (UniqueName: \"kubernetes.io/projected/c4b8f1da-e016-4496-a12c-19572f7ba9ad-kube-api-access-qr4qv\") pod \"node-resolver-r676m\" (UID: \"c4b8f1da-e016-4496-a12c-19572f7ba9ad\") " pod="openshift-dns/node-resolver-r676m" Apr 24 19:07:11.097823 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.097770 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2rfkw\" (UID: \"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" Apr 24 19:07:11.098013 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.097842 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 
19:07:11.098013 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.097886 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert\") pod \"ingress-canary-sfgb9\" (UID: \"94eee2e4-7d5d-49be-ab39-13cd92cf877f\") " pod="openshift-ingress-canary/ingress-canary-sfgb9" Apr 24 19:07:11.098013 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:11.097937 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 19:07:11.098013 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.097970 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:11.098013 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:11.098015 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls podName:8e6a9b64-e53c-4f21-b0b9-61491d1bef6b nodeName:}" failed. No retries permitted until 2026-04-24 19:07:12.097991291 +0000 UTC m=+35.209827627 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2rfkw" (UID: "8e6a9b64-e53c-4f21-b0b9-61491d1bef6b") : secret "samples-operator-tls" not found Apr 24 19:07:11.098303 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:11.098017 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 19:07:11.098303 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:11.098077 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs podName:b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:12.098059993 +0000 UTC m=+35.209896332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs") pod "router-default-5dd48cf8b4-xldzq" (UID: "b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e") : secret "router-metrics-certs-default" not found Apr 24 19:07:11.098303 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:11.098096 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle podName:b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:12.098087149 +0000 UTC m=+35.209923479 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle") pod "router-default-5dd48cf8b4-xldzq" (UID: "b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e") : configmap references non-existent config key: service-ca.crt Apr 24 19:07:11.098303 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:11.098093 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:11.098303 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:11.098152 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert podName:94eee2e4-7d5d-49be-ab39-13cd92cf877f nodeName:}" failed. No retries permitted until 2026-04-24 19:07:12.098141878 +0000 UTC m=+35.209978203 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert") pod "ingress-canary-sfgb9" (UID: "94eee2e4-7d5d-49be-ab39-13cd92cf877f") : secret "canary-serving-cert" not found Apr 24 19:07:11.117052 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.117008 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-r676m" Apr 24 19:07:11.199088 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.198985 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:07:11.199088 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.199034 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:07:11.199342 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.199142 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:11.199342 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:11.199202 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 19:07:11.199342 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:11.199239 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:11.199342 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:11.199267 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs podName:76172245-47dc-4f2f-90c9-d345a816e233 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:07:43.199248492 +0000 UTC m=+66.311084831 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs") pod "network-metrics-daemon-nghhh" (UID: "76172245-47dc-4f2f-90c9-d345a816e233") : secret "metrics-daemon-secret" not found Apr 24 19:07:11.199342 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:11.199286 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls podName:85baa476-8a9a-44b6-83c0-0050c6c28921 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:12.199275673 +0000 UTC m=+35.311112000 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls") pod "dns-default-qxfsm" (UID: "85baa476-8a9a-44b6-83c0-0050c6c28921") : secret "dns-default-metrics-tls" not found Apr 24 19:07:11.202221 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.202199 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e583cac8-fcbc-4fa3-a2f4-d8b1fad99146-original-pull-secret\") pod \"global-pull-secret-syncer-dns5q\" (UID: \"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146\") " pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:07:11.299960 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.299914 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dhx\" (UniqueName: \"kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx\") pod \"network-check-target-nlzd4\" (UID: \"22f62d88-7d18-4fc4-a8b1-44efd0814325\") " pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:07:11.302726 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.302698 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x2dhx\" (UniqueName: \"kubernetes.io/projected/22f62d88-7d18-4fc4-a8b1-44efd0814325-kube-api-access-x2dhx\") pod \"network-check-target-nlzd4\" (UID: \"22f62d88-7d18-4fc4-a8b1-44efd0814325\") " pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:07:11.366352 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.366309 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dns5q" Apr 24 19:07:11.380189 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.380161 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:07:11.618337 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.618021 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r676m" event={"ID":"c4b8f1da-e016-4496-a12c-19572f7ba9ad","Type":"ContainerStarted","Data":"c62a9954706f7609551abd7932439378abff7061710e08d86ed7049ceb0c6a44"} Apr 24 19:07:11.697503 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.696726 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6gmp9"] Apr 24 19:07:11.714922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.714895 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-smbpl"] Apr 24 19:07:11.720522 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.720497 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6"] Apr 24 19:07:11.724144 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:11.724120 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10fdb32e_ef53_491d_922c_4d9e4f2531f0.slice/crio-f3bee2b52434f4121632b3573a47c89d2444244a9caa63f46e73802569e44dab WatchSource:0}: Error finding container f3bee2b52434f4121632b3573a47c89d2444244a9caa63f46e73802569e44dab: Status 404 returned error can't find the container with id f3bee2b52434f4121632b3573a47c89d2444244a9caa63f46e73802569e44dab Apr 24 19:07:11.724817 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:11.724776 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a18605e_85a6_4562_acf8_4bef99990528.slice/crio-58a1b34127bb638c3e749231ee347e120425b4738029bae257fef50a52324f77 WatchSource:0}: Error finding container 58a1b34127bb638c3e749231ee347e120425b4738029bae257fef50a52324f77: Status 404 returned error can't find the container with id 58a1b34127bb638c3e749231ee347e120425b4738029bae257fef50a52324f77 Apr 24 19:07:11.730050 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.729901 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nlzd4"] Apr 24 19:07:11.731352 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.731303 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-bzksz"] Apr 24 19:07:11.735381 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.735285 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c"] Apr 24 19:07:11.739910 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.738680 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-zsl4c"] Apr 24 19:07:11.739910 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:11.738955 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22f62d88_7d18_4fc4_a8b1_44efd0814325.slice/crio-a7b600d3846894a9bcc65f2ba57994638109d61d0895eaab53f49eb508abb55e WatchSource:0}: Error finding container a7b600d3846894a9bcc65f2ba57994638109d61d0895eaab53f49eb508abb55e: Status 404 returned error can't find the container with id a7b600d3846894a9bcc65f2ba57994638109d61d0895eaab53f49eb508abb55e Apr 24 19:07:11.744716 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:11.744548 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3d37867_8a80_4198_9320_281682c54121.slice/crio-a98a039ef40ca9ada9bc8ac3fb70bbedd023e0d58be26fb20ec87f092c118de2 WatchSource:0}: Error finding container a98a039ef40ca9ada9bc8ac3fb70bbedd023e0d58be26fb20ec87f092c118de2: Status 404 returned error can't find the container with id a98a039ef40ca9ada9bc8ac3fb70bbedd023e0d58be26fb20ec87f092c118de2 Apr 24 19:07:11.746133 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:11.746018 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dns5q"] Apr 24 19:07:11.749975 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:11.749931 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode583cac8_fcbc_4fa3_a2f4_d8b1fad99146.slice/crio-89099ee5e26d1abb4d7df726ec19654c031253a0fa150987942c49862ed43a35 WatchSource:0}: Error finding container 89099ee5e26d1abb4d7df726ec19654c031253a0fa150987942c49862ed43a35: Status 404 returned error can't find the container with id 89099ee5e26d1abb4d7df726ec19654c031253a0fa150987942c49862ed43a35 Apr 24 19:07:12.008351 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.008166 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" Apr 24 19:07:12.008506 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.008311 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 19:07:12.008506 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.008435 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:12.008506 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.008490 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls podName:7503386e-4776-4896-bd44-4ef455ac6b98 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:14.008468979 +0000 UTC m=+37.120305307 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9lv7p" (UID: "7503386e-4776-4896-bd44-4ef455ac6b98") : secret "cluster-monitoring-operator-tls" not found Apr 24 19:07:12.008625 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.008535 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:07:12.008625 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.008546 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5db89954c9-qmt95: secret "image-registry-tls" not found Apr 24 19:07:12.008625 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.008588 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls podName:cb5283ff-9431-48cb-8ecc-ff6cc3c65c54 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:14.008576324 +0000 UTC m=+37.120412648 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls") pod "image-registry-5db89954c9-qmt95" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54") : secret "image-registry-tls" not found Apr 24 19:07:12.109129 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.109073 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:12.109372 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.109180 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2rfkw\" (UID: \"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" Apr 24 19:07:12.109372 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.109213 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:12.109372 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.109237 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle podName:b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:14.10921884 +0000 UTC m=+37.221055169 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle") pod "router-default-5dd48cf8b4-xldzq" (UID: "b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e") : configmap references non-existent config key: service-ca.crt Apr 24 19:07:12.109372 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.109300 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 19:07:12.109372 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.109316 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 19:07:12.109372 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.109350 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:12.109372 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.109301 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert\") pod \"ingress-canary-sfgb9\" (UID: \"94eee2e4-7d5d-49be-ab39-13cd92cf877f\") " pod="openshift-ingress-canary/ingress-canary-sfgb9" Apr 24 19:07:12.109372 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.109352 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs podName:b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:14.109336829 +0000 UTC m=+37.221173168 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs") pod "router-default-5dd48cf8b4-xldzq" (UID: "b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e") : secret "router-metrics-certs-default" not found Apr 24 19:07:12.109652 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.109395 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls podName:8e6a9b64-e53c-4f21-b0b9-61491d1bef6b nodeName:}" failed. No retries permitted until 2026-04-24 19:07:14.109384126 +0000 UTC m=+37.221220454 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2rfkw" (UID: "8e6a9b64-e53c-4f21-b0b9-61491d1bef6b") : secret "samples-operator-tls" not found Apr 24 19:07:12.109652 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.109408 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert podName:94eee2e4-7d5d-49be-ab39-13cd92cf877f nodeName:}" failed. No retries permitted until 2026-04-24 19:07:14.109401832 +0000 UTC m=+37.221238155 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert") pod "ingress-canary-sfgb9" (UID: "94eee2e4-7d5d-49be-ab39-13cd92cf877f") : secret "canary-serving-cert" not found Apr 24 19:07:12.210547 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.210509 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:12.210771 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.210622 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:12.210771 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:12.210687 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls podName:85baa476-8a9a-44b6-83c0-0050c6c28921 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:14.21066575 +0000 UTC m=+37.322502087 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls") pod "dns-default-qxfsm" (UID: "85baa476-8a9a-44b6-83c0-0050c6c28921") : secret "dns-default-metrics-tls" not found Apr 24 19:07:12.624030 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.623957 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nlzd4" event={"ID":"22f62d88-7d18-4fc4-a8b1-44efd0814325","Type":"ContainerStarted","Data":"a7b600d3846894a9bcc65f2ba57994638109d61d0895eaab53f49eb508abb55e"} Apr 24 19:07:12.626638 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.626539 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r676m" event={"ID":"c4b8f1da-e016-4496-a12c-19572f7ba9ad","Type":"ContainerStarted","Data":"c5f901b581d4f4e32823fb72d75d6b44395f5c8c2a04fcc1f7fc7f3302f5cc2c"} Apr 24 19:07:12.628198 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.628148 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gmp9" event={"ID":"134e19f5-38b3-4160-8673-d35beeb0ed89","Type":"ContainerStarted","Data":"d6c02c7852fa3a813a95cb2f0c40f27376e7c09c17fa505c32a80a22798d386a"} Apr 24 19:07:12.630389 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.630321 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dns5q" event={"ID":"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146","Type":"ContainerStarted","Data":"89099ee5e26d1abb4d7df726ec19654c031253a0fa150987942c49862ed43a35"} Apr 24 19:07:12.633883 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.633821 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" event={"ID":"feca8f75-d766-4f31-aa59-ba4ae692f026","Type":"ContainerStarted","Data":"69f253148f1cbe58b0470199372ba6e7538957cb47b5eacef01c9596b560e5b6"} Apr 
24 19:07:12.635343 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.635312 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" event={"ID":"f3d37867-8a80-4198-9320-281682c54121","Type":"ContainerStarted","Data":"a98a039ef40ca9ada9bc8ac3fb70bbedd023e0d58be26fb20ec87f092c118de2"} Apr 24 19:07:12.641207 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.640473 2568 generic.go:358] "Generic (PLEG): container finished" podID="bc00a2b3-d877-4828-bc79-de040ea70887" containerID="00f705edcfa0b8068b5d11aa881d844c8a3130f09199f498cdff99f95111759f" exitCode=0 Apr 24 19:07:12.641207 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.640542 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gbmct" event={"ID":"bc00a2b3-d877-4828-bc79-de040ea70887","Type":"ContainerDied","Data":"00f705edcfa0b8068b5d11aa881d844c8a3130f09199f498cdff99f95111759f"} Apr 24 19:07:12.647046 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.645979 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-bzksz" event={"ID":"5d21b7cf-8c3d-459b-a502-f049a7353d9f","Type":"ContainerStarted","Data":"176c86cd58f8c72a37bd31ea467337cf50295fc223b41f9385d497f9233d1c1e"} Apr 24 19:07:12.647046 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.646642 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-r676m" podStartSLOduration=2.6466254019999997 podStartE2EDuration="2.646625402s" podCreationTimestamp="2026-04-24 19:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:07:12.645530682 +0000 UTC m=+35.757367032" watchObservedRunningTime="2026-04-24 19:07:12.646625402 +0000 UTC m=+35.758461750" Apr 24 19:07:12.649178 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.649090 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" event={"ID":"3a18605e-85a6-4562-acf8-4bef99990528","Type":"ContainerStarted","Data":"58a1b34127bb638c3e749231ee347e120425b4738029bae257fef50a52324f77"} Apr 24 19:07:12.653350 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:12.653299 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-smbpl" event={"ID":"10fdb32e-ef53-491d-922c-4d9e4f2531f0","Type":"ContainerStarted","Data":"f3bee2b52434f4121632b3573a47c89d2444244a9caa63f46e73802569e44dab"} Apr 24 19:07:13.667255 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:13.666344 2568 generic.go:358] "Generic (PLEG): container finished" podID="bc00a2b3-d877-4828-bc79-de040ea70887" containerID="8fa5ef6c07210c9a8e8dcb50a0bed11ea0adc15a799955685c43b1ce9e29d882" exitCode=0 Apr 24 19:07:13.667255 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:13.667156 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gbmct" event={"ID":"bc00a2b3-d877-4828-bc79-de040ea70887","Type":"ContainerDied","Data":"8fa5ef6c07210c9a8e8dcb50a0bed11ea0adc15a799955685c43b1ce9e29d882"} Apr 24 19:07:14.030704 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:14.030669 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:14.030815 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:14.030764 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" Apr 24 19:07:14.030908 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.030893 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 19:07:14.031009 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.030965 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls podName:7503386e-4776-4896-bd44-4ef455ac6b98 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:18.030944749 +0000 UTC m=+41.142781079 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9lv7p" (UID: "7503386e-4776-4896-bd44-4ef455ac6b98") : secret "cluster-monitoring-operator-tls" not found Apr 24 19:07:14.031571 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.031419 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:07:14.031571 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.031440 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5db89954c9-qmt95: secret "image-registry-tls" not found Apr 24 19:07:14.031571 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.031485 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls podName:cb5283ff-9431-48cb-8ecc-ff6cc3c65c54 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:07:18.031469818 +0000 UTC m=+41.143306143 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls") pod "image-registry-5db89954c9-qmt95" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54") : secret "image-registry-tls" not found
Apr 24 19:07:14.132156 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:14.132036 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq"
Apr 24 19:07:14.132925 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.132508 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle podName:b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:18.132486128 +0000 UTC m=+41.244322467 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle") pod "router-default-5dd48cf8b4-xldzq" (UID: "b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e") : configmap references non-existent config key: service-ca.crt
Apr 24 19:07:14.133942 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:14.133158 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2rfkw\" (UID: \"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw"
Apr 24 19:07:14.133942 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:14.133290 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq"
Apr 24 19:07:14.133942 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:14.133339 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert\") pod \"ingress-canary-sfgb9\" (UID: \"94eee2e4-7d5d-49be-ab39-13cd92cf877f\") " pod="openshift-ingress-canary/ingress-canary-sfgb9"
Apr 24 19:07:14.133942 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.133640 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:14.133942 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.133737 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert podName:94eee2e4-7d5d-49be-ab39-13cd92cf877f nodeName:}" failed. No retries permitted until 2026-04-24 19:07:18.133720748 +0000 UTC m=+41.245557079 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert") pod "ingress-canary-sfgb9" (UID: "94eee2e4-7d5d-49be-ab39-13cd92cf877f") : secret "canary-serving-cert" not found
Apr 24 19:07:14.133942 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.133803 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 19:07:14.133942 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.133840 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls podName:8e6a9b64-e53c-4f21-b0b9-61491d1bef6b nodeName:}" failed. No retries permitted until 2026-04-24 19:07:18.133827038 +0000 UTC m=+41.245663367 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2rfkw" (UID: "8e6a9b64-e53c-4f21-b0b9-61491d1bef6b") : secret "samples-operator-tls" not found
Apr 24 19:07:14.133942 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.133890 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 19:07:14.133942 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.133920 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs podName:b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:18.133910504 +0000 UTC m=+41.245746835 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs") pod "router-default-5dd48cf8b4-xldzq" (UID: "b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e") : secret "router-metrics-certs-default" not found
Apr 24 19:07:14.235164 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:14.234978 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm"
Apr 24 19:07:14.235389 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.235190 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:14.235389 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:14.235264 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls podName:85baa476-8a9a-44b6-83c0-0050c6c28921 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:18.235241773 +0000 UTC m=+41.347078101 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls") pod "dns-default-qxfsm" (UID: "85baa476-8a9a-44b6-83c0-0050c6c28921") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:18.070593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:18.070552 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:07:18.071080 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:18.070625 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p"
Apr 24 19:07:18.071080 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.070711 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 19:07:18.071080 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.070738 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5db89954c9-qmt95: secret "image-registry-tls" not found
Apr 24 19:07:18.071080 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.070801 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls podName:cb5283ff-9431-48cb-8ecc-ff6cc3c65c54 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:26.07078472 +0000 UTC m=+49.182621044 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls") pod "image-registry-5db89954c9-qmt95" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54") : secret "image-registry-tls" not found
Apr 24 19:07:18.071080 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.070720 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 19:07:18.071080 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.070850 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls podName:7503386e-4776-4896-bd44-4ef455ac6b98 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:26.070838421 +0000 UTC m=+49.182674745 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9lv7p" (UID: "7503386e-4776-4896-bd44-4ef455ac6b98") : secret "cluster-monitoring-operator-tls" not found
Apr 24 19:07:18.171873 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:18.171835 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq"
Apr 24 19:07:18.172086 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:18.171903 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2rfkw\" (UID: \"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw"
Apr 24 19:07:18.172086 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:18.171931 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq"
Apr 24 19:07:18.172086 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:18.171957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert\") pod \"ingress-canary-sfgb9\" (UID: \"94eee2e4-7d5d-49be-ab39-13cd92cf877f\") " pod="openshift-ingress-canary/ingress-canary-sfgb9"
Apr 24 19:07:18.172086 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.172028 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle podName:b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:26.172003191 +0000 UTC m=+49.283839534 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle") pod "router-default-5dd48cf8b4-xldzq" (UID: "b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e") : configmap references non-existent config key: service-ca.crt
Apr 24 19:07:18.172086 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.172046 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 19:07:18.172086 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.172060 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:18.172345 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.172116 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert podName:94eee2e4-7d5d-49be-ab39-13cd92cf877f nodeName:}" failed. No retries permitted until 2026-04-24 19:07:26.172090311 +0000 UTC m=+49.283926637 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert") pod "ingress-canary-sfgb9" (UID: "94eee2e4-7d5d-49be-ab39-13cd92cf877f") : secret "canary-serving-cert" not found
Apr 24 19:07:18.172345 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.172060 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 19:07:18.172345 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.172129 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls podName:8e6a9b64-e53c-4f21-b0b9-61491d1bef6b nodeName:}" failed. No retries permitted until 2026-04-24 19:07:26.172123017 +0000 UTC m=+49.283959340 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2rfkw" (UID: "8e6a9b64-e53c-4f21-b0b9-61491d1bef6b") : secret "samples-operator-tls" not found
Apr 24 19:07:18.172345 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.172157 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs podName:b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:26.172145044 +0000 UTC m=+49.283981382 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs") pod "router-default-5dd48cf8b4-xldzq" (UID: "b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e") : secret "router-metrics-certs-default" not found
Apr 24 19:07:18.273457 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:18.273410 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm"
Apr 24 19:07:18.273686 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.273542 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:18.273686 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:18.273628 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls podName:85baa476-8a9a-44b6-83c0-0050c6c28921 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:26.273606804 +0000 UTC m=+49.385443134 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls") pod "dns-default-qxfsm" (UID: "85baa476-8a9a-44b6-83c0-0050c6c28921") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:21.687218 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.687163 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nlzd4" event={"ID":"22f62d88-7d18-4fc4-a8b1-44efd0814325","Type":"ContainerStarted","Data":"44eef673b13a8f09d0cc2d9900ea1df63adc76678ef8f287fdd275ab13446f27"}
Apr 24 19:07:21.687729 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.687262 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-nlzd4"
Apr 24 19:07:21.689295 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.689083 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gmp9" event={"ID":"134e19f5-38b3-4160-8673-d35beeb0ed89","Type":"ContainerStarted","Data":"1bd7f60055963a73e7431e42509844458a9e6f465bb4dec38d0a7077eb87f51c"}
Apr 24 19:07:21.690744 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.690712 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dns5q" event={"ID":"e583cac8-fcbc-4fa3-a2f4-d8b1fad99146","Type":"ContainerStarted","Data":"2c124e3fdb9b28420f6a637cb7ebfea12f73974c0ff87e6b984813a9df6c4c16"}
Apr 24 19:07:21.692134 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.692094 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" event={"ID":"feca8f75-d766-4f31-aa59-ba4ae692f026","Type":"ContainerStarted","Data":"c9e277f24a851a0e87bfd471c190f4586104c224cdf9f06a7bbd33a84823fbca"}
Apr 24 19:07:21.693554 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.693534 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/0.log"
Apr 24 19:07:21.693666 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.693575 2568 generic.go:358] "Generic (PLEG): container finished" podID="f3d37867-8a80-4198-9320-281682c54121" containerID="c7c2bfcf0be78b9665a2e5b1c7fd5127ce9141c248864f1139acc1068ba0c7d6" exitCode=255
Apr 24 19:07:21.693666 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.693645 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" event={"ID":"f3d37867-8a80-4198-9320-281682c54121","Type":"ContainerDied","Data":"c7c2bfcf0be78b9665a2e5b1c7fd5127ce9141c248864f1139acc1068ba0c7d6"}
Apr 24 19:07:21.693826 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.693810 2568 scope.go:117] "RemoveContainer" containerID="c7c2bfcf0be78b9665a2e5b1c7fd5127ce9141c248864f1139acc1068ba0c7d6"
Apr 24 19:07:21.697041 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.697018 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gbmct" event={"ID":"bc00a2b3-d877-4828-bc79-de040ea70887","Type":"ContainerStarted","Data":"f2cdf417c242127d581774f33d54d2a390894e38b3edf01fa113df80896fd957"}
Apr 24 19:07:21.699004 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.698979 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-bzksz" event={"ID":"5d21b7cf-8c3d-459b-a502-f049a7353d9f","Type":"ContainerStarted","Data":"32fedfadf6a5107e9ff89e04813dd384af907cef9c715f861039fefc8eacdeab"}
Apr 24 19:07:21.700387 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.700364 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" event={"ID":"3a18605e-85a6-4562-acf8-4bef99990528","Type":"ContainerStarted","Data":"b9e2e9c87b758ba054515e3aa5fcef49d128a97b6de9eaff9bc566a7bf7a70c4"}
Apr 24 19:07:21.701760 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.701736 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-smbpl" event={"ID":"10fdb32e-ef53-491d-922c-4d9e4f2531f0","Type":"ContainerStarted","Data":"84b6bd5cde2046755fc17781309eeeb0f4c8a38286d82dad1f7bde696f5a01b0"}
Apr 24 19:07:21.707608 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.707506 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-nlzd4" podStartSLOduration=35.713158991 podStartE2EDuration="44.707491436s" podCreationTimestamp="2026-04-24 19:06:37 +0000 UTC" firstStartedPulling="2026-04-24 19:07:11.742597326 +0000 UTC m=+34.854433654" lastFinishedPulling="2026-04-24 19:07:20.73692976 +0000 UTC m=+43.848766099" observedRunningTime="2026-04-24 19:07:21.707064905 +0000 UTC m=+44.818901252" watchObservedRunningTime="2026-04-24 19:07:21.707491436 +0000 UTC m=+44.819327783"
Apr 24 19:07:21.724459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.723522 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gmp9" podStartSLOduration=13.710107825 podStartE2EDuration="22.723503112s" podCreationTimestamp="2026-04-24 19:06:59 +0000 UTC" firstStartedPulling="2026-04-24 19:07:11.712380384 +0000 UTC m=+34.824216721" lastFinishedPulling="2026-04-24 19:07:20.725775678 +0000 UTC m=+43.837612008" observedRunningTime="2026-04-24 19:07:21.722404477 +0000 UTC m=+44.834240827" watchObservedRunningTime="2026-04-24 19:07:21.723503112 +0000 UTC m=+44.835339460"
Apr 24 19:07:21.780726 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.775170 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gbmct" podStartSLOduration=13.215660661 podStartE2EDuration="44.775144495s" podCreationTimestamp="2026-04-24 19:06:37 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.154391712 +0000 UTC m=+3.266228052" lastFinishedPulling="2026-04-24 19:07:11.713875563 +0000 UTC m=+34.825711886" observedRunningTime="2026-04-24 19:07:21.773916584 +0000 UTC m=+44.885752931" watchObservedRunningTime="2026-04-24 19:07:21.775144495 +0000 UTC m=+44.886980845"
Apr 24 19:07:21.780726 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.775823 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" podStartSLOduration=18.776936116 podStartE2EDuration="27.77581266s" podCreationTimestamp="2026-04-24 19:06:54 +0000 UTC" firstStartedPulling="2026-04-24 19:07:11.72648847 +0000 UTC m=+34.838324794" lastFinishedPulling="2026-04-24 19:07:20.725365012 +0000 UTC m=+43.837201338" observedRunningTime="2026-04-24 19:07:21.745815373 +0000 UTC m=+44.857651720" watchObservedRunningTime="2026-04-24 19:07:21.77581266 +0000 UTC m=+44.887649007"
Apr 24 19:07:21.791264 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.791215 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-smbpl" podStartSLOduration=28.835905633 podStartE2EDuration="37.791197375s" podCreationTimestamp="2026-04-24 19:06:44 +0000 UTC" firstStartedPulling="2026-04-24 19:07:11.726390011 +0000 UTC m=+34.838226336" lastFinishedPulling="2026-04-24 19:07:20.681681754 +0000 UTC m=+43.793518078" observedRunningTime="2026-04-24 19:07:21.790953449 +0000 UTC m=+44.902789791" watchObservedRunningTime="2026-04-24 19:07:21.791197375 +0000 UTC m=+44.903033721"
Apr 24 19:07:21.828779 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.828721 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" podStartSLOduration=19.849232787 podStartE2EDuration="28.828703137s" podCreationTimestamp="2026-04-24 19:06:53 +0000 UTC" firstStartedPulling="2026-04-24 19:07:11.747597951 +0000 UTC m=+34.859434284" lastFinishedPulling="2026-04-24 19:07:20.727068306 +0000 UTC m=+43.838904634" observedRunningTime="2026-04-24 19:07:21.828242973 +0000 UTC m=+44.940079324" watchObservedRunningTime="2026-04-24 19:07:21.828703137 +0000 UTC m=+44.940539483"
Apr 24 19:07:21.874042 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.873981 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dns5q" podStartSLOduration=34.676045211 podStartE2EDuration="43.873958548s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="2026-04-24 19:07:11.752117962 +0000 UTC m=+34.863954286" lastFinishedPulling="2026-04-24 19:07:20.950031287 +0000 UTC m=+44.061867623" observedRunningTime="2026-04-24 19:07:21.854230897 +0000 UTC m=+44.966067244" watchObservedRunningTime="2026-04-24 19:07:21.873958548 +0000 UTC m=+44.985794896"
Apr 24 19:07:21.874651 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:21.874615 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-bzksz" podStartSLOduration=29.893812049 podStartE2EDuration="38.874603728s" podCreationTimestamp="2026-04-24 19:06:43 +0000 UTC" firstStartedPulling="2026-04-24 19:07:11.746640188 +0000 UTC m=+34.858476512" lastFinishedPulling="2026-04-24 19:07:20.727431849 +0000 UTC m=+43.839268191" observedRunningTime="2026-04-24 19:07:21.872549846 +0000 UTC m=+44.984386193" watchObservedRunningTime="2026-04-24 19:07:21.874603728 +0000 UTC m=+44.986440075"
Apr 24 19:07:22.705849 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:22.705816 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/1.log"
Apr 24 19:07:22.706313 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:22.706273 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/0.log"
Apr 24 19:07:22.706379 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:22.706314 2568 generic.go:358] "Generic (PLEG): container finished" podID="f3d37867-8a80-4198-9320-281682c54121" containerID="4f6a7eb0b8e15ae155728c1c41438be75ae29bb97fdc3b2abce4e381f2b6e1b4" exitCode=255
Apr 24 19:07:22.706492 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:22.706464 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" event={"ID":"f3d37867-8a80-4198-9320-281682c54121","Type":"ContainerDied","Data":"4f6a7eb0b8e15ae155728c1c41438be75ae29bb97fdc3b2abce4e381f2b6e1b4"}
Apr 24 19:07:22.706604 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:22.706589 2568 scope.go:117] "RemoveContainer" containerID="c7c2bfcf0be78b9665a2e5b1c7fd5127ce9141c248864f1139acc1068ba0c7d6"
Apr 24 19:07:22.706753 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:22.706729 2568 scope.go:117] "RemoveContainer" containerID="4f6a7eb0b8e15ae155728c1c41438be75ae29bb97fdc3b2abce4e381f2b6e1b4"
Apr 24 19:07:22.707075 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:22.707055 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-zsl4c_openshift-console-operator(f3d37867-8a80-4198-9320-281682c54121)\"" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" podUID="f3d37867-8a80-4198-9320-281682c54121"
Apr 24 19:07:23.710026 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:23.709995 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/1.log"
Apr 24 19:07:23.710435 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:23.710358 2568 scope.go:117] "RemoveContainer" containerID="4f6a7eb0b8e15ae155728c1c41438be75ae29bb97fdc3b2abce4e381f2b6e1b4"
Apr 24 19:07:23.710531 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:23.710515 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-zsl4c_openshift-console-operator(f3d37867-8a80-4198-9320-281682c54121)\"" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" podUID="f3d37867-8a80-4198-9320-281682c54121"
Apr 24 19:07:25.156417 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:25.156391 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r676m_c4b8f1da-e016-4496-a12c-19572f7ba9ad/dns-node-resolver/0.log"
Apr 24 19:07:25.559422 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:25.559346 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-29rc7_254ea4ca-f9d7-452a-9868-bdf3ef96512c/node-ca/0.log"
Apr 24 19:07:26.153543 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:26.153497 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:07:26.153747 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:26.153592 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p"
Apr 24 19:07:26.153747 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.153640 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 19:07:26.153747 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.153662 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5db89954c9-qmt95: secret "image-registry-tls" not found
Apr 24 19:07:26.153747 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.153719 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls podName:cb5283ff-9431-48cb-8ecc-ff6cc3c65c54 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:42.153703407 +0000 UTC m=+65.265539732 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls") pod "image-registry-5db89954c9-qmt95" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54") : secret "image-registry-tls" not found
Apr 24 19:07:26.153747 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.153727 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 19:07:26.153942 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.153788 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls podName:7503386e-4776-4896-bd44-4ef455ac6b98 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:42.153774971 +0000 UTC m=+65.265611298 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9lv7p" (UID: "7503386e-4776-4896-bd44-4ef455ac6b98") : secret "cluster-monitoring-operator-tls" not found
Apr 24 19:07:26.254181 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:26.254141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2rfkw\" (UID: \"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw"
Apr 24 19:07:26.254636 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:26.254189 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq"
Apr 24 19:07:26.254636 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.254281 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 19:07:26.254636 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.254304 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 19:07:26.254636 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:26.254320 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert\") pod \"ingress-canary-sfgb9\" (UID: \"94eee2e4-7d5d-49be-ab39-13cd92cf877f\") " pod="openshift-ingress-canary/ingress-canary-sfgb9"
Apr 24 19:07:26.254636 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.254337 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs podName:b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:42.25432249 +0000 UTC m=+65.366158813 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs") pod "router-default-5dd48cf8b4-xldzq" (UID: "b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e") : secret "router-metrics-certs-default" not found
Apr 24 19:07:26.254636 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.254383 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls podName:8e6a9b64-e53c-4f21-b0b9-61491d1bef6b nodeName:}" failed. No retries permitted until 2026-04-24 19:07:42.254365432 +0000 UTC m=+65.366201759 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2rfkw" (UID: "8e6a9b64-e53c-4f21-b0b9-61491d1bef6b") : secret "samples-operator-tls" not found
Apr 24 19:07:26.254636 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.254393 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:26.254636 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.254431 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert podName:94eee2e4-7d5d-49be-ab39-13cd92cf877f nodeName:}" failed. No retries permitted until 2026-04-24 19:07:42.254419566 +0000 UTC m=+65.366255898 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert") pod "ingress-canary-sfgb9" (UID: "94eee2e4-7d5d-49be-ab39-13cd92cf877f") : secret "canary-serving-cert" not found
Apr 24 19:07:26.254636 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:26.254485 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq"
Apr 24 19:07:26.254636 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.254611 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle podName:b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:42.254599949 +0000 UTC m=+65.366436273 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle") pod "router-default-5dd48cf8b4-xldzq" (UID: "b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e") : configmap references non-existent config key: service-ca.crt
Apr 24 19:07:26.355373 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:26.355322 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm"
Apr 24 19:07:26.355544 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.355461 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:26.355544 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:26.355524 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls podName:85baa476-8a9a-44b6-83c0-0050c6c28921 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:42.355508539 +0000 UTC m=+65.467344862 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls") pod "dns-default-qxfsm" (UID: "85baa476-8a9a-44b6-83c0-0050c6c28921") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:30.657880 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:30.657844 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c"
Apr 24 19:07:30.658305 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:30.657991 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c"
Apr 24 19:07:30.658305 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:30.658285 2568 scope.go:117] "RemoveContainer" containerID="4f6a7eb0b8e15ae155728c1c41438be75ae29bb97fdc3b2abce4e381f2b6e1b4"
Apr 24 19:07:30.658494 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:30.658476 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-zsl4c_openshift-console-operator(f3d37867-8a80-4198-9320-281682c54121)\"" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" podUID="f3d37867-8a80-4198-9320-281682c54121"
Apr 24 19:07:30.729839 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:30.729813 2568 scope.go:117] "RemoveContainer" containerID="4f6a7eb0b8e15ae155728c1c41438be75ae29bb97fdc3b2abce4e381f2b6e1b4"
Apr 24 19:07:30.730006 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:30.729988 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-zsl4c_openshift-console-operator(f3d37867-8a80-4198-9320-281682c54121)\""
pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" podUID="f3d37867-8a80-4198-9320-281682c54121" Apr 24 19:07:42.190613 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.190569 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:42.191068 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.190641 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" Apr 24 19:07:42.194271 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.194240 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls\") pod \"image-registry-5db89954c9-qmt95\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") " pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:42.194410 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.194386 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7503386e-4776-4896-bd44-4ef455ac6b98-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9lv7p\" (UID: \"7503386e-4776-4896-bd44-4ef455ac6b98\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" Apr 24 19:07:42.291677 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.291637 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2rfkw\" (UID: \"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" Apr 24 19:07:42.291677 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.291682 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:42.291936 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.291721 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert\") pod \"ingress-canary-sfgb9\" (UID: \"94eee2e4-7d5d-49be-ab39-13cd92cf877f\") " pod="openshift-ingress-canary/ingress-canary-sfgb9" Apr 24 19:07:42.291936 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.291782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:42.292424 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.292399 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-service-ca-bundle\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " 
pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:42.294088 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.294069 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e-metrics-certs\") pod \"router-default-5dd48cf8b4-xldzq\" (UID: \"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e\") " pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:42.294154 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.294087 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e6a9b64-e53c-4f21-b0b9-61491d1bef6b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2rfkw\" (UID: \"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" Apr 24 19:07:42.309512 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.309485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94eee2e4-7d5d-49be-ab39-13cd92cf877f-cert\") pod \"ingress-canary-sfgb9\" (UID: \"94eee2e4-7d5d-49be-ab39-13cd92cf877f\") " pod="openshift-ingress-canary/ingress-canary-sfgb9" Apr 24 19:07:42.392925 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.392876 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls\") pod \"dns-default-qxfsm\" (UID: \"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:42.395177 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.395156 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85baa476-8a9a-44b6-83c0-0050c6c28921-metrics-tls\") pod \"dns-default-qxfsm\" (UID: 
\"85baa476-8a9a-44b6-83c0-0050c6c28921\") " pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:42.411980 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.411956 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-czn8s\"" Apr 24 19:07:42.419933 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.419907 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:42.437605 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.437582 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nnzs7\"" Apr 24 19:07:42.444541 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.444473 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-5cfhz\"" Apr 24 19:07:42.445496 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.445475 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" Apr 24 19:07:42.453271 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.453245 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:42.470849 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.470704 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-czt27\"" Apr 24 19:07:42.477495 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.477436 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" Apr 24 19:07:42.497818 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.497788 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l8pmm\"" Apr 24 19:07:42.504023 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.503876 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sfgb9" Apr 24 19:07:42.529776 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.529747 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rh56f\"" Apr 24 19:07:42.539963 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.538399 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:42.587706 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.587673 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5db89954c9-qmt95"] Apr 24 19:07:42.629710 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.629677 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p"] Apr 24 19:07:42.641692 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:42.641603 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7503386e_4776_4896_bd44_4ef455ac6b98.slice/crio-f78c8b4b296cfda6de5cf8a9655145db43f514899798d212467ae9b82c84539c WatchSource:0}: Error finding container f78c8b4b296cfda6de5cf8a9655145db43f514899798d212467ae9b82c84539c: Status 404 returned error can't find the container with id f78c8b4b296cfda6de5cf8a9655145db43f514899798d212467ae9b82c84539c Apr 24 19:07:42.653929 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.651709 2568 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5dd48cf8b4-xldzq"] Apr 24 19:07:42.657248 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:42.657213 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7f009e5_c09c_49ca_a4b9_f6dc4bf1ac3e.slice/crio-1e33e6bd7a0f013cbaa59e6033b63c271dfe3ab1bcf5f2500ba571ec7c8c9690 WatchSource:0}: Error finding container 1e33e6bd7a0f013cbaa59e6033b63c271dfe3ab1bcf5f2500ba571ec7c8c9690: Status 404 returned error can't find the container with id 1e33e6bd7a0f013cbaa59e6033b63c271dfe3ab1bcf5f2500ba571ec7c8c9690 Apr 24 19:07:42.673571 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.673352 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw"] Apr 24 19:07:42.695388 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.695180 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sfgb9"] Apr 24 19:07:42.701194 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:42.701162 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94eee2e4_7d5d_49be_ab39_13cd92cf877f.slice/crio-fa4516cc22ba536777e9221e68158e5bcad77a909c2fcba030db59b5b66d29d2 WatchSource:0}: Error finding container fa4516cc22ba536777e9221e68158e5bcad77a909c2fcba030db59b5b66d29d2: Status 404 returned error can't find the container with id fa4516cc22ba536777e9221e68158e5bcad77a909c2fcba030db59b5b66d29d2 Apr 24 19:07:42.726718 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.726690 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qxfsm"] Apr 24 19:07:42.738051 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:42.738018 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85baa476_8a9a_44b6_83c0_0050c6c28921.slice/crio-252bc3f49e226f85be169e00427544549c61e0fa55d72d5d8732505cfe8c6e05 WatchSource:0}: Error finding container 252bc3f49e226f85be169e00427544549c61e0fa55d72d5d8732505cfe8c6e05: Status 404 returned error can't find the container with id 252bc3f49e226f85be169e00427544549c61e0fa55d72d5d8732505cfe8c6e05 Apr 24 19:07:42.762138 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.762081 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" event={"ID":"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e","Type":"ContainerStarted","Data":"b0493be009e5c26916be9addd772ccce99261a6aa7507b5dc4f9360884be8013"} Apr 24 19:07:42.762356 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.762146 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" event={"ID":"b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e","Type":"ContainerStarted","Data":"1e33e6bd7a0f013cbaa59e6033b63c271dfe3ab1bcf5f2500ba571ec7c8c9690"} Apr 24 19:07:42.763685 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.763653 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5db89954c9-qmt95" event={"ID":"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54","Type":"ContainerStarted","Data":"951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1"} Apr 24 19:07:42.763830 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.763692 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5db89954c9-qmt95" event={"ID":"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54","Type":"ContainerStarted","Data":"77524cc3565181fa93f00c2d0a7fe48ee639ea4d51805085e8a0fab224ba89f4"} Apr 24 19:07:42.763830 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.763744 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:07:42.764737 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.764716 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sfgb9" event={"ID":"94eee2e4-7d5d-49be-ab39-13cd92cf877f","Type":"ContainerStarted","Data":"fa4516cc22ba536777e9221e68158e5bcad77a909c2fcba030db59b5b66d29d2"} Apr 24 19:07:42.765810 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.765784 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" event={"ID":"7503386e-4776-4896-bd44-4ef455ac6b98","Type":"ContainerStarted","Data":"f78c8b4b296cfda6de5cf8a9655145db43f514899798d212467ae9b82c84539c"} Apr 24 19:07:42.766867 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.766836 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qxfsm" event={"ID":"85baa476-8a9a-44b6-83c0-0050c6c28921","Type":"ContainerStarted","Data":"252bc3f49e226f85be169e00427544549c61e0fa55d72d5d8732505cfe8c6e05"} Apr 24 19:07:42.767958 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.767936 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" event={"ID":"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b","Type":"ContainerStarted","Data":"b88c91600bbcfde39cee195308a2e6d02bd0395094f43728e2be22feb6454b38"} Apr 24 19:07:42.784255 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:42.784208 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5db89954c9-qmt95" podStartSLOduration=64.784194943 podStartE2EDuration="1m4.784194943s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:07:42.784064868 +0000 UTC m=+65.895901220" 
watchObservedRunningTime="2026-04-24 19:07:42.784194943 +0000 UTC m=+65.896031290" Apr 24 19:07:43.202267 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:43.202230 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:07:43.205194 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:43.205159 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76172245-47dc-4f2f-90c9-d345a816e233-metrics-certs\") pod \"network-metrics-daemon-nghhh\" (UID: \"76172245-47dc-4f2f-90c9-d345a816e233\") " pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:07:43.477595 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:43.477259 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tjm4v\"" Apr 24 19:07:43.484900 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:43.484854 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nghhh" Apr 24 19:07:43.674016 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:43.673960 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nghhh"] Apr 24 19:07:43.775177 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:43.774982 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nghhh" event={"ID":"76172245-47dc-4f2f-90c9-d345a816e233","Type":"ContainerStarted","Data":"1a5e746d5c2a116927dd3fd21f9c2a8a45236763d8a709fd8028471b9cd51244"} Apr 24 19:07:43.797083 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:43.796310 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" podStartSLOduration=59.796289588 podStartE2EDuration="59.796289588s" podCreationTimestamp="2026-04-24 19:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:07:43.795127015 +0000 UTC m=+66.906963360" watchObservedRunningTime="2026-04-24 19:07:43.796289588 +0000 UTC m=+66.908125935" Apr 24 19:07:44.453652 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:44.453603 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:44.455268 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:44.455245 2568 scope.go:117] "RemoveContainer" containerID="4f6a7eb0b8e15ae155728c1c41438be75ae29bb97fdc3b2abce4e381f2b6e1b4" Apr 24 19:07:44.456547 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:44.456524 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:44.778547 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:44.778456 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:44.780117 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:44.780070 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5dd48cf8b4-xldzq" Apr 24 19:07:46.785507 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.785379 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/1.log" Apr 24 19:07:46.785507 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.785473 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" event={"ID":"f3d37867-8a80-4198-9320-281682c54121","Type":"ContainerStarted","Data":"53df8db65a4c0ac45982bc18c745d71f48d285e9be90f468c7bfc8142f866f16"} Apr 24 19:07:46.786048 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.785874 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:46.787538 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.787509 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" event={"ID":"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b","Type":"ContainerStarted","Data":"7924d6a41d3d931710fd553fa3edf0fe1a136ddc80b6c1499644bd41d5df4606"} Apr 24 19:07:46.787668 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.787542 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" event={"ID":"8e6a9b64-e53c-4f21-b0b9-61491d1bef6b","Type":"ContainerStarted","Data":"c08b7128c33bd4f20d85c0903e21515a4a987583c458ee57a8639242ad0c1734"} Apr 24 19:07:46.788974 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.788944 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-sfgb9" event={"ID":"94eee2e4-7d5d-49be-ab39-13cd92cf877f","Type":"ContainerStarted","Data":"211ee2170ad184bdf9d6168d76bb355411f193a81602a49ba4c8ee727264ada9"} Apr 24 19:07:46.790595 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.790566 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" event={"ID":"7503386e-4776-4896-bd44-4ef455ac6b98","Type":"ContainerStarted","Data":"0556680c19e82637be8f40413baffc009ae752ccb8d1641ec88245a56c88cc88"} Apr 24 19:07:46.791026 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.791003 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" Apr 24 19:07:46.792263 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.792239 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qxfsm" event={"ID":"85baa476-8a9a-44b6-83c0-0050c6c28921","Type":"ContainerStarted","Data":"2ff1de3faa1b77d9bf4db5f735871236528e1de58e34c59e87196b3f9dc50af5"} Apr 24 19:07:46.792263 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.792270 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qxfsm" event={"ID":"85baa476-8a9a-44b6-83c0-0050c6c28921","Type":"ContainerStarted","Data":"0c749ca8492d42b05a2e4afdc0a033351911d678cdfb7b8301b6709b3a4a347b"} Apr 24 19:07:46.792434 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.792358 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:46.794023 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.793995 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nghhh" event={"ID":"76172245-47dc-4f2f-90c9-d345a816e233","Type":"ContainerStarted","Data":"f47b3538fef3934bae6a27aff64c6b8720081428fd0fa4422e09f1c658ba22b8"} Apr 24 19:07:46.794148 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.794026 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nghhh" event={"ID":"76172245-47dc-4f2f-90c9-d345a816e233","Type":"ContainerStarted","Data":"ef82f17139bb51a3694027f05d7198faaf531eb898b687df1108ee193371cbeb"} Apr 24 19:07:46.828627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.828561 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-zsl4c" podStartSLOduration=43.848476287 podStartE2EDuration="52.828541245s" podCreationTimestamp="2026-04-24 19:06:54 +0000 UTC" firstStartedPulling="2026-04-24 19:07:11.746629145 +0000 UTC m=+34.858465488" lastFinishedPulling="2026-04-24 19:07:20.726694108 +0000 UTC m=+43.838530446" observedRunningTime="2026-04-24 19:07:46.826830336 +0000 UTC m=+69.938666686" watchObservedRunningTime="2026-04-24 19:07:46.828541245 +0000 UTC m=+69.940377591" Apr 24 19:07:46.880575 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.880477 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nghhh" podStartSLOduration=67.333659175 podStartE2EDuration="1m9.880459762s" podCreationTimestamp="2026-04-24 19:06:37 +0000 UTC" firstStartedPulling="2026-04-24 19:07:43.682211344 +0000 UTC m=+66.794047685" lastFinishedPulling="2026-04-24 19:07:46.229011948 +0000 UTC m=+69.340848272" observedRunningTime="2026-04-24 19:07:46.879688746 +0000 UTC m=+69.991525096" watchObservedRunningTime="2026-04-24 19:07:46.880459762 +0000 UTC m=+69.992296110" Apr 24 19:07:46.922270 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.922193 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9lv7p" podStartSLOduration=59.393952462 podStartE2EDuration="1m2.922172209s" podCreationTimestamp="2026-04-24 19:06:44 +0000 UTC" 
firstStartedPulling="2026-04-24 19:07:42.644733438 +0000 UTC m=+65.756569764" lastFinishedPulling="2026-04-24 19:07:46.172953169 +0000 UTC m=+69.284789511" observedRunningTime="2026-04-24 19:07:46.902957229 +0000 UTC m=+70.014793587" watchObservedRunningTime="2026-04-24 19:07:46.922172209 +0000 UTC m=+70.034008553" Apr 24 19:07:46.922515 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.922472 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2rfkw" podStartSLOduration=49.492462752 podStartE2EDuration="52.922461254s" podCreationTimestamp="2026-04-24 19:06:54 +0000 UTC" firstStartedPulling="2026-04-24 19:07:42.746807587 +0000 UTC m=+65.858643924" lastFinishedPulling="2026-04-24 19:07:46.176806086 +0000 UTC m=+69.288642426" observedRunningTime="2026-04-24 19:07:46.921453262 +0000 UTC m=+70.033289609" watchObservedRunningTime="2026-04-24 19:07:46.922461254 +0000 UTC m=+70.034297625" Apr 24 19:07:46.971170 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.971115 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sfgb9" podStartSLOduration=33.498457238 podStartE2EDuration="36.971082584s" podCreationTimestamp="2026-04-24 19:07:10 +0000 UTC" firstStartedPulling="2026-04-24 19:07:42.704145865 +0000 UTC m=+65.815982195" lastFinishedPulling="2026-04-24 19:07:46.176771206 +0000 UTC m=+69.288607541" observedRunningTime="2026-04-24 19:07:46.969280965 +0000 UTC m=+70.081117339" watchObservedRunningTime="2026-04-24 19:07:46.971082584 +0000 UTC m=+70.082918929" Apr 24 19:07:46.993732 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:46.993682 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qxfsm" podStartSLOduration=33.55737973 podStartE2EDuration="36.993665956s" podCreationTimestamp="2026-04-24 19:07:10 +0000 UTC" firstStartedPulling="2026-04-24 
19:07:42.739935437 +0000 UTC m=+65.851771764" lastFinishedPulling="2026-04-24 19:07:46.17622166 +0000 UTC m=+69.288057990" observedRunningTime="2026-04-24 19:07:46.992231587 +0000 UTC m=+70.104067932" watchObservedRunningTime="2026-04-24 19:07:46.993665956 +0000 UTC m=+70.105502302" Apr 24 19:07:47.410220 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.410179 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt"] Apr 24 19:07:47.413369 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.413340 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt" Apr 24 19:07:47.416886 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.416486 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 19:07:47.416886 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.416762 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 19:07:47.417944 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.417924 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-wglcw\"" Apr 24 19:07:47.427785 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.427760 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt"] Apr 24 19:07:47.537781 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.537750 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579"] Apr 24 19:07:47.540978 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.540944 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-lp9kx"] 
Apr 24 19:07:47.541220 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.541123 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579" Apr 24 19:07:47.544450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.544426 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-f7f9r\"" Apr 24 19:07:47.544450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.544437 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lp9kx" Apr 24 19:07:47.544670 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.544594 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 19:07:47.546498 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.546471 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ef2156a7-e5a4-42ca-8af9-87d90a778914-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qxjnt\" (UID: \"ef2156a7-e5a4-42ca-8af9-87d90a778914\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt" Apr 24 19:07:47.546616 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.546534 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ef2156a7-e5a4-42ca-8af9-87d90a778914-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qxjnt\" (UID: \"ef2156a7-e5a4-42ca-8af9-87d90a778914\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt" Apr 24 19:07:47.547892 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.547870 2568 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 19:07:47.548188 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.548167 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-l2l9h\"" Apr 24 19:07:47.548776 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.548754 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 19:07:47.554891 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.553747 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tt4dv"] Apr 24 19:07:47.557421 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.557398 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579"] Apr 24 19:07:47.557554 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.557538 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.560423 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.560402 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bjv5f\"" Apr 24 19:07:47.560591 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.560482 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 19:07:47.561869 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.561851 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 19:07:47.573986 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.573959 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tt4dv"] Apr 24 19:07:47.575162 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.575144 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lp9kx"] Apr 24 19:07:47.647529 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.647479 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srkhp\" (UniqueName: \"kubernetes.io/projected/762ba37c-b963-4e59-873c-3dbffed98ff1-kube-api-access-srkhp\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.647948 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.647540 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ef2156a7-e5a4-42ca-8af9-87d90a778914-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qxjnt\" (UID: \"ef2156a7-e5a4-42ca-8af9-87d90a778914\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt" Apr 24 19:07:47.647948 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.647616 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/762ba37c-b963-4e59-873c-3dbffed98ff1-crio-socket\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.647948 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.647692 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ef2156a7-e5a4-42ca-8af9-87d90a778914-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qxjnt\" (UID: \"ef2156a7-e5a4-42ca-8af9-87d90a778914\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt" Apr 24 19:07:47.647948 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.647732 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrgpp\" (UniqueName: \"kubernetes.io/projected/a2da8ca4-ca58-4a9f-b9fd-8ff5e9093163-kube-api-access-jrgpp\") pod \"downloads-6bcc868b7-lp9kx\" (UID: \"a2da8ca4-ca58-4a9f-b9fd-8ff5e9093163\") " pod="openshift-console/downloads-6bcc868b7-lp9kx" Apr 24 19:07:47.647948 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.647771 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/762ba37c-b963-4e59-873c-3dbffed98ff1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.647948 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.647797 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7b502bbd-c353-478c-a6e6-215d2dd5c38b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-g6579\" (UID: \"7b502bbd-c353-478c-a6e6-215d2dd5c38b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579" Apr 24 19:07:47.647948 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.647816 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/762ba37c-b963-4e59-873c-3dbffed98ff1-data-volume\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.647948 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.647911 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/762ba37c-b963-4e59-873c-3dbffed98ff1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.648342 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.648204 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ef2156a7-e5a4-42ca-8af9-87d90a778914-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qxjnt\" (UID: \"ef2156a7-e5a4-42ca-8af9-87d90a778914\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt" Apr 24 19:07:47.650165 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.650142 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/ef2156a7-e5a4-42ca-8af9-87d90a778914-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qxjnt\" (UID: \"ef2156a7-e5a4-42ca-8af9-87d90a778914\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt" Apr 24 19:07:47.730086 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.729999 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt" Apr 24 19:07:47.748933 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.748902 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/762ba37c-b963-4e59-873c-3dbffed98ff1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.749139 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.748945 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7b502bbd-c353-478c-a6e6-215d2dd5c38b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-g6579\" (UID: \"7b502bbd-c353-478c-a6e6-215d2dd5c38b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579" Apr 24 19:07:47.749139 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.748965 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/762ba37c-b963-4e59-873c-3dbffed98ff1-data-volume\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.749139 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.748995 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/762ba37c-b963-4e59-873c-3dbffed98ff1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.749139 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.749032 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srkhp\" (UniqueName: \"kubernetes.io/projected/762ba37c-b963-4e59-873c-3dbffed98ff1-kube-api-access-srkhp\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.749139 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.749068 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/762ba37c-b963-4e59-873c-3dbffed98ff1-crio-socket\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.749412 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.749146 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrgpp\" (UniqueName: \"kubernetes.io/projected/a2da8ca4-ca58-4a9f-b9fd-8ff5e9093163-kube-api-access-jrgpp\") pod \"downloads-6bcc868b7-lp9kx\" (UID: \"a2da8ca4-ca58-4a9f-b9fd-8ff5e9093163\") " pod="openshift-console/downloads-6bcc868b7-lp9kx" Apr 24 19:07:47.749412 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.749252 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/762ba37c-b963-4e59-873c-3dbffed98ff1-crio-socket\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 
19:07:47.749412 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.749397 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/762ba37c-b963-4e59-873c-3dbffed98ff1-data-volume\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.749723 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.749704 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/762ba37c-b963-4e59-873c-3dbffed98ff1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.751671 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.751636 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7b502bbd-c353-478c-a6e6-215d2dd5c38b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-g6579\" (UID: \"7b502bbd-c353-478c-a6e6-215d2dd5c38b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579" Apr 24 19:07:47.752070 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.752046 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/762ba37c-b963-4e59-873c-3dbffed98ff1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.762890 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.762849 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrgpp\" (UniqueName: 
\"kubernetes.io/projected/a2da8ca4-ca58-4a9f-b9fd-8ff5e9093163-kube-api-access-jrgpp\") pod \"downloads-6bcc868b7-lp9kx\" (UID: \"a2da8ca4-ca58-4a9f-b9fd-8ff5e9093163\") " pod="openshift-console/downloads-6bcc868b7-lp9kx" Apr 24 19:07:47.764333 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.764311 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srkhp\" (UniqueName: \"kubernetes.io/projected/762ba37c-b963-4e59-873c-3dbffed98ff1-kube-api-access-srkhp\") pod \"insights-runtime-extractor-tt4dv\" (UID: \"762ba37c-b963-4e59-873c-3dbffed98ff1\") " pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:47.858462 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.858426 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579" Apr 24 19:07:47.865534 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.865490 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lp9kx" Apr 24 19:07:47.869404 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.869376 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt"] Apr 24 19:07:47.871951 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:47.871921 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef2156a7_e5a4_42ca_8af9_87d90a778914.slice/crio-c2220953a592ab12e7fd4d4d306794a8543e073a2aada72b01724b91477ae90e WatchSource:0}: Error finding container c2220953a592ab12e7fd4d4d306794a8543e073a2aada72b01724b91477ae90e: Status 404 returned error can't find the container with id c2220953a592ab12e7fd4d4d306794a8543e073a2aada72b01724b91477ae90e Apr 24 19:07:47.875152 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:47.875124 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tt4dv" Apr 24 19:07:48.017595 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:48.017531 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579"] Apr 24 19:07:48.020465 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:48.020437 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b502bbd_c353_478c_a6e6_215d2dd5c38b.slice/crio-cfb7010d904624333f0757ebd3969f4a19952e6bd959ba255c8a7448abf518f3 WatchSource:0}: Error finding container cfb7010d904624333f0757ebd3969f4a19952e6bd959ba255c8a7448abf518f3: Status 404 returned error can't find the container with id cfb7010d904624333f0757ebd3969f4a19952e6bd959ba255c8a7448abf518f3 Apr 24 19:07:48.044045 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:48.044016 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lp9kx"] Apr 24 19:07:48.047705 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:48.047675 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2da8ca4_ca58_4a9f_b9fd_8ff5e9093163.slice/crio-1818fd0b38d102b699632edcc08a3827e2576aaa4e7c5d4f11001ced631c56c9 WatchSource:0}: Error finding container 1818fd0b38d102b699632edcc08a3827e2576aaa4e7c5d4f11001ced631c56c9: Status 404 returned error can't find the container with id 1818fd0b38d102b699632edcc08a3827e2576aaa4e7c5d4f11001ced631c56c9 Apr 24 19:07:48.062954 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:48.062923 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tt4dv"] Apr 24 19:07:48.066763 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:48.066738 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod762ba37c_b963_4e59_873c_3dbffed98ff1.slice/crio-b2b98b4dc47cef5f982ff0b601fe7df6dfcbd128ef88b0b895bb14104e23bccd WatchSource:0}: Error finding container b2b98b4dc47cef5f982ff0b601fe7df6dfcbd128ef88b0b895bb14104e23bccd: Status 404 returned error can't find the container with id b2b98b4dc47cef5f982ff0b601fe7df6dfcbd128ef88b0b895bb14104e23bccd Apr 24 19:07:48.803221 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:48.803161 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt" event={"ID":"ef2156a7-e5a4-42ca-8af9-87d90a778914","Type":"ContainerStarted","Data":"c2220953a592ab12e7fd4d4d306794a8543e073a2aada72b01724b91477ae90e"} Apr 24 19:07:48.805290 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:48.805216 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tt4dv" event={"ID":"762ba37c-b963-4e59-873c-3dbffed98ff1","Type":"ContainerStarted","Data":"9a5cfac6a10f7c29c6071612aa144f1343ac3496f08365b136dd7f4dee86645b"} Apr 24 19:07:48.805290 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:48.805255 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tt4dv" event={"ID":"762ba37c-b963-4e59-873c-3dbffed98ff1","Type":"ContainerStarted","Data":"b2b98b4dc47cef5f982ff0b601fe7df6dfcbd128ef88b0b895bb14104e23bccd"} Apr 24 19:07:48.807093 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:48.807061 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lp9kx" event={"ID":"a2da8ca4-ca58-4a9f-b9fd-8ff5e9093163","Type":"ContainerStarted","Data":"1818fd0b38d102b699632edcc08a3827e2576aaa4e7c5d4f11001ced631c56c9"} Apr 24 19:07:48.808965 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:48.808931 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579" event={"ID":"7b502bbd-c353-478c-a6e6-215d2dd5c38b","Type":"ContainerStarted","Data":"cfb7010d904624333f0757ebd3969f4a19952e6bd959ba255c8a7448abf518f3"} Apr 24 19:07:50.817158 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:50.817115 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tt4dv" event={"ID":"762ba37c-b963-4e59-873c-3dbffed98ff1","Type":"ContainerStarted","Data":"a214da7bd1444e7a80576e8f2cbaf4ea7f65dd45638e267ba5497747e148171b"} Apr 24 19:07:50.819188 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:50.818875 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579" event={"ID":"7b502bbd-c353-478c-a6e6-215d2dd5c38b","Type":"ContainerStarted","Data":"e15e9e34d440cf28bddbd1c51be4e8ee1a2dff8f48381015a50421159f90c3a4"} Apr 24 19:07:50.819188 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:50.819016 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579" Apr 24 19:07:50.820619 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:50.820571 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt" event={"ID":"ef2156a7-e5a4-42ca-8af9-87d90a778914","Type":"ContainerStarted","Data":"cbc55cffca6d301721e36da8aec821a07efc2ce73f1817ecc904b11ffe2774d6"} Apr 24 19:07:50.826045 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:50.826023 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579" Apr 24 19:07:50.839351 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:50.839294 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6579" podStartSLOduration=2.054500842 podStartE2EDuration="3.839276379s" podCreationTimestamp="2026-04-24 19:07:47 +0000 UTC" firstStartedPulling="2026-04-24 19:07:48.022523034 +0000 UTC m=+71.134359359" lastFinishedPulling="2026-04-24 19:07:49.807298572 +0000 UTC m=+72.919134896" observedRunningTime="2026-04-24 19:07:50.837466037 +0000 UTC m=+73.949302406" watchObservedRunningTime="2026-04-24 19:07:50.839276379 +0000 UTC m=+73.951112727" Apr 24 19:07:50.861229 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:50.861095 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qxjnt" podStartSLOduration=1.927312479 podStartE2EDuration="3.861074362s" podCreationTimestamp="2026-04-24 19:07:47 +0000 UTC" firstStartedPulling="2026-04-24 19:07:47.87373647 +0000 UTC m=+70.985572794" lastFinishedPulling="2026-04-24 19:07:49.807498343 +0000 UTC m=+72.919334677" observedRunningTime="2026-04-24 19:07:50.858064177 +0000 UTC m=+73.969900523" watchObservedRunningTime="2026-04-24 19:07:50.861074362 +0000 UTC m=+73.972910709" Apr 24 19:07:52.710696 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:52.710664 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-nlzd4" Apr 24 19:07:52.828922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:52.828884 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tt4dv" event={"ID":"762ba37c-b963-4e59-873c-3dbffed98ff1","Type":"ContainerStarted","Data":"ee44500771f8ad3e2b0759a4ceb411e7753388490814f86ccee44136c434d491"} Apr 24 19:07:52.849845 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:52.849797 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tt4dv" podStartSLOduration=1.8233158999999999 
podStartE2EDuration="5.849781033s" podCreationTimestamp="2026-04-24 19:07:47 +0000 UTC" firstStartedPulling="2026-04-24 19:07:48.136095632 +0000 UTC m=+71.247931960" lastFinishedPulling="2026-04-24 19:07:52.16256075 +0000 UTC m=+75.274397093" observedRunningTime="2026-04-24 19:07:52.849376809 +0000 UTC m=+75.961213160" watchObservedRunningTime="2026-04-24 19:07:52.849781033 +0000 UTC m=+75.961617409" Apr 24 19:07:56.801675 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:56.801548 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qxfsm" Apr 24 19:07:57.267681 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.267639 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v"] Apr 24 19:07:57.273627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.273603 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:57.277400 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.277377 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 19:07:57.277593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.277521 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 19:07:57.277717 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.277699 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-2t4dz\"" Apr 24 19:07:57.279254 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.279235 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 19:07:57.288635 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.288608 2568 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v"] Apr 24 19:07:57.334431 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.334395 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-f8rpr"] Apr 24 19:07:57.338287 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.338265 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.341726 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.341704 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 19:07:57.341840 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.341706 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 19:07:57.343858 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.343839 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2nlmz\"" Apr 24 19:07:57.344527 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.344382 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 19:07:57.438429 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438398 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed502cab-a65d-4d95-91d4-aa59376937de-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:57.438627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438445 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d5bc99d3-816b-40e4-958b-a40410565822-root\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.438627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438469 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5bc99d3-816b-40e4-958b-a40410565822-sys\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.438627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438494 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5bc99d3-816b-40e4-958b-a40410565822-metrics-client-ca\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.438627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438524 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdjm8\" (UniqueName: \"kubernetes.io/projected/ed502cab-a65d-4d95-91d4-aa59376937de-kube-api-access-hdjm8\") pod \"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:57.438627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438558 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-wtmp\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " 
pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.438627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438585 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-textfile\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.438627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438624 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-tls\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.438988 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438682 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.438988 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438722 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2wkt\" (UniqueName: \"kubernetes.io/projected/d5bc99d3-816b-40e4-958b-a40410565822-kube-api-access-g2wkt\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.438988 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438751 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed502cab-a65d-4d95-91d4-aa59376937de-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:57.438988 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438798 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed502cab-a65d-4d95-91d4-aa59376937de-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:57.438988 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.438850 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-accelerators-collector-config\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.539794 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.539711 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdjm8\" (UniqueName: \"kubernetes.io/projected/ed502cab-a65d-4d95-91d4-aa59376937de-kube-api-access-hdjm8\") pod \"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:57.539794 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.539754 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-wtmp\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.539794 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.539773 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-textfile\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.539794 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.539797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-tls\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.540145 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.539817 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.540145 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.539834 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2wkt\" (UniqueName: \"kubernetes.io/projected/d5bc99d3-816b-40e4-958b-a40410565822-kube-api-access-g2wkt\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.540145 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.539853 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed502cab-a65d-4d95-91d4-aa59376937de-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:57.540145 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.539874 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed502cab-a65d-4d95-91d4-aa59376937de-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:57.540145 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.539902 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-wtmp\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.540397 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.539910 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-accelerators-collector-config\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.540502 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.540468 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed502cab-a65d-4d95-91d4-aa59376937de-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:57.540636 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.540527 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d5bc99d3-816b-40e4-958b-a40410565822-root\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.540636 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.540539 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-accelerators-collector-config\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.540636 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:57.540558 2568 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 19:07:57.540636 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.540253 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-textfile\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.540636 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.540605 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5bc99d3-816b-40e4-958b-a40410565822-sys\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 
19:07:57.540636 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.540615 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d5bc99d3-816b-40e4-958b-a40410565822-root\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.540636 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.540621 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed502cab-a65d-4d95-91d4-aa59376937de-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:57.540636 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:07:57.540638 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed502cab-a65d-4d95-91d4-aa59376937de-openshift-state-metrics-tls podName:ed502cab-a65d-4d95-91d4-aa59376937de nodeName:}" failed. No retries permitted until 2026-04-24 19:07:58.040617121 +0000 UTC m=+81.152453450 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/ed502cab-a65d-4d95-91d4-aa59376937de-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-kq56v" (UID: "ed502cab-a65d-4d95-91d4-aa59376937de") : secret "openshift-state-metrics-tls" not found Apr 24 19:07:57.541034 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.540560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5bc99d3-816b-40e4-958b-a40410565822-sys\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.541034 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.540677 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5bc99d3-816b-40e4-958b-a40410565822-metrics-client-ca\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.541132 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.541063 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5bc99d3-816b-40e4-958b-a40410565822-metrics-client-ca\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.542436 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.542413 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.542789 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:07:57.542769 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed502cab-a65d-4d95-91d4-aa59376937de-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:57.542860 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.542791 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d5bc99d3-816b-40e4-958b-a40410565822-node-exporter-tls\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.548366 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.548341 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2wkt\" (UniqueName: \"kubernetes.io/projected/d5bc99d3-816b-40e4-958b-a40410565822-kube-api-access-g2wkt\") pod \"node-exporter-f8rpr\" (UID: \"d5bc99d3-816b-40e4-958b-a40410565822\") " pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.548465 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.548341 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdjm8\" (UniqueName: \"kubernetes.io/projected/ed502cab-a65d-4d95-91d4-aa59376937de-kube-api-access-hdjm8\") pod \"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:57.647731 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.647700 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-f8rpr" Apr 24 19:07:57.658947 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:57.658913 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5bc99d3_816b_40e4_958b_a40410565822.slice/crio-fef709cd95e2535be272f3b85619d46d1dd5eea6009ddecb6a2656ba79b9af67 WatchSource:0}: Error finding container fef709cd95e2535be272f3b85619d46d1dd5eea6009ddecb6a2656ba79b9af67: Status 404 returned error can't find the container with id fef709cd95e2535be272f3b85619d46d1dd5eea6009ddecb6a2656ba79b9af67 Apr 24 19:07:57.845465 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:57.845375 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f8rpr" event={"ID":"d5bc99d3-816b-40e4-958b-a40410565822","Type":"ContainerStarted","Data":"fef709cd95e2535be272f3b85619d46d1dd5eea6009ddecb6a2656ba79b9af67"} Apr 24 19:07:58.043574 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.043529 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed502cab-a65d-4d95-91d4-aa59376937de-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:58.046771 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.046741 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed502cab-a65d-4d95-91d4-aa59376937de-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kq56v\" (UID: \"ed502cab-a65d-4d95-91d4-aa59376937de\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:58.185249 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.185214 2568 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" Apr 24 19:07:58.412994 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.412931 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:07:58.417702 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.416814 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.422124 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.421927 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 19:07:58.422124 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.421992 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 19:07:58.422490 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.422318 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 19:07:58.422490 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.422400 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 19:07:58.422490 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.422475 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 19:07:58.422682 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.422609 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 19:07:58.422738 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.422702 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 19:07:58.422738 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.422715 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 19:07:58.422836 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.422799 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-zrwr2\"" Apr 24 19:07:58.422836 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.422825 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 19:07:58.435736 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.435673 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:07:58.531698 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.531667 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v"] Apr 24 19:07:58.535044 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:07:58.535013 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded502cab_a65d_4d95_91d4_aa59376937de.slice/crio-e4d3bcac523418eb1b0168ad2a0ec47d9332d8486b6313fc5046db15985e2b00 WatchSource:0}: Error finding container e4d3bcac523418eb1b0168ad2a0ec47d9332d8486b6313fc5046db15985e2b00: Status 404 returned error can't find the container with id e4d3bcac523418eb1b0168ad2a0ec47d9332d8486b6313fc5046db15985e2b00 Apr 24 19:07:58.547572 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.547539 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.547687 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.547605 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6njh\" (UniqueName: \"kubernetes.io/projected/cc790792-524d-4dc1-b6a5-c6d6a75859e7-kube-api-access-g6njh\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.547687 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.547643 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.547687 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.547670 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-config-volume\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.547864 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.547696 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 
19:07:58.547864 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.547722 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc790792-524d-4dc1-b6a5-c6d6a75859e7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.547864 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.547775 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.547864 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.547856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cc790792-524d-4dc1-b6a5-c6d6a75859e7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.548053 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.547913 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-web-config\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.548053 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.547935 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.548053 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.547960 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cc790792-524d-4dc1-b6a5-c6d6a75859e7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.548053 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.548000 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cc790792-524d-4dc1-b6a5-c6d6a75859e7-config-out\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.548053 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.548050 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc790792-524d-4dc1-b6a5-c6d6a75859e7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.649225 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649191 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-web-config\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.649387 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649236 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.649387 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649265 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cc790792-524d-4dc1-b6a5-c6d6a75859e7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.649387 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649306 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cc790792-524d-4dc1-b6a5-c6d6a75859e7-config-out\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.649387 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649339 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc790792-524d-4dc1-b6a5-c6d6a75859e7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.649387 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649379 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
24 19:07:58.649550 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649437 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6njh\" (UniqueName: \"kubernetes.io/projected/cc790792-524d-4dc1-b6a5-c6d6a75859e7-kube-api-access-g6njh\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.649550 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649472 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.649550 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649498 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-config-volume\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.649550 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649521 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.649550 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649545 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc790792-524d-4dc1-b6a5-c6d6a75859e7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.649691 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649601 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.649691 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.649631 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cc790792-524d-4dc1-b6a5-c6d6a75859e7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.651618 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.651284 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cc790792-524d-4dc1-b6a5-c6d6a75859e7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.651742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.651628 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc790792-524d-4dc1-b6a5-c6d6a75859e7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.652344 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.652310 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cc790792-524d-4dc1-b6a5-c6d6a75859e7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.652683 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.652659 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-config-volume\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.656289 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.655850 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cc790792-524d-4dc1-b6a5-c6d6a75859e7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.656289 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.655865 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.656289 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.656125 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-web-config\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.656289 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.656254 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" 
(UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.656554 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.656471 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.656942 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.656630 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cc790792-524d-4dc1-b6a5-c6d6a75859e7-config-out\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.656942 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.656908 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.657300 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.657279 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.661112 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.661080 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6njh\" (UniqueName: \"kubernetes.io/projected/cc790792-524d-4dc1-b6a5-c6d6a75859e7-kube-api-access-g6njh\") pod \"alertmanager-main-0\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.735069 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.735041 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:07:58.853724 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.853630 2568 generic.go:358] "Generic (PLEG): container finished" podID="d5bc99d3-816b-40e4-958b-a40410565822" containerID="06ad70805c5228559194274db1812e462076a33ded5df253e907aec515486bf3" exitCode=0 Apr 24 19:07:58.853724 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.853695 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f8rpr" event={"ID":"d5bc99d3-816b-40e4-958b-a40410565822","Type":"ContainerDied","Data":"06ad70805c5228559194274db1812e462076a33ded5df253e907aec515486bf3"} Apr 24 19:07:58.856281 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.856239 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" event={"ID":"ed502cab-a65d-4d95-91d4-aa59376937de","Type":"ContainerStarted","Data":"f2d31ddfda92104b9e8012346691f64eb554856d8454e73be35d761c74019149"} Apr 24 19:07:58.856281 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.856279 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" event={"ID":"ed502cab-a65d-4d95-91d4-aa59376937de","Type":"ContainerStarted","Data":"3b7913fbd1f023aaae14d311184afae6c980e84bbe271233454406deceb9f1fb"} Apr 24 19:07:58.856424 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.856294 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" event={"ID":"ed502cab-a65d-4d95-91d4-aa59376937de","Type":"ContainerStarted","Data":"e4d3bcac523418eb1b0168ad2a0ec47d9332d8486b6313fc5046db15985e2b00"} Apr 24 19:07:58.893850 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:07:58.893823 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:08:00.344482 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.344444 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd"] Apr 24 19:08:00.350180 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.350156 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.353932 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.353906 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 19:08:00.354051 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.353935 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 19:08:00.354220 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.354070 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 19:08:00.354220 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.354077 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-fsugj21oiqrn\"" Apr 24 19:08:00.354369 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.354259 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-br984\"" Apr 24 19:08:00.355042 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.355025 2568 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 19:08:00.355141 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.355027 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 19:08:00.360091 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.360018 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd"] Apr 24 19:08:00.468491 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.468455 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxr29\" (UniqueName: \"kubernetes.io/projected/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-kube-api-access-wxr29\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.468675 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.468521 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-grpc-tls\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.468675 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.468599 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.468675 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.468632 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-metrics-client-ca\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.468811 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.468721 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.468811 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.468784 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-tls\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.468902 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.468837 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.468902 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.468870 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.569765 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.569720 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.569930 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.569772 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxr29\" (UniqueName: \"kubernetes.io/projected/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-kube-api-access-wxr29\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.569930 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.569827 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-grpc-tls\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.569930 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.569863 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.569930 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.569894 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-metrics-client-ca\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.570179 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.569940 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.570179 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.569981 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-tls\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.570179 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.570029 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: 
\"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.570769 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.570691 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-metrics-client-ca\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.573039 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.573014 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.573151 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.573014 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.573217 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.573192 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.573636 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:08:00.573612 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.573735 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.573717 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-grpc-tls\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.573788 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.573716 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-secret-thanos-querier-tls\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.585625 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.585599 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxr29\" (UniqueName: \"kubernetes.io/projected/2c03d4b6-bd8b-48dd-8113-7e5008c145c1-kube-api-access-wxr29\") pod \"thanos-querier-67f5cfbc9c-wfwqd\" (UID: \"2c03d4b6-bd8b-48dd-8113-7e5008c145c1\") " pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:00.663833 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:00.663787 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" Apr 24 19:08:01.783905 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.783868 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-58584969c7-cvcpc"] Apr 24 19:08:01.788710 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.788685 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.791571 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.791480 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 19:08:01.791571 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.791519 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 19:08:01.793051 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.793027 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 19:08:01.793197 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.793095 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-nmkb5\"" Apr 24 19:08:01.793197 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.793123 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-eghs8plcqi1qs\"" Apr 24 19:08:01.793323 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.793229 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 19:08:01.795925 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.795904 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-58584969c7-cvcpc"] Apr 24 
19:08:01.884168 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.884132 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ee37efac-17ac-4808-b8db-0df62be52e08-audit-log\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.884375 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.884176 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee37efac-17ac-4808-b8db-0df62be52e08-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.884375 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.884208 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee37efac-17ac-4808-b8db-0df62be52e08-client-ca-bundle\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.884375 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.884271 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ee37efac-17ac-4808-b8db-0df62be52e08-secret-metrics-server-tls\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.884375 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.884328 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ee37efac-17ac-4808-b8db-0df62be52e08-metrics-server-audit-profiles\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.884564 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.884403 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ee37efac-17ac-4808-b8db-0df62be52e08-secret-metrics-server-client-certs\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.884564 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.884449 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh5dk\" (UniqueName: \"kubernetes.io/projected/ee37efac-17ac-4808-b8db-0df62be52e08-kube-api-access-kh5dk\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.985987 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.985937 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ee37efac-17ac-4808-b8db-0df62be52e08-metrics-server-audit-profiles\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.986183 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.986032 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ee37efac-17ac-4808-b8db-0df62be52e08-secret-metrics-server-client-certs\") pod 
\"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.986183 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.986070 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kh5dk\" (UniqueName: \"kubernetes.io/projected/ee37efac-17ac-4808-b8db-0df62be52e08-kube-api-access-kh5dk\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.986183 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.986129 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ee37efac-17ac-4808-b8db-0df62be52e08-audit-log\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.986183 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.986162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee37efac-17ac-4808-b8db-0df62be52e08-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.986423 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.986189 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee37efac-17ac-4808-b8db-0df62be52e08-client-ca-bundle\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.986423 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.986217 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ee37efac-17ac-4808-b8db-0df62be52e08-secret-metrics-server-tls\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.986676 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.986646 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ee37efac-17ac-4808-b8db-0df62be52e08-audit-log\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.987237 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.987188 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ee37efac-17ac-4808-b8db-0df62be52e08-metrics-server-audit-profiles\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.987764 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.987736 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee37efac-17ac-4808-b8db-0df62be52e08-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.989333 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.989310 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ee37efac-17ac-4808-b8db-0df62be52e08-secret-metrics-server-client-certs\") pod \"metrics-server-58584969c7-cvcpc\" (UID: 
\"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.989460 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.989431 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee37efac-17ac-4808-b8db-0df62be52e08-client-ca-bundle\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.989556 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.989439 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ee37efac-17ac-4808-b8db-0df62be52e08-secret-metrics-server-tls\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:01.996088 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:01.996058 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj"] Apr 24 19:08:02.000862 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.000811 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj" Apr 24 19:08:02.002952 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.002926 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh5dk\" (UniqueName: \"kubernetes.io/projected/ee37efac-17ac-4808-b8db-0df62be52e08-kube-api-access-kh5dk\") pod \"metrics-server-58584969c7-cvcpc\" (UID: \"ee37efac-17ac-4808-b8db-0df62be52e08\") " pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:02.003640 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.003618 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 19:08:02.003739 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.003625 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-h4j94\"" Apr 24 19:08:02.006405 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.006385 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj"] Apr 24 19:08:02.102092 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.102060 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:02.187894 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.187850 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6dd835bc-4aa2-4709-9346-da69bea29d70-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4kktj\" (UID: \"6dd835bc-4aa2-4709-9346-da69bea29d70\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj" Apr 24 19:08:02.289495 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.289451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6dd835bc-4aa2-4709-9346-da69bea29d70-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4kktj\" (UID: \"6dd835bc-4aa2-4709-9346-da69bea29d70\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj" Apr 24 19:08:02.289685 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:08:02.289628 2568 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 24 19:08:02.289747 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:08:02.289721 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6dd835bc-4aa2-4709-9346-da69bea29d70-monitoring-plugin-cert podName:6dd835bc-4aa2-4709-9346-da69bea29d70 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:02.789703153 +0000 UTC m=+85.901539477 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/6dd835bc-4aa2-4709-9346-da69bea29d70-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-4kktj" (UID: "6dd835bc-4aa2-4709-9346-da69bea29d70") : secret "monitoring-plugin-cert" not found
Apr 24 19:08:02.425411 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.425302 2568 patch_prober.go:28] interesting pod/image-registry-5db89954c9-qmt95 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 19:08:02.425573 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.425384 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5db89954c9-qmt95" podUID="cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 19:08:02.794235 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.794133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6dd835bc-4aa2-4709-9346-da69bea29d70-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4kktj\" (UID: \"6dd835bc-4aa2-4709-9346-da69bea29d70\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj"
Apr 24 19:08:02.796949 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.796921 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6dd835bc-4aa2-4709-9346-da69bea29d70-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4kktj\" (UID: \"6dd835bc-4aa2-4709-9346-da69bea29d70\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj"
Apr 24 19:08:02.926165 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:02.926095 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj"
Apr 24 19:08:03.780568 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:03.780529 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:08:05.712289 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:08:05.712256 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc790792_524d_4dc1_b6a5_c6d6a75859e7.slice/crio-91994eaae0566a41450f53c4e30087c722f707b1950ffca73eb283f30843f029 WatchSource:0}: Error finding container 91994eaae0566a41450f53c4e30087c722f707b1950ffca73eb283f30843f029: Status 404 returned error can't find the container with id 91994eaae0566a41450f53c4e30087c722f707b1950ffca73eb283f30843f029
Apr 24 19:08:05.885974 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:05.884284 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f8rpr" event={"ID":"d5bc99d3-816b-40e4-958b-a40410565822","Type":"ContainerStarted","Data":"3da94f583629443103628d1dd91dd95df8b6826dfa1642956ce4162f5e0dae3f"}
Apr 24 19:08:05.885974 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:05.885601 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerStarted","Data":"91994eaae0566a41450f53c4e30087c722f707b1950ffca73eb283f30843f029"}
Apr 24 19:08:05.888852 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:05.887140 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-58584969c7-cvcpc"]
Apr 24 19:08:05.909885 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:05.909691 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd"]
Apr 24 19:08:05.933135 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:05.927560 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj"]
Apr 24 19:08:06.896008 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:06.895945 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f8rpr" event={"ID":"d5bc99d3-816b-40e4-958b-a40410565822","Type":"ContainerStarted","Data":"43fd96fd295b8b87b3bc2921b82ab18d65bb439778edcc474258db8ac269c9e9"}
Apr 24 19:08:06.899680 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:06.899488 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj" event={"ID":"6dd835bc-4aa2-4709-9346-da69bea29d70","Type":"ContainerStarted","Data":"605c5d200bb4b63c332954f5f11877121b65f8f4c3021e8a657e89f0a671c508"}
Apr 24 19:08:06.905070 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:06.904918 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lp9kx" event={"ID":"a2da8ca4-ca58-4a9f-b9fd-8ff5e9093163","Type":"ContainerStarted","Data":"24307cf6ab6ce624c372992456f64327eca5d4685ff41a963c2319d17ee56b90"}
Apr 24 19:08:06.906255 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:06.906056 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-lp9kx"
Apr 24 19:08:06.907951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:06.907919 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" event={"ID":"ee37efac-17ac-4808-b8db-0df62be52e08","Type":"ContainerStarted","Data":"8c141287b08d2661352771c1409ddb68f649693f68615847f20edd2db157ccac"}
Apr 24 19:08:06.909737 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:06.909670 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" event={"ID":"2c03d4b6-bd8b-48dd-8113-7e5008c145c1","Type":"ContainerStarted","Data":"c06f50a14124c216ccfb5f6ad176406ea0fc75e5b7cf22ef11a8293bc99a06f3"}
Apr 24 19:08:06.920660 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:06.920389 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-f8rpr" podStartSLOduration=9.175396488 podStartE2EDuration="9.920367961s" podCreationTimestamp="2026-04-24 19:07:57 +0000 UTC" firstStartedPulling="2026-04-24 19:07:57.661209499 +0000 UTC m=+80.773045828" lastFinishedPulling="2026-04-24 19:07:58.406180961 +0000 UTC m=+81.518017301" observedRunningTime="2026-04-24 19:08:06.917878481 +0000 UTC m=+90.029714828" watchObservedRunningTime="2026-04-24 19:08:06.920367961 +0000 UTC m=+90.032204309"
Apr 24 19:08:06.924693 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:06.924426 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-lp9kx"
Apr 24 19:08:06.938810 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:06.938375 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-lp9kx" podStartSLOduration=2.155850767 podStartE2EDuration="19.938354816s" podCreationTimestamp="2026-04-24 19:07:47 +0000 UTC" firstStartedPulling="2026-04-24 19:07:48.049548547 +0000 UTC m=+71.161384870" lastFinishedPulling="2026-04-24 19:08:05.832052591 +0000 UTC m=+88.943888919" observedRunningTime="2026-04-24 19:08:06.936327373 +0000 UTC m=+90.048163719" watchObservedRunningTime="2026-04-24 19:08:06.938354816 +0000 UTC m=+90.050191162"
Apr 24 19:08:07.914467 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:07.914422 2568 generic.go:358] "Generic (PLEG): container finished" podID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerID="7e360d74cb3184fd0098f73ec08040bbbda2b5d01a97ab1a1b69b3938072801d" exitCode=0
Apr 24 19:08:07.914878 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:07.914575 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerDied","Data":"7e360d74cb3184fd0098f73ec08040bbbda2b5d01a97ab1a1b69b3938072801d"}
Apr 24 19:08:07.917351 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:07.917323 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" event={"ID":"ed502cab-a65d-4d95-91d4-aa59376937de","Type":"ContainerStarted","Data":"d3eae39622fad8a8a9a696b14d3680a1fe10017c929f6a6e1ecbd91f9f95020b"}
Apr 24 19:08:07.978084 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:07.977979 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kq56v" podStartSLOduration=2.7610396919999998 podStartE2EDuration="10.97796461s" podCreationTimestamp="2026-04-24 19:07:57 +0000 UTC" firstStartedPulling="2026-04-24 19:07:58.734713858 +0000 UTC m=+81.846550198" lastFinishedPulling="2026-04-24 19:08:06.95163878 +0000 UTC m=+90.063475116" observedRunningTime="2026-04-24 19:08:07.976929024 +0000 UTC m=+91.088765394" watchObservedRunningTime="2026-04-24 19:08:07.97796461 +0000 UTC m=+91.089800956"
Apr 24 19:08:08.128674 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.128633 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d98589767-gkz9h"]
Apr 24 19:08:08.148127 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.148070 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d98589767-gkz9h"]
Apr 24 19:08:08.148324 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.148255 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.151705 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.151641 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 19:08:08.151920 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.151905 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 19:08:08.152172 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.152154 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 19:08:08.152403 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.152366 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 19:08:08.152805 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.152787 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-d7g57\""
Apr 24 19:08:08.153001 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.152984 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 19:08:08.157254 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.157228 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 19:08:08.245582 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.245467 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-service-ca\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.245582 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.245524 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-trusted-ca-bundle\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.245582 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.245553 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkcln\" (UniqueName: \"kubernetes.io/projected/aecc3819-8e2a-431a-a09b-fd1b63a32306-kube-api-access-bkcln\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.245872 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.245656 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-serving-cert\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.245872 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.245726 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-config\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.245872 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.245831 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-oauth-config\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.246028 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.245984 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-oauth-serving-cert\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.346948 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.346901 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-service-ca\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.347198 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.346957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-trusted-ca-bundle\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.347198 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.346983 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkcln\" (UniqueName: \"kubernetes.io/projected/aecc3819-8e2a-431a-a09b-fd1b63a32306-kube-api-access-bkcln\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.347198 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.347096 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-serving-cert\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.347198 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.347141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-config\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.347198 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.347171 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-oauth-config\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.347484 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.347208 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-oauth-serving-cert\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.348166 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.347947 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-service-ca\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.348166 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.347950 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-oauth-serving-cert\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.348166 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.348015 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-trusted-ca-bundle\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.348813 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.348466 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-config\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.350446 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.350379 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-serving-cert\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.350618 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.350590 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-oauth-config\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.366854 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.366818 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkcln\" (UniqueName: \"kubernetes.io/projected/aecc3819-8e2a-431a-a09b-fd1b63a32306-kube-api-access-bkcln\") pod \"console-d98589767-gkz9h\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") " pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:08.464557 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:08.464508 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:09.366652 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:09.366614 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5db89954c9-qmt95"]
Apr 24 19:08:09.834459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:09.834369 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d98589767-gkz9h"]
Apr 24 19:08:09.838430 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:08:09.838325 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaecc3819_8e2a_431a_a09b_fd1b63a32306.slice/crio-0eff9de8a2336152150a048e48fc0e55a8a0b818203ab4aee7d75f6c8d35be5f WatchSource:0}: Error finding container 0eff9de8a2336152150a048e48fc0e55a8a0b818203ab4aee7d75f6c8d35be5f: Status 404 returned error can't find the container with id 0eff9de8a2336152150a048e48fc0e55a8a0b818203ab4aee7d75f6c8d35be5f
Apr 24 19:08:09.929326 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:09.929234 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" event={"ID":"ee37efac-17ac-4808-b8db-0df62be52e08","Type":"ContainerStarted","Data":"b61df884e892bc6118b7007699f4f389ec1192e30d0128362e5c0fbee48fa300"}
Apr 24 19:08:09.931528 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:09.931491 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" event={"ID":"2c03d4b6-bd8b-48dd-8113-7e5008c145c1","Type":"ContainerStarted","Data":"dc3e601f0be74e49394c6cae8fbea337e95e8443d412b1f6a135fce8f8075785"}
Apr 24 19:08:09.932853 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:09.932822 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d98589767-gkz9h" event={"ID":"aecc3819-8e2a-431a-a09b-fd1b63a32306","Type":"ContainerStarted","Data":"0eff9de8a2336152150a048e48fc0e55a8a0b818203ab4aee7d75f6c8d35be5f"}
Apr 24 19:08:09.935249 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:09.935223 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj" event={"ID":"6dd835bc-4aa2-4709-9346-da69bea29d70","Type":"ContainerStarted","Data":"a75624098cf1f3f45e5a2b6c6be55e447e5de563c66a06982985926c97d3b7ea"}
Apr 24 19:08:09.935612 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:09.935569 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj"
Apr 24 19:08:09.942423 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:09.942397 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj"
Apr 24 19:08:09.953346 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:09.953269 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" podStartSLOduration=5.183996348 podStartE2EDuration="8.95325038s" podCreationTimestamp="2026-04-24 19:08:01 +0000 UTC" firstStartedPulling="2026-04-24 19:08:05.897542105 +0000 UTC m=+89.009378448" lastFinishedPulling="2026-04-24 19:08:09.666796149 +0000 UTC m=+92.778632480" observedRunningTime="2026-04-24 19:08:09.951719953 +0000 UTC m=+93.063556318" watchObservedRunningTime="2026-04-24 19:08:09.95325038 +0000 UTC m=+93.065086728"
Apr 24 19:08:09.973236 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:09.973178 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4kktj" podStartSLOduration=5.242725203 podStartE2EDuration="8.973156638s" podCreationTimestamp="2026-04-24 19:08:01 +0000 UTC" firstStartedPulling="2026-04-24 19:08:05.935524527 +0000 UTC m=+89.047360866" lastFinishedPulling="2026-04-24 19:08:09.665955977 +0000 UTC m=+92.777792301" observedRunningTime="2026-04-24 19:08:09.97232108 +0000 UTC m=+93.084157428" watchObservedRunningTime="2026-04-24 19:08:09.973156638 +0000 UTC m=+93.084992985"
Apr 24 19:08:10.947766 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:10.947719 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" event={"ID":"2c03d4b6-bd8b-48dd-8113-7e5008c145c1","Type":"ContainerStarted","Data":"3f2a9c37b10eb168a2f2d9dd3ca3d387919689b81af69879d0dd922f911024d4"}
Apr 24 19:08:10.947766 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:10.947770 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" event={"ID":"2c03d4b6-bd8b-48dd-8113-7e5008c145c1","Type":"ContainerStarted","Data":"f82eee2103ef9e441a56a832a412aa3c29a2e8d628ffdcb70a9c8d2d893ab2bb"}
Apr 24 19:08:11.956980 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:11.956927 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerStarted","Data":"d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2"}
Apr 24 19:08:12.965410 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:12.965364 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerStarted","Data":"64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d"}
Apr 24 19:08:13.971961 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:13.971795 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" event={"ID":"2c03d4b6-bd8b-48dd-8113-7e5008c145c1","Type":"ContainerStarted","Data":"a8c8392be3f517d2643e64110fbcb3592287949b5e9bdb14104eb3a949418b87"}
Apr 24 19:08:13.971961 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:13.971843 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" event={"ID":"2c03d4b6-bd8b-48dd-8113-7e5008c145c1","Type":"ContainerStarted","Data":"bf7f9eea68ae818ef0f019ea3242680a9286f4a9b1835b646675d69f0f349976"}
Apr 24 19:08:13.982964 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:13.982922 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d98589767-gkz9h" event={"ID":"aecc3819-8e2a-431a-a09b-fd1b63a32306","Type":"ContainerStarted","Data":"a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754"}
Apr 24 19:08:13.986417 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:13.986379 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerStarted","Data":"5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4"}
Apr 24 19:08:13.986516 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:13.986428 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerStarted","Data":"33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154"}
Apr 24 19:08:14.009404 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:14.009153 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d98589767-gkz9h" podStartSLOduration=2.152955143 podStartE2EDuration="6.009138546s" podCreationTimestamp="2026-04-24 19:08:08 +0000 UTC" firstStartedPulling="2026-04-24 19:08:09.840754487 +0000 UTC m=+92.952590819" lastFinishedPulling="2026-04-24 19:08:13.696937897 +0000 UTC m=+96.808774222" observedRunningTime="2026-04-24 19:08:14.008374581 +0000 UTC m=+97.120210927" watchObservedRunningTime="2026-04-24 19:08:14.009138546 +0000 UTC m=+97.120974892"
Apr 24 19:08:14.993847 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:14.993803 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" event={"ID":"2c03d4b6-bd8b-48dd-8113-7e5008c145c1","Type":"ContainerStarted","Data":"9aaf371217fe1cc2c9042b9b196640879d96b5a8d3773d7dd24ed9e27a0d60eb"}
Apr 24 19:08:14.994314 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:14.994244 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd"
Apr 24 19:08:14.997610 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:14.997586 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerStarted","Data":"57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf"}
Apr 24 19:08:14.997776 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:14.997615 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerStarted","Data":"a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d"}
Apr 24 19:08:15.002091 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:15.002064 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd"
Apr 24 19:08:15.023721 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:15.023665 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-67f5cfbc9c-wfwqd" podStartSLOduration=7.246227462 podStartE2EDuration="15.023647822s" podCreationTimestamp="2026-04-24 19:08:00 +0000 UTC" firstStartedPulling="2026-04-24 19:08:05.918296072 +0000 UTC m=+89.030132400" lastFinishedPulling="2026-04-24 19:08:13.695716423 +0000 UTC m=+96.807552760" observedRunningTime="2026-04-24 19:08:15.020728793 +0000 UTC m=+98.132565150" watchObservedRunningTime="2026-04-24 19:08:15.023647822 +0000 UTC m=+98.135484171"
Apr 24 19:08:15.050076 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:15.049717 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=11.280040352 podStartE2EDuration="17.049697741s" podCreationTimestamp="2026-04-24 19:07:58 +0000 UTC" firstStartedPulling="2026-04-24 19:08:05.715327231 +0000 UTC m=+88.827163557" lastFinishedPulling="2026-04-24 19:08:11.484984603 +0000 UTC m=+94.596820946" observedRunningTime="2026-04-24 19:08:15.047049962 +0000 UTC m=+98.158886333" watchObservedRunningTime="2026-04-24 19:08:15.049697741 +0000 UTC m=+98.161534087"
Apr 24 19:08:18.465499 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:18.465450 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:18.465499 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:18.465508 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:18.470733 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:18.470705 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:19.015430 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:19.015384 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:08:22.102852 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:22.102811 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-58584969c7-cvcpc"
Apr 24 19:08:22.102852 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:22.102853 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-58584969c7-cvcpc"
Apr 24 19:08:34.393481 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.393411 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5db89954c9-qmt95" podUID="cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" containerName="registry" containerID="cri-o://951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1" gracePeriod=30
Apr 24 19:08:34.633399 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.633370 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5db89954c9-qmt95"
Apr 24 19:08:34.693497 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.693416 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-certificates\") pod \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") "
Apr 24 19:08:34.693497 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.693457 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-ca-trust-extracted\") pod \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") "
Apr 24 19:08:34.693497 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.693477 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-bound-sa-token\") pod \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") "
Apr 24 19:08:34.693497 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.693496 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-image-registry-private-configuration\") pod \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") "
Apr 24 19:08:34.693829 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.693526 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-trusted-ca\") pod \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") "
Apr 24 19:08:34.693829 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.693651 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-installation-pull-secrets\") pod \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") "
Apr 24 19:08:34.693829 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.693755 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls\") pod \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") "
Apr 24 19:08:34.693829 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.693794 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj5m8\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-kube-api-access-gj5m8\") pod \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\" (UID: \"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54\") "
Apr 24 19:08:34.694226 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.694199 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:08:34.694347 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.694204 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:08:34.696118 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.696032 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:08:34.696410 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.696380 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:08:34.696515 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.696413 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-kube-api-access-gj5m8" (OuterVolumeSpecName: "kube-api-access-gj5m8") pod "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54"). InnerVolumeSpecName "kube-api-access-gj5m8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:08:34.696515 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.696485 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:08:34.696590 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.696504 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:08:34.702481 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.702456 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" (UID: "cb5283ff-9431-48cb-8ecc-ff6cc3c65c54"). InnerVolumeSpecName "ca-trust-extracted".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:08:34.794674 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.794634 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-installation-pull-secrets\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:08:34.794674 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.794671 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:08:34.794674 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.794681 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gj5m8\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-kube-api-access-gj5m8\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:08:34.794908 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.794691 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-registry-certificates\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:08:34.794908 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.794701 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-ca-trust-extracted\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:08:34.794908 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.794710 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-bound-sa-token\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:08:34.794908 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.794720 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-image-registry-private-configuration\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:08:34.794908 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:34.794729 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54-trusted-ca\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:08:35.071292 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:35.071199 2568 generic.go:358] "Generic (PLEG): container finished" podID="cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" containerID="951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1" exitCode=0 Apr 24 19:08:35.071292 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:35.071260 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5db89954c9-qmt95" Apr 24 19:08:35.071488 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:35.071283 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5db89954c9-qmt95" event={"ID":"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54","Type":"ContainerDied","Data":"951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1"} Apr 24 19:08:35.071488 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:35.071322 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5db89954c9-qmt95" event={"ID":"cb5283ff-9431-48cb-8ecc-ff6cc3c65c54","Type":"ContainerDied","Data":"77524cc3565181fa93f00c2d0a7fe48ee639ea4d51805085e8a0fab224ba89f4"} Apr 24 19:08:35.071488 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:35.071337 2568 scope.go:117] "RemoveContainer" containerID="951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1" Apr 24 19:08:35.079928 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:35.079907 2568 scope.go:117] "RemoveContainer" containerID="951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1" Apr 24 19:08:35.080212 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:08:35.080190 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1\": container with ID starting with 951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1 not found: ID does not exist" containerID="951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1" Apr 24 19:08:35.080287 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:35.080219 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1"} err="failed to get container status 
\"951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1\": rpc error: code = NotFound desc = could not find container \"951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1\": container with ID starting with 951ee1e9f0266dd6d5c05dde2f5b9d6c88399202418566b1c9c0ced55c0a5ec1 not found: ID does not exist" Apr 24 19:08:35.112760 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:35.112725 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5db89954c9-qmt95"] Apr 24 19:08:35.132744 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:35.132707 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5db89954c9-qmt95"] Apr 24 19:08:35.459165 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:35.459130 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" path="/var/lib/kubelet/pods/cb5283ff-9431-48cb-8ecc-ff6cc3c65c54/volumes" Apr 24 19:08:42.108773 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:42.108741 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:42.112888 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:42.112860 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-58584969c7-cvcpc" Apr 24 19:08:43.098358 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:43.098320 2568 generic.go:358] "Generic (PLEG): container finished" podID="5d21b7cf-8c3d-459b-a502-f049a7353d9f" containerID="32fedfadf6a5107e9ff89e04813dd384af907cef9c715f861039fefc8eacdeab" exitCode=0 Apr 24 19:08:43.098534 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:43.098405 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-bzksz" 
event={"ID":"5d21b7cf-8c3d-459b-a502-f049a7353d9f","Type":"ContainerDied","Data":"32fedfadf6a5107e9ff89e04813dd384af907cef9c715f861039fefc8eacdeab"} Apr 24 19:08:43.098782 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:43.098770 2568 scope.go:117] "RemoveContainer" containerID="32fedfadf6a5107e9ff89e04813dd384af907cef9c715f861039fefc8eacdeab" Apr 24 19:08:44.102917 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:44.102879 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-bzksz" event={"ID":"5d21b7cf-8c3d-459b-a502-f049a7353d9f","Type":"ContainerStarted","Data":"192cccbbac7f2f67dded14899ced7971b380c42e9a930824cd2d07a3a619d36a"} Apr 24 19:08:47.114236 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:47.114203 2568 generic.go:358] "Generic (PLEG): container finished" podID="feca8f75-d766-4f31-aa59-ba4ae692f026" containerID="c9e277f24a851a0e87bfd471c190f4586104c224cdf9f06a7bbd33a84823fbca" exitCode=0 Apr 24 19:08:47.114604 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:47.114277 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" event={"ID":"feca8f75-d766-4f31-aa59-ba4ae692f026","Type":"ContainerDied","Data":"c9e277f24a851a0e87bfd471c190f4586104c224cdf9f06a7bbd33a84823fbca"} Apr 24 19:08:47.114650 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:47.114617 2568 scope.go:117] "RemoveContainer" containerID="c9e277f24a851a0e87bfd471c190f4586104c224cdf9f06a7bbd33a84823fbca" Apr 24 19:08:48.119178 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:48.119139 2568 generic.go:358] "Generic (PLEG): container finished" podID="3a18605e-85a6-4562-acf8-4bef99990528" containerID="b9e2e9c87b758ba054515e3aa5fcef49d128a97b6de9eaff9bc566a7bf7a70c4" exitCode=0 Apr 24 19:08:48.119750 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:48.119213 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" event={"ID":"3a18605e-85a6-4562-acf8-4bef99990528","Type":"ContainerDied","Data":"b9e2e9c87b758ba054515e3aa5fcef49d128a97b6de9eaff9bc566a7bf7a70c4"} Apr 24 19:08:48.119750 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:48.119659 2568 scope.go:117] "RemoveContainer" containerID="b9e2e9c87b758ba054515e3aa5fcef49d128a97b6de9eaff9bc566a7bf7a70c4" Apr 24 19:08:48.120809 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:48.120783 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-49p5c" event={"ID":"feca8f75-d766-4f31-aa59-ba4ae692f026","Type":"ContainerStarted","Data":"1969c640049d1a6cc70614ad1550e139937e83e7f0d06b2a0e86c2927c123a72"} Apr 24 19:08:49.124912 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:08:49.124876 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2njp6" event={"ID":"3a18605e-85a6-4562-acf8-4bef99990528","Type":"ContainerStarted","Data":"1441b7cd85ea5084f50c0d79609972fa92291695393a36a3c10d61ef45d876b7"} Apr 24 19:09:17.522423 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:17.522385 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:09:18.221146 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:18.221068 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="alertmanager" containerID="cri-o://d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2" gracePeriod=120 Apr 24 19:09:18.221146 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:18.221137 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="kube-rbac-proxy-web" containerID="cri-o://33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154" gracePeriod=120 Apr 24 19:09:18.221415 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:18.221137 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="kube-rbac-proxy-metric" containerID="cri-o://a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d" gracePeriod=120 Apr 24 19:09:18.221415 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:18.221166 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="config-reloader" containerID="cri-o://64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d" gracePeriod=120 Apr 24 19:09:18.221415 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:18.221189 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="prom-label-proxy" containerID="cri-o://57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf" gracePeriod=120 Apr 24 19:09:18.221415 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:18.221212 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="kube-rbac-proxy" containerID="cri-o://5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4" gracePeriod=120 Apr 24 19:09:19.227675 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.227641 2568 generic.go:358] "Generic (PLEG): container finished" podID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerID="57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf" 
exitCode=0 Apr 24 19:09:19.227675 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.227667 2568 generic.go:358] "Generic (PLEG): container finished" podID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerID="a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d" exitCode=0 Apr 24 19:09:19.227675 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.227674 2568 generic.go:358] "Generic (PLEG): container finished" podID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerID="5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4" exitCode=0 Apr 24 19:09:19.227675 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.227680 2568 generic.go:358] "Generic (PLEG): container finished" podID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerID="64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d" exitCode=0 Apr 24 19:09:19.227675 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.227684 2568 generic.go:358] "Generic (PLEG): container finished" podID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerID="d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2" exitCode=0 Apr 24 19:09:19.228209 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.227713 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerDied","Data":"57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf"} Apr 24 19:09:19.228209 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.227758 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerDied","Data":"a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d"} Apr 24 19:09:19.228209 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.227775 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerDied","Data":"5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4"} Apr 24 19:09:19.228209 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.227787 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerDied","Data":"64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d"} Apr 24 19:09:19.228209 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.227804 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerDied","Data":"d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2"} Apr 24 19:09:19.470195 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.470170 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:19.593553 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593459 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc790792-524d-4dc1-b6a5-c6d6a75859e7-metrics-client-ca\") pod \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.593553 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593513 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc790792-524d-4dc1-b6a5-c6d6a75859e7-alertmanager-trusted-ca-bundle\") pod \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.593790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593563 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-web-config\") pod \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.593790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593589 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-main-tls\") pod \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.593790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593611 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cc790792-524d-4dc1-b6a5-c6d6a75859e7-config-out\") pod \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.593790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593635 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-config-volume\") pod \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.593790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593656 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cc790792-524d-4dc1-b6a5-c6d6a75859e7-alertmanager-main-db\") pod \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.593790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593688 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6njh\" (UniqueName: \"kubernetes.io/projected/cc790792-524d-4dc1-b6a5-c6d6a75859e7-kube-api-access-g6njh\") pod 
\"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.593790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593719 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy-web\") pod \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.593790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593745 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy\") pod \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.593790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593784 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-cluster-tls-config\") pod \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.594269 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593862 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.594269 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593909 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cc790792-524d-4dc1-b6a5-c6d6a75859e7-tls-assets\") pod 
\"cc790792-524d-4dc1-b6a5-c6d6a75859e7\" (UID: \"cc790792-524d-4dc1-b6a5-c6d6a75859e7\") " Apr 24 19:09:19.594269 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.593936 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc790792-524d-4dc1-b6a5-c6d6a75859e7-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:09:19.594269 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.594046 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc790792-524d-4dc1-b6a5-c6d6a75859e7-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:09:19.594499 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.594325 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc790792-524d-4dc1-b6a5-c6d6a75859e7-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:09:19.594640 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.594617 2568 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc790792-524d-4dc1-b6a5-c6d6a75859e7-metrics-client-ca\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:19.594806 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.594790 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc790792-524d-4dc1-b6a5-c6d6a75859e7-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:19.594904 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.594891 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cc790792-524d-4dc1-b6a5-c6d6a75859e7-alertmanager-main-db\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:19.597062 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.596730 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc790792-524d-4dc1-b6a5-c6d6a75859e7-config-out" (OuterVolumeSpecName: "config-out") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:09:19.597332 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.597304 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc790792-524d-4dc1-b6a5-c6d6a75859e7-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:09:19.597685 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.597651 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-config-volume" (OuterVolumeSpecName: "config-volume") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:19.597780 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.597687 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:19.598268 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.598227 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:19.598578 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.598557 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:19.599070 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.599050 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc790792-524d-4dc1-b6a5-c6d6a75859e7-kube-api-access-g6njh" (OuterVolumeSpecName: "kube-api-access-g6njh") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). InnerVolumeSpecName "kube-api-access-g6njh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:09:19.599557 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.599541 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:19.602334 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.602303 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:19.608849 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.608824 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-web-config" (OuterVolumeSpecName: "web-config") pod "cc790792-524d-4dc1-b6a5-c6d6a75859e7" (UID: "cc790792-524d-4dc1-b6a5-c6d6a75859e7"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:19.695399 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.695347 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-web-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:19.695399 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.695393 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-main-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:19.695399 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.695404 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cc790792-524d-4dc1-b6a5-c6d6a75859e7-config-out\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:19.695399 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.695423 2568 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-config-volume\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:19.695692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.695432 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g6njh\" (UniqueName: \"kubernetes.io/projected/cc790792-524d-4dc1-b6a5-c6d6a75859e7-kube-api-access-g6njh\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:19.695692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.695441 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:19.695692 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.695452 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:19.695692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.695461 2568 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-cluster-tls-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:19.695692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.695470 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cc790792-524d-4dc1-b6a5-c6d6a75859e7-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:19.695692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:19.695479 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cc790792-524d-4dc1-b6a5-c6d6a75859e7-tls-assets\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:09:20.233354 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.233310 2568 generic.go:358] "Generic (PLEG): container finished" podID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerID="33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154" exitCode=0 Apr 24 19:09:20.233772 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.233387 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerDied","Data":"33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154"} Apr 24 19:09:20.233772 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.233430 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cc790792-524d-4dc1-b6a5-c6d6a75859e7","Type":"ContainerDied","Data":"91994eaae0566a41450f53c4e30087c722f707b1950ffca73eb283f30843f029"} Apr 24 19:09:20.233772 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.233434 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.233772 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.233448 2568 scope.go:117] "RemoveContainer" containerID="57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf" Apr 24 19:09:20.241277 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.241195 2568 scope.go:117] "RemoveContainer" containerID="a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d" Apr 24 19:09:20.248471 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.248453 2568 scope.go:117] "RemoveContainer" containerID="5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4" Apr 24 19:09:20.255151 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.255130 2568 scope.go:117] "RemoveContainer" containerID="33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154" Apr 24 19:09:20.257870 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.257848 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:09:20.262779 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.262751 2568 scope.go:117] "RemoveContainer" containerID="64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d" Apr 24 19:09:20.263895 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.263872 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:09:20.270002 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.269983 2568 scope.go:117] "RemoveContainer" 
containerID="d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2" Apr 24 19:09:20.280938 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.280916 2568 scope.go:117] "RemoveContainer" containerID="7e360d74cb3184fd0098f73ec08040bbbda2b5d01a97ab1a1b69b3938072801d" Apr 24 19:09:20.288414 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.288246 2568 scope.go:117] "RemoveContainer" containerID="57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf" Apr 24 19:09:20.288552 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:09:20.288529 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf\": container with ID starting with 57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf not found: ID does not exist" containerID="57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf" Apr 24 19:09:20.288595 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.288562 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf"} err="failed to get container status \"57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf\": rpc error: code = NotFound desc = could not find container \"57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf\": container with ID starting with 57dbb265eb3d388f4d4f662ba63f500a1442caece05562e42210269a41ecbaaf not found: ID does not exist" Apr 24 19:09:20.288595 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.288583 2568 scope.go:117] "RemoveContainer" containerID="a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d" Apr 24 19:09:20.288854 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:09:20.288835 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d\": container with ID starting with a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d not found: ID does not exist" containerID="a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d" Apr 24 19:09:20.288912 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.288867 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d"} err="failed to get container status \"a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d\": rpc error: code = NotFound desc = could not find container \"a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d\": container with ID starting with a3a9a5f52b98eca7966b37263672cee412eae12da5ee122bb8d909248314908d not found: ID does not exist" Apr 24 19:09:20.288912 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.288891 2568 scope.go:117] "RemoveContainer" containerID="5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4" Apr 24 19:09:20.289208 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:09:20.289185 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4\": container with ID starting with 5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4 not found: ID does not exist" containerID="5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4" Apr 24 19:09:20.289264 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.289218 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4"} err="failed to get container status \"5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4\": rpc error: code = NotFound desc = could not find container 
\"5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4\": container with ID starting with 5e44890aa7d8855680fd8a858ff5663ac4fc083a8edf92e9030f3ee380c5c3c4 not found: ID does not exist" Apr 24 19:09:20.289264 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.289241 2568 scope.go:117] "RemoveContainer" containerID="33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154" Apr 24 19:09:20.289492 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:09:20.289472 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154\": container with ID starting with 33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154 not found: ID does not exist" containerID="33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154" Apr 24 19:09:20.289532 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.289497 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154"} err="failed to get container status \"33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154\": rpc error: code = NotFound desc = could not find container \"33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154\": container with ID starting with 33125c454d993b11be3a5405770ceb39f49be2f6beb82358ac96c4e1f9651154 not found: ID does not exist" Apr 24 19:09:20.289532 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.289513 2568 scope.go:117] "RemoveContainer" containerID="64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d" Apr 24 19:09:20.289747 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:09:20.289730 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d\": container with ID starting with 
64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d not found: ID does not exist" containerID="64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d" Apr 24 19:09:20.289793 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.289751 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d"} err="failed to get container status \"64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d\": rpc error: code = NotFound desc = could not find container \"64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d\": container with ID starting with 64a2f821fe4e186815b19861f97d7bf839ea76849ec9b6bf19f0a04a197c479d not found: ID does not exist" Apr 24 19:09:20.289793 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.289766 2568 scope.go:117] "RemoveContainer" containerID="d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2" Apr 24 19:09:20.290000 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:09:20.289984 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2\": container with ID starting with d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2 not found: ID does not exist" containerID="d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2" Apr 24 19:09:20.290066 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.290008 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2"} err="failed to get container status \"d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2\": rpc error: code = NotFound desc = could not find container \"d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2\": container with ID starting with 
d62687e369890e89169178d047b84fdf51effcc77a433ca89bdfbcec41471ed2 not found: ID does not exist" Apr 24 19:09:20.290066 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.290029 2568 scope.go:117] "RemoveContainer" containerID="7e360d74cb3184fd0098f73ec08040bbbda2b5d01a97ab1a1b69b3938072801d" Apr 24 19:09:20.290270 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:09:20.290250 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e360d74cb3184fd0098f73ec08040bbbda2b5d01a97ab1a1b69b3938072801d\": container with ID starting with 7e360d74cb3184fd0098f73ec08040bbbda2b5d01a97ab1a1b69b3938072801d not found: ID does not exist" containerID="7e360d74cb3184fd0098f73ec08040bbbda2b5d01a97ab1a1b69b3938072801d" Apr 24 19:09:20.290328 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.290276 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e360d74cb3184fd0098f73ec08040bbbda2b5d01a97ab1a1b69b3938072801d"} err="failed to get container status \"7e360d74cb3184fd0098f73ec08040bbbda2b5d01a97ab1a1b69b3938072801d\": rpc error: code = NotFound desc = could not find container \"7e360d74cb3184fd0098f73ec08040bbbda2b5d01a97ab1a1b69b3938072801d\": container with ID starting with 7e360d74cb3184fd0098f73ec08040bbbda2b5d01a97ab1a1b69b3938072801d not found: ID does not exist" Apr 24 19:09:20.298592 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.298565 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:09:20.298935 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.298919 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="config-reloader" Apr 24 19:09:20.299009 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.298939 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" 
containerName="config-reloader" Apr 24 19:09:20.299009 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.298952 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="alertmanager" Apr 24 19:09:20.299009 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.298961 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="alertmanager" Apr 24 19:09:20.299009 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.298983 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="kube-rbac-proxy" Apr 24 19:09:20.299009 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.298991 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="kube-rbac-proxy" Apr 24 19:09:20.299009 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299001 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="kube-rbac-proxy-web" Apr 24 19:09:20.299009 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299009 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="kube-rbac-proxy-web" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299019 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="prom-label-proxy" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299026 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="prom-label-proxy" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299041 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="kube-rbac-proxy-metric" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299049 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="kube-rbac-proxy-metric" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299074 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" containerName="registry" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299082 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" containerName="registry" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299091 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="init-config-reloader" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299115 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="init-config-reloader" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299206 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="kube-rbac-proxy-web" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299219 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb5283ff-9431-48cb-8ecc-ff6cc3c65c54" containerName="registry" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299230 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="config-reloader" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299243 2568 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="kube-rbac-proxy-metric" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299253 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="prom-label-proxy" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299263 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="alertmanager" Apr 24 19:09:20.299453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.299271 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" containerName="kube-rbac-proxy" Apr 24 19:09:20.306277 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.306251 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.309182 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.309150 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 19:09:20.309319 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.309223 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 19:09:20.309319 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.309290 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 19:09:20.309426 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.309333 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-zrwr2\"" Apr 24 19:09:20.309426 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.309328 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 19:09:20.309668 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.309651 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 19:09:20.309756 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.309681 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 19:09:20.309756 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.309686 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 19:09:20.309756 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.309703 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 19:09:20.317423 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.317400 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 19:09:20.318915 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.318891 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:09:20.402049 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.402014 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-config-volume\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.402049 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.402057 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-config-out\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.402297 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.402078 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.402297 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.402135 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.402297 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.402177 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.402297 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.402197 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.402297 ip-10-0-138-52 kubenswrapper[2568]: 
I0424 19:09:20.402235 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.402297 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.402251 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.402297 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.402268 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-web-config\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.402297 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.402286 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.402530 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.402333 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-metrics-client-ca\") 
pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.402530 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.402366 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zltjz\" (UniqueName: \"kubernetes.io/projected/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-kube-api-access-zltjz\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.402530 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.402415 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.503637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.503534 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.503637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.503592 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-config-volume\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.503637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.503617 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-config-out\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.503637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.503635 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.503979 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.503657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.503979 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.503678 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.503979 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.503697 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.503979 ip-10-0-138-52 kubenswrapper[2568]: 
I0424 19:09:20.503724 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.503979 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.503751 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.505169 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.504645 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-web-config\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.505169 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.504672 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.505169 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.504698 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.505169 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.504709 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.505169 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.504742 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.505169 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.504772 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zltjz\" (UniqueName: \"kubernetes.io/projected/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-kube-api-access-zltjz\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.505753 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.505725 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.507000 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.506861 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.507000 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.506913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-config-out\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.507000 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.506932 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.507521 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.507500 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.507607 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.507539 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.507807 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.507789 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-web-config\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.507869 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.507858 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-config-volume\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.508144 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.508125 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.508793 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.508778 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.513788 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.513767 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zltjz\" (UniqueName: \"kubernetes.io/projected/fd86bc9c-68eb-4f5a-a4a6-c34d485682b3-kube-api-access-zltjz\") pod \"alertmanager-main-0\" (UID: \"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.617643 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.617607 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:20.773195 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:20.773072 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:09:20.775929 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:09:20.775900 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd86bc9c_68eb_4f5a_a4a6_c34d485682b3.slice/crio-38b8990670d2d08b8ea680180bd1c981f62384447140c8de24d15966f23cf65a WatchSource:0}: Error finding container 38b8990670d2d08b8ea680180bd1c981f62384447140c8de24d15966f23cf65a: Status 404 returned error can't find the container with id 38b8990670d2d08b8ea680180bd1c981f62384447140c8de24d15966f23cf65a Apr 24 19:09:21.239462 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.239423 2568 generic.go:358] "Generic (PLEG): container finished" podID="fd86bc9c-68eb-4f5a-a4a6-c34d485682b3" containerID="6f35a412293d7a9d6ace40ffc8ac251b61d7f5b4ed5321e1123ea7573c0b6ca6" exitCode=0 Apr 24 19:09:21.239838 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.239505 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3","Type":"ContainerDied","Data":"6f35a412293d7a9d6ace40ffc8ac251b61d7f5b4ed5321e1123ea7573c0b6ca6"} Apr 24 19:09:21.239838 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.239543 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3","Type":"ContainerStarted","Data":"38b8990670d2d08b8ea680180bd1c981f62384447140c8de24d15966f23cf65a"} Apr 24 19:09:21.458748 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.458716 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc790792-524d-4dc1-b6a5-c6d6a75859e7" 
path="/var/lib/kubelet/pods/cc790792-524d-4dc1-b6a5-c6d6a75859e7/volumes" Apr 24 19:09:21.658118 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.658070 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-64769d95dd-hxrcw"] Apr 24 19:09:21.661647 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.661605 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.664349 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.664315 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 24 19:09:21.664782 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.664762 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 24 19:09:21.664782 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.664775 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 24 19:09:21.665011 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.664772 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-mlzm2\"" Apr 24 19:09:21.665011 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.664774 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 24 19:09:21.665011 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.664772 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 24 19:09:21.671735 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.671523 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 24 19:09:21.685035 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.685009 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-64769d95dd-hxrcw"] Apr 24 19:09:21.816338 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.816235 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79eea6fb-9659-4b97-86f9-704b91e40d4b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.816338 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.816274 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/79eea6fb-9659-4b97-86f9-704b91e40d4b-telemeter-client-tls\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.816338 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.816292 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjgjd\" (UniqueName: \"kubernetes.io/projected/79eea6fb-9659-4b97-86f9-704b91e40d4b-kube-api-access-tjgjd\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.816338 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.816313 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79eea6fb-9659-4b97-86f9-704b91e40d4b-serving-certs-ca-bundle\") pod 
\"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.816637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.816398 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/79eea6fb-9659-4b97-86f9-704b91e40d4b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.816637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.816437 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79eea6fb-9659-4b97-86f9-704b91e40d4b-metrics-client-ca\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.816637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.816459 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/79eea6fb-9659-4b97-86f9-704b91e40d4b-federate-client-tls\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.816637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.816489 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/79eea6fb-9659-4b97-86f9-704b91e40d4b-secret-telemeter-client\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " 
pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.917921 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.917882 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79eea6fb-9659-4b97-86f9-704b91e40d4b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.917921 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.917923 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/79eea6fb-9659-4b97-86f9-704b91e40d4b-telemeter-client-tls\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.918171 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.917947 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjgjd\" (UniqueName: \"kubernetes.io/projected/79eea6fb-9659-4b97-86f9-704b91e40d4b-kube-api-access-tjgjd\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.918171 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.917980 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79eea6fb-9659-4b97-86f9-704b91e40d4b-serving-certs-ca-bundle\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.918171 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.918014 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/79eea6fb-9659-4b97-86f9-704b91e40d4b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.918171 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.918041 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79eea6fb-9659-4b97-86f9-704b91e40d4b-metrics-client-ca\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.918171 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.918074 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/79eea6fb-9659-4b97-86f9-704b91e40d4b-federate-client-tls\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.918171 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.918131 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/79eea6fb-9659-4b97-86f9-704b91e40d4b-secret-telemeter-client\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.918962 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.918931 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79eea6fb-9659-4b97-86f9-704b91e40d4b-serving-certs-ca-bundle\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: 
\"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.919093 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.918931 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79eea6fb-9659-4b97-86f9-704b91e40d4b-metrics-client-ca\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.919908 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.919883 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79eea6fb-9659-4b97-86f9-704b91e40d4b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.920940 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.920913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/79eea6fb-9659-4b97-86f9-704b91e40d4b-secret-telemeter-client\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.921037 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.920964 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/79eea6fb-9659-4b97-86f9-704b91e40d4b-federate-client-tls\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.921037 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.920991 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/79eea6fb-9659-4b97-86f9-704b91e40d4b-telemeter-client-tls\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.921037 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.921023 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/79eea6fb-9659-4b97-86f9-704b91e40d4b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.926411 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.926392 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjgjd\" (UniqueName: \"kubernetes.io/projected/79eea6fb-9659-4b97-86f9-704b91e40d4b-kube-api-access-tjgjd\") pod \"telemeter-client-64769d95dd-hxrcw\" (UID: \"79eea6fb-9659-4b97-86f9-704b91e40d4b\") " pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:21.978440 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:21.978402 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" Apr 24 19:09:22.108581 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:22.108547 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-64769d95dd-hxrcw"] Apr 24 19:09:22.111596 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:09:22.111567 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79eea6fb_9659_4b97_86f9_704b91e40d4b.slice/crio-5097804ebb1454167937e37e18b8d2b99c9e307cdf277d828954142bc690c542 WatchSource:0}: Error finding container 5097804ebb1454167937e37e18b8d2b99c9e307cdf277d828954142bc690c542: Status 404 returned error can't find the container with id 5097804ebb1454167937e37e18b8d2b99c9e307cdf277d828954142bc690c542 Apr 24 19:09:22.245284 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:22.245246 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" event={"ID":"79eea6fb-9659-4b97-86f9-704b91e40d4b","Type":"ContainerStarted","Data":"5097804ebb1454167937e37e18b8d2b99c9e307cdf277d828954142bc690c542"} Apr 24 19:09:22.248023 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:22.247989 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3","Type":"ContainerStarted","Data":"8dd1ed8d2207b7f92ad427ea743cb4ae7fd653b1935c6c4ad0e45c90718466e2"} Apr 24 19:09:22.248023 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:22.248020 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3","Type":"ContainerStarted","Data":"28b155f4ac5ac41e3fcbb49da373f51709bdb27ccbf2d50bb05b82efe4811df8"} Apr 24 19:09:22.248210 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:22.248029 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3","Type":"ContainerStarted","Data":"dc9a6235bc4e8c427e89799de974848bcb9c1067194fd0ea066e6a7a54c5f242"} Apr 24 19:09:22.248210 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:22.248038 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3","Type":"ContainerStarted","Data":"1b51ff04d8eac45425ef29e5f0a64665488edc17152aada627681c1cb5ef95a0"} Apr 24 19:09:22.248210 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:22.248046 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3","Type":"ContainerStarted","Data":"bd32afddd24ebec6c9be77ec8786ce934f6470fe03f77cca9ffa227409ea5e9a"} Apr 24 19:09:22.248210 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:22.248054 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd86bc9c-68eb-4f5a-a4a6-c34d485682b3","Type":"ContainerStarted","Data":"8a175e721b32d28426d0cf81c36fcf4534a569caa4fd2f585ffa8d1dedacf888"} Apr 24 19:09:22.286819 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:22.286765 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.286749549 podStartE2EDuration="2.286749549s" podCreationTimestamp="2026-04-24 19:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:09:22.284889205 +0000 UTC m=+165.396725563" watchObservedRunningTime="2026-04-24 19:09:22.286749549 +0000 UTC m=+165.398585894" Apr 24 19:09:24.258674 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:24.258573 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" 
event={"ID":"79eea6fb-9659-4b97-86f9-704b91e40d4b","Type":"ContainerStarted","Data":"ecdaca759dccc4b013eb3fb080a7c45195d523ad3290d58863d9d789d9d211e5"} Apr 24 19:09:24.258674 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:24.258623 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" event={"ID":"79eea6fb-9659-4b97-86f9-704b91e40d4b","Type":"ContainerStarted","Data":"db8a64ea3ab1274b5a9704172ac5644e0781065e8e1306fab55ca30f7f65657a"} Apr 24 19:09:24.258674 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:24.258634 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" event={"ID":"79eea6fb-9659-4b97-86f9-704b91e40d4b","Type":"ContainerStarted","Data":"5a00b8c8f506c9d2dd189f61225e09938c6e65acd408887820c83ddcaa5995a4"} Apr 24 19:09:24.290134 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:24.288336 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-64769d95dd-hxrcw" podStartSLOduration=1.427336843 podStartE2EDuration="3.288318023s" podCreationTimestamp="2026-04-24 19:09:21 +0000 UTC" firstStartedPulling="2026-04-24 19:09:22.113543148 +0000 UTC m=+165.225379472" lastFinishedPulling="2026-04-24 19:09:23.974524324 +0000 UTC m=+167.086360652" observedRunningTime="2026-04-24 19:09:24.284485683 +0000 UTC m=+167.396322028" watchObservedRunningTime="2026-04-24 19:09:24.288318023 +0000 UTC m=+167.400154370" Apr 24 19:09:24.902254 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:24.902218 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-df5cf5cbb-dd929"] Apr 24 19:09:24.905946 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:24.905919 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-df5cf5cbb-dd929" Apr 24 19:09:24.918434 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:24.918408 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-df5cf5cbb-dd929"] Apr 24 19:09:25.053202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.053165 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-trusted-ca-bundle\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929" Apr 24 19:09:25.053400 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.053210 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-console-config\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929" Apr 24 19:09:25.053400 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.053260 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/228878a6-9dec-4a0a-b478-d07089d724fa-console-oauth-config\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929" Apr 24 19:09:25.053400 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.053278 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-oauth-serving-cert\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929" Apr 24 19:09:25.053400 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.053301 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/228878a6-9dec-4a0a-b478-d07089d724fa-console-serving-cert\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929" Apr 24 19:09:25.053584 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.053438 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-service-ca\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929" Apr 24 19:09:25.053584 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.053477 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47zz9\" (UniqueName: \"kubernetes.io/projected/228878a6-9dec-4a0a-b478-d07089d724fa-kube-api-access-47zz9\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929" Apr 24 19:09:25.154686 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.154578 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-trusted-ca-bundle\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929" Apr 24 19:09:25.154686 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.154631 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-console-config\") pod \"console-df5cf5cbb-dd929\" (UID: 
\"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.154686 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.154664 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/228878a6-9dec-4a0a-b478-d07089d724fa-console-oauth-config\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.154686 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.154688 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-oauth-serving-cert\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.155028 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.154719 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/228878a6-9dec-4a0a-b478-d07089d724fa-console-serving-cert\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.155028 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.154792 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-service-ca\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.155028 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.154816 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47zz9\" (UniqueName: \"kubernetes.io/projected/228878a6-9dec-4a0a-b478-d07089d724fa-kube-api-access-47zz9\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.155499 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.155471 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-console-config\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.155642 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.155618 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-oauth-serving-cert\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.155732 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.155652 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-service-ca\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.155857 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.155837 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-trusted-ca-bundle\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.157416 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.157386 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/228878a6-9dec-4a0a-b478-d07089d724fa-console-oauth-config\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.157523 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.157485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/228878a6-9dec-4a0a-b478-d07089d724fa-console-serving-cert\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.164370 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.164344 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47zz9\" (UniqueName: \"kubernetes.io/projected/228878a6-9dec-4a0a-b478-d07089d724fa-kube-api-access-47zz9\") pod \"console-df5cf5cbb-dd929\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") " pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.215355 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.215310 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:25.345267 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:25.345237 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-df5cf5cbb-dd929"]
Apr 24 19:09:25.347618 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:09:25.347590 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod228878a6_9dec_4a0a_b478_d07089d724fa.slice/crio-96b0faa45f5581d086bbc460d498f5219596e4448f8cf9a750dae1ed0605ca29 WatchSource:0}: Error finding container 96b0faa45f5581d086bbc460d498f5219596e4448f8cf9a750dae1ed0605ca29: Status 404 returned error can't find the container with id 96b0faa45f5581d086bbc460d498f5219596e4448f8cf9a750dae1ed0605ca29
Apr 24 19:09:26.269902 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:26.269863 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df5cf5cbb-dd929" event={"ID":"228878a6-9dec-4a0a-b478-d07089d724fa","Type":"ContainerStarted","Data":"0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd"}
Apr 24 19:09:26.270083 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:26.269910 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df5cf5cbb-dd929" event={"ID":"228878a6-9dec-4a0a-b478-d07089d724fa","Type":"ContainerStarted","Data":"96b0faa45f5581d086bbc460d498f5219596e4448f8cf9a750dae1ed0605ca29"}
Apr 24 19:09:26.295173 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:26.295095 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-df5cf5cbb-dd929" podStartSLOduration=2.29508147 podStartE2EDuration="2.29508147s" podCreationTimestamp="2026-04-24 19:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:09:26.294394972 +0000 UTC m=+169.406231319" watchObservedRunningTime="2026-04-24 19:09:26.29508147 +0000 UTC m=+169.406917816"
Apr 24 19:09:35.215811 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:35.215768 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:35.216429 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:35.215974 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:35.220906 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:35.220884 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:35.306980 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:35.306952 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:09:35.375082 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:09:35.375053 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d98589767-gkz9h"]
Apr 24 19:10:00.396458 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.396417 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-d98589767-gkz9h" podUID="aecc3819-8e2a-431a-a09b-fd1b63a32306" containerName="console" containerID="cri-o://a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754" gracePeriod=15
Apr 24 19:10:00.644997 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.644974 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d98589767-gkz9h_aecc3819-8e2a-431a-a09b-fd1b63a32306/console/0.log"
Apr 24 19:10:00.645137 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.645036 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:10:00.782442 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.782353 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-config\") pod \"aecc3819-8e2a-431a-a09b-fd1b63a32306\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") "
Apr 24 19:10:00.782442 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.782417 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-service-ca\") pod \"aecc3819-8e2a-431a-a09b-fd1b63a32306\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") "
Apr 24 19:10:00.782658 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.782445 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-oauth-config\") pod \"aecc3819-8e2a-431a-a09b-fd1b63a32306\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") "
Apr 24 19:10:00.782658 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.782491 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-oauth-serving-cert\") pod \"aecc3819-8e2a-431a-a09b-fd1b63a32306\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") "
Apr 24 19:10:00.782658 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.782514 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-serving-cert\") pod \"aecc3819-8e2a-431a-a09b-fd1b63a32306\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") "
Apr 24 19:10:00.782658 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.782538 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-trusted-ca-bundle\") pod \"aecc3819-8e2a-431a-a09b-fd1b63a32306\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") "
Apr 24 19:10:00.782658 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.782560 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkcln\" (UniqueName: \"kubernetes.io/projected/aecc3819-8e2a-431a-a09b-fd1b63a32306-kube-api-access-bkcln\") pod \"aecc3819-8e2a-431a-a09b-fd1b63a32306\" (UID: \"aecc3819-8e2a-431a-a09b-fd1b63a32306\") "
Apr 24 19:10:00.782910 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.782743 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-config" (OuterVolumeSpecName: "console-config") pod "aecc3819-8e2a-431a-a09b-fd1b63a32306" (UID: "aecc3819-8e2a-431a-a09b-fd1b63a32306"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:10:00.782910 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.782889 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\""
Apr 24 19:10:00.783013 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.782952 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aecc3819-8e2a-431a-a09b-fd1b63a32306" (UID: "aecc3819-8e2a-431a-a09b-fd1b63a32306"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:10:00.783127 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.782995 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "aecc3819-8e2a-431a-a09b-fd1b63a32306" (UID: "aecc3819-8e2a-431a-a09b-fd1b63a32306"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:10:00.783275 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.783254 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-service-ca" (OuterVolumeSpecName: "service-ca") pod "aecc3819-8e2a-431a-a09b-fd1b63a32306" (UID: "aecc3819-8e2a-431a-a09b-fd1b63a32306"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:10:00.784950 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.784916 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aecc3819-8e2a-431a-a09b-fd1b63a32306" (UID: "aecc3819-8e2a-431a-a09b-fd1b63a32306"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:10:00.784950 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.784931 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aecc3819-8e2a-431a-a09b-fd1b63a32306-kube-api-access-bkcln" (OuterVolumeSpecName: "kube-api-access-bkcln") pod "aecc3819-8e2a-431a-a09b-fd1b63a32306" (UID: "aecc3819-8e2a-431a-a09b-fd1b63a32306"). InnerVolumeSpecName "kube-api-access-bkcln". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:10:00.785143 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.785022 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aecc3819-8e2a-431a-a09b-fd1b63a32306" (UID: "aecc3819-8e2a-431a-a09b-fd1b63a32306"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:10:00.883510 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.883473 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-serving-cert\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\""
Apr 24 19:10:00.883510 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.883506 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-trusted-ca-bundle\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\""
Apr 24 19:10:00.883510 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.883517 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bkcln\" (UniqueName: \"kubernetes.io/projected/aecc3819-8e2a-431a-a09b-fd1b63a32306-kube-api-access-bkcln\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\""
Apr 24 19:10:00.883752 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.883527 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-service-ca\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\""
Apr 24 19:10:00.883752 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.883537 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aecc3819-8e2a-431a-a09b-fd1b63a32306-console-oauth-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\""
Apr 24 19:10:00.883752 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:00.883545 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aecc3819-8e2a-431a-a09b-fd1b63a32306-oauth-serving-cert\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\""
Apr 24 19:10:01.393341 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:01.393311 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d98589767-gkz9h_aecc3819-8e2a-431a-a09b-fd1b63a32306/console/0.log"
Apr 24 19:10:01.393522 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:01.393350 2568 generic.go:358] "Generic (PLEG): container finished" podID="aecc3819-8e2a-431a-a09b-fd1b63a32306" containerID="a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754" exitCode=2
Apr 24 19:10:01.393522 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:01.393399 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d98589767-gkz9h" event={"ID":"aecc3819-8e2a-431a-a09b-fd1b63a32306","Type":"ContainerDied","Data":"a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754"}
Apr 24 19:10:01.393522 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:01.393415 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d98589767-gkz9h"
Apr 24 19:10:01.393522 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:01.393430 2568 scope.go:117] "RemoveContainer" containerID="a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754"
Apr 24 19:10:01.393522 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:01.393420 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d98589767-gkz9h" event={"ID":"aecc3819-8e2a-431a-a09b-fd1b63a32306","Type":"ContainerDied","Data":"0eff9de8a2336152150a048e48fc0e55a8a0b818203ab4aee7d75f6c8d35be5f"}
Apr 24 19:10:01.401938 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:01.401830 2568 scope.go:117] "RemoveContainer" containerID="a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754"
Apr 24 19:10:01.402199 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:10:01.402115 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754\": container with ID starting with a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754 not found: ID does not exist" containerID="a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754"
Apr 24 19:10:01.402199 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:01.402143 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754"} err="failed to get container status \"a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754\": rpc error: code = NotFound desc = could not find container \"a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754\": container with ID starting with a199b097d68963d2b72dd854a7cf3a17a3232739e1dfdfb53e13b04613967754 not found: ID does not exist"
Apr 24 19:10:01.415519 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:01.415490 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d98589767-gkz9h"]
Apr 24 19:10:01.419256 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:01.419223 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d98589767-gkz9h"]
Apr 24 19:10:01.459594 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:01.459563 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aecc3819-8e2a-431a-a09b-fd1b63a32306" path="/var/lib/kubelet/pods/aecc3819-8e2a-431a-a09b-fd1b63a32306/volumes"
Apr 24 19:10:46.973604 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:46.973570 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64575dd578-ptvc5"]
Apr 24 19:10:46.974161 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:46.973927 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aecc3819-8e2a-431a-a09b-fd1b63a32306" containerName="console"
Apr 24 19:10:46.974161 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:46.973939 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecc3819-8e2a-431a-a09b-fd1b63a32306" containerName="console"
Apr 24 19:10:46.974161 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:46.974020 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="aecc3819-8e2a-431a-a09b-fd1b63a32306" containerName="console"
Apr 24 19:10:46.978357 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:46.978335 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:46.986929 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:46.986903 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64575dd578-ptvc5"]
Apr 24 19:10:47.070407 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.070367 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-oauth-config\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.070407 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.070408 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-serving-cert\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.070647 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.070513 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-service-ca\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.070647 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.070546 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltpvh\" (UniqueName: \"kubernetes.io/projected/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-kube-api-access-ltpvh\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.070647 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.070574 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-config\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.070647 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.070608 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-trusted-ca-bundle\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.070783 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.070650 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-oauth-serving-cert\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.171576 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.171524 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-trusted-ca-bundle\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.171769 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.171586 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-oauth-serving-cert\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.171769 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.171626 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-oauth-config\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.171769 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.171647 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-serving-cert\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.171933 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.171856 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-service-ca\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.171933 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.171874 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltpvh\" (UniqueName: \"kubernetes.io/projected/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-kube-api-access-ltpvh\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.171933 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.171891 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-config\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.172366 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.172339 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-oauth-serving-cert\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.172504 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.172472 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-trusted-ca-bundle\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.172637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.172539 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-service-ca\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.172637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.172605 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-config\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.174679 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.174660 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-serving-cert\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.174774 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.174690 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-oauth-config\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.181224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.181196 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltpvh\" (UniqueName: \"kubernetes.io/projected/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-kube-api-access-ltpvh\") pod \"console-64575dd578-ptvc5\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.289353 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.289265 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:47.418213 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.418183 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64575dd578-ptvc5"]
Apr 24 19:10:47.420603 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:10:47.420569 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b5271b5_6b8d_4534_bd7b_cc21e9a9b1b7.slice/crio-41912d178e74739f1eecd2bec449b40e6d9f0163dae445705545f0fe6de1bb30 WatchSource:0}: Error finding container 41912d178e74739f1eecd2bec449b40e6d9f0163dae445705545f0fe6de1bb30: Status 404 returned error can't find the container with id 41912d178e74739f1eecd2bec449b40e6d9f0163dae445705545f0fe6de1bb30
Apr 24 19:10:47.551760 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.551664 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64575dd578-ptvc5" event={"ID":"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7","Type":"ContainerStarted","Data":"3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5"}
Apr 24 19:10:47.551760 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.551712 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64575dd578-ptvc5" event={"ID":"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7","Type":"ContainerStarted","Data":"41912d178e74739f1eecd2bec449b40e6d9f0163dae445705545f0fe6de1bb30"}
Apr 24 19:10:47.570705 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:47.570655 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64575dd578-ptvc5" podStartSLOduration=1.570638293 podStartE2EDuration="1.570638293s" podCreationTimestamp="2026-04-24 19:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:10:47.570321607 +0000 UTC m=+250.682157977" watchObservedRunningTime="2026-04-24 19:10:47.570638293 +0000 UTC m=+250.682474639"
Apr 24 19:10:57.289910 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:57.289816 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:57.289910 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:57.289874 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:57.294746 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:57.294710 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:57.588458 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:57.588379 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64575dd578-ptvc5"
Apr 24 19:10:57.638216 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:10:57.638183 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-df5cf5cbb-dd929"]
Apr 24 19:11:22.660344 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.660301 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-df5cf5cbb-dd929" podUID="228878a6-9dec-4a0a-b478-d07089d724fa" containerName="console" containerID="cri-o://0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd" gracePeriod=15
Apr 24 19:11:22.894687 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.894661 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-df5cf5cbb-dd929_228878a6-9dec-4a0a-b478-d07089d724fa/console/0.log"
Apr 24 19:11:22.894822 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.894724 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-df5cf5cbb-dd929"
Apr 24 19:11:22.986623 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.986520 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-oauth-serving-cert\") pod \"228878a6-9dec-4a0a-b478-d07089d724fa\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") "
Apr 24 19:11:22.986623 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.986613 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/228878a6-9dec-4a0a-b478-d07089d724fa-console-oauth-config\") pod \"228878a6-9dec-4a0a-b478-d07089d724fa\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") "
Apr 24 19:11:22.986866 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.986638 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47zz9\" (UniqueName: \"kubernetes.io/projected/228878a6-9dec-4a0a-b478-d07089d724fa-kube-api-access-47zz9\") pod \"228878a6-9dec-4a0a-b478-d07089d724fa\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") "
Apr 24 19:11:22.986866 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.986685 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-trusted-ca-bundle\") pod \"228878a6-9dec-4a0a-b478-d07089d724fa\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") "
Apr 24 19:11:22.986866 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.986712 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/228878a6-9dec-4a0a-b478-d07089d724fa-console-serving-cert\") pod \"228878a6-9dec-4a0a-b478-d07089d724fa\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") "
Apr 24 19:11:22.986866 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.986739 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-service-ca\") pod \"228878a6-9dec-4a0a-b478-d07089d724fa\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") "
Apr 24 19:11:22.986866 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.986779 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-console-config\") pod \"228878a6-9dec-4a0a-b478-d07089d724fa\" (UID: \"228878a6-9dec-4a0a-b478-d07089d724fa\") "
Apr 24 19:11:22.987226 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.987036 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "228878a6-9dec-4a0a-b478-d07089d724fa" (UID: "228878a6-9dec-4a0a-b478-d07089d724fa"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:11:22.987226 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.987191 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "228878a6-9dec-4a0a-b478-d07089d724fa" (UID: "228878a6-9dec-4a0a-b478-d07089d724fa"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:11:22.987324 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.987216 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-service-ca" (OuterVolumeSpecName: "service-ca") pod "228878a6-9dec-4a0a-b478-d07089d724fa" (UID: "228878a6-9dec-4a0a-b478-d07089d724fa"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:11:22.987435 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.987402 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-console-config" (OuterVolumeSpecName: "console-config") pod "228878a6-9dec-4a0a-b478-d07089d724fa" (UID: "228878a6-9dec-4a0a-b478-d07089d724fa"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:11:22.988955 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.988930 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228878a6-9dec-4a0a-b478-d07089d724fa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "228878a6-9dec-4a0a-b478-d07089d724fa" (UID: "228878a6-9dec-4a0a-b478-d07089d724fa"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:11:22.989047 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.988982 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228878a6-9dec-4a0a-b478-d07089d724fa-kube-api-access-47zz9" (OuterVolumeSpecName: "kube-api-access-47zz9") pod "228878a6-9dec-4a0a-b478-d07089d724fa" (UID: "228878a6-9dec-4a0a-b478-d07089d724fa"). InnerVolumeSpecName "kube-api-access-47zz9".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:11:22.989047 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:22.988981 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228878a6-9dec-4a0a-b478-d07089d724fa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "228878a6-9dec-4a0a-b478-d07089d724fa" (UID: "228878a6-9dec-4a0a-b478-d07089d724fa"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:23.087645 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.087591 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/228878a6-9dec-4a0a-b478-d07089d724fa-console-oauth-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:11:23.087645 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.087638 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47zz9\" (UniqueName: \"kubernetes.io/projected/228878a6-9dec-4a0a-b478-d07089d724fa-kube-api-access-47zz9\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:11:23.087645 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.087649 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-trusted-ca-bundle\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:11:23.087645 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.087660 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/228878a6-9dec-4a0a-b478-d07089d724fa-console-serving-cert\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:11:23.087927 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.087670 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-service-ca\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:11:23.087927 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.087679 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-console-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:11:23.087927 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.087687 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/228878a6-9dec-4a0a-b478-d07089d724fa-oauth-serving-cert\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:11:23.677468 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.677437 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-df5cf5cbb-dd929_228878a6-9dec-4a0a-b478-d07089d724fa/console/0.log" Apr 24 19:11:23.677868 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.677483 2568 generic.go:358] "Generic (PLEG): container finished" podID="228878a6-9dec-4a0a-b478-d07089d724fa" containerID="0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd" exitCode=2 Apr 24 19:11:23.677868 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.677587 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-df5cf5cbb-dd929" Apr 24 19:11:23.677868 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.677578 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df5cf5cbb-dd929" event={"ID":"228878a6-9dec-4a0a-b478-d07089d724fa","Type":"ContainerDied","Data":"0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd"} Apr 24 19:11:23.677868 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.677703 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df5cf5cbb-dd929" event={"ID":"228878a6-9dec-4a0a-b478-d07089d724fa","Type":"ContainerDied","Data":"96b0faa45f5581d086bbc460d498f5219596e4448f8cf9a750dae1ed0605ca29"} Apr 24 19:11:23.677868 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.677722 2568 scope.go:117] "RemoveContainer" containerID="0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd" Apr 24 19:11:23.686012 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.685992 2568 scope.go:117] "RemoveContainer" containerID="0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd" Apr 24 19:11:23.686330 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:11:23.686310 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd\": container with ID starting with 0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd not found: ID does not exist" containerID="0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd" Apr 24 19:11:23.686413 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.686337 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd"} err="failed to get container status \"0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd\": rpc error: code = NotFound 
desc = could not find container \"0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd\": container with ID starting with 0cfb46b3ab51657147bb6151ee17a9d6d4e772d8c2291c4d74e6260b7b8ebadd not found: ID does not exist" Apr 24 19:11:23.696426 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.696387 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-df5cf5cbb-dd929"] Apr 24 19:11:23.699592 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:23.699563 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-df5cf5cbb-dd929"] Apr 24 19:11:25.459477 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:25.459442 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="228878a6-9dec-4a0a-b478-d07089d724fa" path="/var/lib/kubelet/pods/228878a6-9dec-4a0a-b478-d07089d724fa/volumes" Apr 24 19:11:37.348625 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:37.348598 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/1.log" Apr 24 19:11:37.349372 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:37.349351 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/1.log" Apr 24 19:11:37.352769 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:37.352750 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log" Apr 24 19:11:37.353528 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:37.353511 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log" Apr 24 19:11:37.357656 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:11:37.357638 2568 kubelet.go:1628] 
"Image garbage collection succeeded" Apr 24 19:12:54.594245 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.594148 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-mjx95"] Apr 24 19:12:54.594806 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.594675 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="228878a6-9dec-4a0a-b478-d07089d724fa" containerName="console" Apr 24 19:12:54.594806 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.594694 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="228878a6-9dec-4a0a-b478-d07089d724fa" containerName="console" Apr 24 19:12:54.594926 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.594827 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="228878a6-9dec-4a0a-b478-d07089d724fa" containerName="console" Apr 24 19:12:54.597869 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.597846 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:12:54.600669 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.600643 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 19:12:54.600669 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.600663 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 19:12:54.601768 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.601752 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 19:12:54.601858 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.601787 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-9k6t5\"" Apr 24 19:12:54.606922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.606901 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-mjx95"] Apr 24 19:12:54.628805 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.628774 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-79f457656b-wvpkr"] Apr 24 19:12:54.632028 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.632008 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-79f457656b-wvpkr" Apr 24 19:12:54.634794 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.634771 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 19:12:54.634925 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.634771 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-f7gfl\"" Apr 24 19:12:54.642350 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.642319 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-79f457656b-wvpkr"] Apr 24 19:12:54.681837 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.681789 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a566da45-4e53-41ac-8b45-c306f77b43be-cert\") pod \"kserve-controller-manager-8cdbbc8b5-mjx95\" (UID: \"a566da45-4e53-41ac-8b45-c306f77b43be\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:12:54.681837 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.681839 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hll2\" (UniqueName: \"kubernetes.io/projected/a566da45-4e53-41ac-8b45-c306f77b43be-kube-api-access-6hll2\") pod \"kserve-controller-manager-8cdbbc8b5-mjx95\" (UID: \"a566da45-4e53-41ac-8b45-c306f77b43be\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:12:54.783293 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.783246 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a652e541-bdba-43ae-b7ff-5f89059836aa-data\") pod \"seaweedfs-79f457656b-wvpkr\" (UID: \"a652e541-bdba-43ae-b7ff-5f89059836aa\") " pod="kserve/seaweedfs-79f457656b-wvpkr" Apr 24 19:12:54.783466 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:12:54.783364 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc958\" (UniqueName: \"kubernetes.io/projected/a652e541-bdba-43ae-b7ff-5f89059836aa-kube-api-access-sc958\") pod \"seaweedfs-79f457656b-wvpkr\" (UID: \"a652e541-bdba-43ae-b7ff-5f89059836aa\") " pod="kserve/seaweedfs-79f457656b-wvpkr" Apr 24 19:12:54.783466 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.783434 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a566da45-4e53-41ac-8b45-c306f77b43be-cert\") pod \"kserve-controller-manager-8cdbbc8b5-mjx95\" (UID: \"a566da45-4e53-41ac-8b45-c306f77b43be\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:12:54.783539 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.783475 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hll2\" (UniqueName: \"kubernetes.io/projected/a566da45-4e53-41ac-8b45-c306f77b43be-kube-api-access-6hll2\") pod \"kserve-controller-manager-8cdbbc8b5-mjx95\" (UID: \"a566da45-4e53-41ac-8b45-c306f77b43be\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:12:54.783592 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:12:54.783577 2568 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 24 19:12:54.783650 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:12:54.783640 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a566da45-4e53-41ac-8b45-c306f77b43be-cert podName:a566da45-4e53-41ac-8b45-c306f77b43be nodeName:}" failed. No retries permitted until 2026-04-24 19:12:55.28361956 +0000 UTC m=+378.395455887 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a566da45-4e53-41ac-8b45-c306f77b43be-cert") pod "kserve-controller-manager-8cdbbc8b5-mjx95" (UID: "a566da45-4e53-41ac-8b45-c306f77b43be") : secret "kserve-webhook-server-cert" not found Apr 24 19:12:54.792236 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.792203 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hll2\" (UniqueName: \"kubernetes.io/projected/a566da45-4e53-41ac-8b45-c306f77b43be-kube-api-access-6hll2\") pod \"kserve-controller-manager-8cdbbc8b5-mjx95\" (UID: \"a566da45-4e53-41ac-8b45-c306f77b43be\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:12:54.883952 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.883910 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a652e541-bdba-43ae-b7ff-5f89059836aa-data\") pod \"seaweedfs-79f457656b-wvpkr\" (UID: \"a652e541-bdba-43ae-b7ff-5f89059836aa\") " pod="kserve/seaweedfs-79f457656b-wvpkr" Apr 24 19:12:54.884144 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.884009 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sc958\" (UniqueName: \"kubernetes.io/projected/a652e541-bdba-43ae-b7ff-5f89059836aa-kube-api-access-sc958\") pod \"seaweedfs-79f457656b-wvpkr\" (UID: \"a652e541-bdba-43ae-b7ff-5f89059836aa\") " pod="kserve/seaweedfs-79f457656b-wvpkr" Apr 24 19:12:54.884396 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.884372 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a652e541-bdba-43ae-b7ff-5f89059836aa-data\") pod \"seaweedfs-79f457656b-wvpkr\" (UID: \"a652e541-bdba-43ae-b7ff-5f89059836aa\") " pod="kserve/seaweedfs-79f457656b-wvpkr" Apr 24 19:12:54.892571 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.892541 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sc958\" (UniqueName: \"kubernetes.io/projected/a652e541-bdba-43ae-b7ff-5f89059836aa-kube-api-access-sc958\") pod \"seaweedfs-79f457656b-wvpkr\" (UID: \"a652e541-bdba-43ae-b7ff-5f89059836aa\") " pod="kserve/seaweedfs-79f457656b-wvpkr" Apr 24 19:12:54.944604 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:54.944561 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-79f457656b-wvpkr" Apr 24 19:12:55.076266 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:55.076242 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-79f457656b-wvpkr"] Apr 24 19:12:55.078439 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:12:55.078410 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda652e541_bdba_43ae_b7ff_5f89059836aa.slice/crio-c0b3504ff9833b07555c752aea98095d568136fa64a3b7760db8df0c0fb63cd8 WatchSource:0}: Error finding container c0b3504ff9833b07555c752aea98095d568136fa64a3b7760db8df0c0fb63cd8: Status 404 returned error can't find the container with id c0b3504ff9833b07555c752aea98095d568136fa64a3b7760db8df0c0fb63cd8 Apr 24 19:12:55.080215 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:55.080193 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:12:55.287499 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:55.287407 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a566da45-4e53-41ac-8b45-c306f77b43be-cert\") pod \"kserve-controller-manager-8cdbbc8b5-mjx95\" (UID: \"a566da45-4e53-41ac-8b45-c306f77b43be\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:12:55.287645 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:12:55.287555 2568 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret 
"kserve-webhook-server-cert" not found Apr 24 19:12:55.287645 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:12:55.287619 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a566da45-4e53-41ac-8b45-c306f77b43be-cert podName:a566da45-4e53-41ac-8b45-c306f77b43be nodeName:}" failed. No retries permitted until 2026-04-24 19:12:56.287605252 +0000 UTC m=+379.399441581 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a566da45-4e53-41ac-8b45-c306f77b43be-cert") pod "kserve-controller-manager-8cdbbc8b5-mjx95" (UID: "a566da45-4e53-41ac-8b45-c306f77b43be") : secret "kserve-webhook-server-cert" not found Apr 24 19:12:55.989443 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:55.989400 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-79f457656b-wvpkr" event={"ID":"a652e541-bdba-43ae-b7ff-5f89059836aa","Type":"ContainerStarted","Data":"c0b3504ff9833b07555c752aea98095d568136fa64a3b7760db8df0c0fb63cd8"} Apr 24 19:12:56.296803 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:56.296715 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a566da45-4e53-41ac-8b45-c306f77b43be-cert\") pod \"kserve-controller-manager-8cdbbc8b5-mjx95\" (UID: \"a566da45-4e53-41ac-8b45-c306f77b43be\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:12:56.299627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:56.299600 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a566da45-4e53-41ac-8b45-c306f77b43be-cert\") pod \"kserve-controller-manager-8cdbbc8b5-mjx95\" (UID: \"a566da45-4e53-41ac-8b45-c306f77b43be\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:12:56.410617 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:56.410575 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:12:56.541759 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:56.541711 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-mjx95"] Apr 24 19:12:56.544757 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:12:56.544724 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda566da45_4e53_41ac_8b45_c306f77b43be.slice/crio-53ee5975291f354426e600c02006d9711733decbf646a9d76adf3848d7c36194 WatchSource:0}: Error finding container 53ee5975291f354426e600c02006d9711733decbf646a9d76adf3848d7c36194: Status 404 returned error can't find the container with id 53ee5975291f354426e600c02006d9711733decbf646a9d76adf3848d7c36194 Apr 24 19:12:56.995940 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:12:56.995894 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" event={"ID":"a566da45-4e53-41ac-8b45-c306f77b43be","Type":"ContainerStarted","Data":"53ee5975291f354426e600c02006d9711733decbf646a9d76adf3848d7c36194"} Apr 24 19:13:00.017035 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:00.016995 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" event={"ID":"a566da45-4e53-41ac-8b45-c306f77b43be","Type":"ContainerStarted","Data":"bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c"} Apr 24 19:13:00.017646 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:00.017147 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:13:00.018541 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:00.018521 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-79f457656b-wvpkr" 
event={"ID":"a652e541-bdba-43ae-b7ff-5f89059836aa","Type":"ContainerStarted","Data":"c2b54f2f6d56aade1f9d549cf84230a1c1f60a43462581c591119ebd48357f92"} Apr 24 19:13:00.018635 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:00.018588 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-79f457656b-wvpkr" Apr 24 19:13:00.035352 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:00.035299 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" podStartSLOduration=2.984923576 podStartE2EDuration="6.03528494s" podCreationTimestamp="2026-04-24 19:12:54 +0000 UTC" firstStartedPulling="2026-04-24 19:12:56.546706197 +0000 UTC m=+379.658542522" lastFinishedPulling="2026-04-24 19:12:59.597067563 +0000 UTC m=+382.708903886" observedRunningTime="2026-04-24 19:13:00.034385013 +0000 UTC m=+383.146221358" watchObservedRunningTime="2026-04-24 19:13:00.03528494 +0000 UTC m=+383.147121285" Apr 24 19:13:00.051357 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:00.051304 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-79f457656b-wvpkr" podStartSLOduration=1.486137249 podStartE2EDuration="6.051288752s" podCreationTimestamp="2026-04-24 19:12:54 +0000 UTC" firstStartedPulling="2026-04-24 19:12:55.080381532 +0000 UTC m=+378.192217870" lastFinishedPulling="2026-04-24 19:12:59.645533031 +0000 UTC m=+382.757369373" observedRunningTime="2026-04-24 19:13:00.050361683 +0000 UTC m=+383.162198028" watchObservedRunningTime="2026-04-24 19:13:00.051288752 +0000 UTC m=+383.163125079" Apr 24 19:13:06.024012 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:06.023975 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-79f457656b-wvpkr" Apr 24 19:13:30.361542 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.361502 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve/kserve-controller-manager-8cdbbc8b5-mjx95"] Apr 24 19:13:30.362090 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.361738 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" podUID="a566da45-4e53-41ac-8b45-c306f77b43be" containerName="manager" containerID="cri-o://bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c" gracePeriod=10 Apr 24 19:13:30.366542 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.366513 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:13:30.381381 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.381345 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-rzvsk"] Apr 24 19:13:30.384677 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.384658 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" Apr 24 19:13:30.393824 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.393792 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-rzvsk"] Apr 24 19:13:30.508450 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.508413 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdfad2de-4a6e-4138-a96d-89aed3ea342f-cert\") pod \"kserve-controller-manager-8cdbbc8b5-rzvsk\" (UID: \"cdfad2de-4a6e-4138-a96d-89aed3ea342f\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" Apr 24 19:13:30.508637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.508483 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq4gh\" (UniqueName: \"kubernetes.io/projected/cdfad2de-4a6e-4138-a96d-89aed3ea342f-kube-api-access-gq4gh\") pod 
\"kserve-controller-manager-8cdbbc8b5-rzvsk\" (UID: \"cdfad2de-4a6e-4138-a96d-89aed3ea342f\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" Apr 24 19:13:30.598459 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.598434 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:13:30.609445 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.609416 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdfad2de-4a6e-4138-a96d-89aed3ea342f-cert\") pod \"kserve-controller-manager-8cdbbc8b5-rzvsk\" (UID: \"cdfad2de-4a6e-4138-a96d-89aed3ea342f\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" Apr 24 19:13:30.609570 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.609473 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gq4gh\" (UniqueName: \"kubernetes.io/projected/cdfad2de-4a6e-4138-a96d-89aed3ea342f-kube-api-access-gq4gh\") pod \"kserve-controller-manager-8cdbbc8b5-rzvsk\" (UID: \"cdfad2de-4a6e-4138-a96d-89aed3ea342f\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" Apr 24 19:13:30.611784 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.611726 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdfad2de-4a6e-4138-a96d-89aed3ea342f-cert\") pod \"kserve-controller-manager-8cdbbc8b5-rzvsk\" (UID: \"cdfad2de-4a6e-4138-a96d-89aed3ea342f\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" Apr 24 19:13:30.621952 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.621921 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq4gh\" (UniqueName: \"kubernetes.io/projected/cdfad2de-4a6e-4138-a96d-89aed3ea342f-kube-api-access-gq4gh\") pod \"kserve-controller-manager-8cdbbc8b5-rzvsk\" (UID: \"cdfad2de-4a6e-4138-a96d-89aed3ea342f\") " 
pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" Apr 24 19:13:30.710866 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.710834 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a566da45-4e53-41ac-8b45-c306f77b43be-cert\") pod \"a566da45-4e53-41ac-8b45-c306f77b43be\" (UID: \"a566da45-4e53-41ac-8b45-c306f77b43be\") " Apr 24 19:13:30.710866 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.710874 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hll2\" (UniqueName: \"kubernetes.io/projected/a566da45-4e53-41ac-8b45-c306f77b43be-kube-api-access-6hll2\") pod \"a566da45-4e53-41ac-8b45-c306f77b43be\" (UID: \"a566da45-4e53-41ac-8b45-c306f77b43be\") " Apr 24 19:13:30.713077 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.713044 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a566da45-4e53-41ac-8b45-c306f77b43be-cert" (OuterVolumeSpecName: "cert") pod "a566da45-4e53-41ac-8b45-c306f77b43be" (UID: "a566da45-4e53-41ac-8b45-c306f77b43be"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:13:30.713214 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.713087 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a566da45-4e53-41ac-8b45-c306f77b43be-kube-api-access-6hll2" (OuterVolumeSpecName: "kube-api-access-6hll2") pod "a566da45-4e53-41ac-8b45-c306f77b43be" (UID: "a566da45-4e53-41ac-8b45-c306f77b43be"). InnerVolumeSpecName "kube-api-access-6hll2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:13:30.735338 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.735308 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" Apr 24 19:13:30.811766 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.811708 2568 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a566da45-4e53-41ac-8b45-c306f77b43be-cert\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:13:30.811766 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.811748 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6hll2\" (UniqueName: \"kubernetes.io/projected/a566da45-4e53-41ac-8b45-c306f77b43be-kube-api-access-6hll2\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:13:30.860699 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:30.860668 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-rzvsk"] Apr 24 19:13:30.863178 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:13:30.863115 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdfad2de_4a6e_4138_a96d_89aed3ea342f.slice/crio-93286619e930967df2a9b1a89e83e64652e58ff426fa2ddf44d555089e5794ca WatchSource:0}: Error finding container 93286619e930967df2a9b1a89e83e64652e58ff426fa2ddf44d555089e5794ca: Status 404 returned error can't find the container with id 93286619e930967df2a9b1a89e83e64652e58ff426fa2ddf44d555089e5794ca Apr 24 19:13:31.124566 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:31.124464 2568 generic.go:358] "Generic (PLEG): container finished" podID="a566da45-4e53-41ac-8b45-c306f77b43be" containerID="bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c" exitCode=0 Apr 24 19:13:31.124566 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:31.124561 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" Apr 24 19:13:31.124778 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:31.124554 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" event={"ID":"a566da45-4e53-41ac-8b45-c306f77b43be","Type":"ContainerDied","Data":"bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c"} Apr 24 19:13:31.124778 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:31.124664 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-8cdbbc8b5-mjx95" event={"ID":"a566da45-4e53-41ac-8b45-c306f77b43be","Type":"ContainerDied","Data":"53ee5975291f354426e600c02006d9711733decbf646a9d76adf3848d7c36194"} Apr 24 19:13:31.124778 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:31.124686 2568 scope.go:117] "RemoveContainer" containerID="bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c" Apr 24 19:13:31.125680 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:31.125658 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" event={"ID":"cdfad2de-4a6e-4138-a96d-89aed3ea342f","Type":"ContainerStarted","Data":"93286619e930967df2a9b1a89e83e64652e58ff426fa2ddf44d555089e5794ca"} Apr 24 19:13:31.132823 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:31.132803 2568 scope.go:117] "RemoveContainer" containerID="bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c" Apr 24 19:13:31.133160 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:13:31.133136 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c\": container with ID starting with bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c not found: ID does not exist" containerID="bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c" Apr 24 
19:13:31.133232 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:31.133168 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c"} err="failed to get container status \"bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c\": rpc error: code = NotFound desc = could not find container \"bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c\": container with ID starting with bf5e0ae36b75b95745f4664c201508099b339aad418f3058973e8736056bec4c not found: ID does not exist" Apr 24 19:13:31.146616 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:31.146581 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-mjx95"] Apr 24 19:13:31.149681 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:31.149655 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-mjx95"] Apr 24 19:13:31.460513 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:31.460474 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a566da45-4e53-41ac-8b45-c306f77b43be" path="/var/lib/kubelet/pods/a566da45-4e53-41ac-8b45-c306f77b43be/volumes" Apr 24 19:13:32.131736 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:32.131699 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" event={"ID":"cdfad2de-4a6e-4138-a96d-89aed3ea342f","Type":"ContainerStarted","Data":"db4b012a38e4d5134829f7e8a15873b582657fabd5902fa626e4278fb1f570c7"} Apr 24 19:13:32.131897 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:32.131757 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" Apr 24 19:13:32.156563 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:13:32.156506 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" podStartSLOduration=1.77592259 podStartE2EDuration="2.156489538s" podCreationTimestamp="2026-04-24 19:13:30 +0000 UTC" firstStartedPulling="2026-04-24 19:13:30.86443286 +0000 UTC m=+413.976269185" lastFinishedPulling="2026-04-24 19:13:31.244999806 +0000 UTC m=+414.356836133" observedRunningTime="2026-04-24 19:13:32.15622936 +0000 UTC m=+415.268065706" watchObservedRunningTime="2026-04-24 19:13:32.156489538 +0000 UTC m=+415.268325884" Apr 24 19:14:03.140589 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:03.140507 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-8cdbbc8b5-rzvsk" Apr 24 19:14:04.008365 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.008333 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-khsnm"] Apr 24 19:14:04.008780 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.008763 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a566da45-4e53-41ac-8b45-c306f77b43be" containerName="manager" Apr 24 19:14:04.008859 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.008783 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a566da45-4e53-41ac-8b45-c306f77b43be" containerName="manager" Apr 24 19:14:04.008902 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.008882 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a566da45-4e53-41ac-8b45-c306f77b43be" containerName="manager" Apr 24 19:14:04.012005 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.011988 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-khsnm" Apr 24 19:14:04.015891 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.015863 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 19:14:04.015891 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.015878 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-d68h7\"" Apr 24 19:14:04.026231 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.026204 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-khsnm"] Apr 24 19:14:04.032592 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.032565 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-8pt7p"] Apr 24 19:14:04.035867 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.035848 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-8pt7p" Apr 24 19:14:04.039205 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.039184 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-kl267\"" Apr 24 19:14:04.039536 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.039519 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 19:14:04.064410 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.064378 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-8pt7p"] Apr 24 19:14:04.102994 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.102954 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9006ada-2f28-45d7-8189-31eb6ecc099e-tls-certs\") pod \"model-serving-api-86f7b4b499-khsnm\" (UID: 
\"e9006ada-2f28-45d7-8189-31eb6ecc099e\") " pod="kserve/model-serving-api-86f7b4b499-khsnm" Apr 24 19:14:04.103210 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.103040 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v77p5\" (UniqueName: \"kubernetes.io/projected/e9006ada-2f28-45d7-8189-31eb6ecc099e-kube-api-access-v77p5\") pod \"model-serving-api-86f7b4b499-khsnm\" (UID: \"e9006ada-2f28-45d7-8189-31eb6ecc099e\") " pod="kserve/model-serving-api-86f7b4b499-khsnm" Apr 24 19:14:04.103210 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.103072 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/922f7ac3-263a-4a81-b9e9-3ef5c1024192-cert\") pod \"odh-model-controller-696fc77849-8pt7p\" (UID: \"922f7ac3-263a-4a81-b9e9-3ef5c1024192\") " pod="kserve/odh-model-controller-696fc77849-8pt7p" Apr 24 19:14:04.103210 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.103094 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jll4m\" (UniqueName: \"kubernetes.io/projected/922f7ac3-263a-4a81-b9e9-3ef5c1024192-kube-api-access-jll4m\") pod \"odh-model-controller-696fc77849-8pt7p\" (UID: \"922f7ac3-263a-4a81-b9e9-3ef5c1024192\") " pod="kserve/odh-model-controller-696fc77849-8pt7p" Apr 24 19:14:04.203584 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.203526 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9006ada-2f28-45d7-8189-31eb6ecc099e-tls-certs\") pod \"model-serving-api-86f7b4b499-khsnm\" (UID: \"e9006ada-2f28-45d7-8189-31eb6ecc099e\") " pod="kserve/model-serving-api-86f7b4b499-khsnm" Apr 24 19:14:04.204054 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.203617 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v77p5\" (UniqueName: \"kubernetes.io/projected/e9006ada-2f28-45d7-8189-31eb6ecc099e-kube-api-access-v77p5\") pod \"model-serving-api-86f7b4b499-khsnm\" (UID: \"e9006ada-2f28-45d7-8189-31eb6ecc099e\") " pod="kserve/model-serving-api-86f7b4b499-khsnm" Apr 24 19:14:04.204054 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.203642 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/922f7ac3-263a-4a81-b9e9-3ef5c1024192-cert\") pod \"odh-model-controller-696fc77849-8pt7p\" (UID: \"922f7ac3-263a-4a81-b9e9-3ef5c1024192\") " pod="kserve/odh-model-controller-696fc77849-8pt7p" Apr 24 19:14:04.204054 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.203661 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jll4m\" (UniqueName: \"kubernetes.io/projected/922f7ac3-263a-4a81-b9e9-3ef5c1024192-kube-api-access-jll4m\") pod \"odh-model-controller-696fc77849-8pt7p\" (UID: \"922f7ac3-263a-4a81-b9e9-3ef5c1024192\") " pod="kserve/odh-model-controller-696fc77849-8pt7p" Apr 24 19:14:04.204054 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:14:04.203793 2568 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 24 19:14:04.204054 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:14:04.203858 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/922f7ac3-263a-4a81-b9e9-3ef5c1024192-cert podName:922f7ac3-263a-4a81-b9e9-3ef5c1024192 nodeName:}" failed. No retries permitted until 2026-04-24 19:14:04.703840886 +0000 UTC m=+447.815677209 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/922f7ac3-263a-4a81-b9e9-3ef5c1024192-cert") pod "odh-model-controller-696fc77849-8pt7p" (UID: "922f7ac3-263a-4a81-b9e9-3ef5c1024192") : secret "odh-model-controller-webhook-cert" not found Apr 24 19:14:04.205984 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.205962 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9006ada-2f28-45d7-8189-31eb6ecc099e-tls-certs\") pod \"model-serving-api-86f7b4b499-khsnm\" (UID: \"e9006ada-2f28-45d7-8189-31eb6ecc099e\") " pod="kserve/model-serving-api-86f7b4b499-khsnm" Apr 24 19:14:04.212372 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.212351 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v77p5\" (UniqueName: \"kubernetes.io/projected/e9006ada-2f28-45d7-8189-31eb6ecc099e-kube-api-access-v77p5\") pod \"model-serving-api-86f7b4b499-khsnm\" (UID: \"e9006ada-2f28-45d7-8189-31eb6ecc099e\") " pod="kserve/model-serving-api-86f7b4b499-khsnm" Apr 24 19:14:04.212725 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.212708 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jll4m\" (UniqueName: \"kubernetes.io/projected/922f7ac3-263a-4a81-b9e9-3ef5c1024192-kube-api-access-jll4m\") pod \"odh-model-controller-696fc77849-8pt7p\" (UID: \"922f7ac3-263a-4a81-b9e9-3ef5c1024192\") " pod="kserve/odh-model-controller-696fc77849-8pt7p" Apr 24 19:14:04.322796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.322711 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-khsnm" Apr 24 19:14:04.452849 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.452821 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-khsnm"] Apr 24 19:14:04.455719 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:14:04.455684 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9006ada_2f28_45d7_8189_31eb6ecc099e.slice/crio-1feece17641d0da7f006c2c7d1f865bc6fc9201428c4958a6634c75387e65692 WatchSource:0}: Error finding container 1feece17641d0da7f006c2c7d1f865bc6fc9201428c4958a6634c75387e65692: Status 404 returned error can't find the container with id 1feece17641d0da7f006c2c7d1f865bc6fc9201428c4958a6634c75387e65692 Apr 24 19:14:04.709624 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.709585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/922f7ac3-263a-4a81-b9e9-3ef5c1024192-cert\") pod \"odh-model-controller-696fc77849-8pt7p\" (UID: \"922f7ac3-263a-4a81-b9e9-3ef5c1024192\") " pod="kserve/odh-model-controller-696fc77849-8pt7p" Apr 24 19:14:04.711969 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.711947 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/922f7ac3-263a-4a81-b9e9-3ef5c1024192-cert\") pod \"odh-model-controller-696fc77849-8pt7p\" (UID: \"922f7ac3-263a-4a81-b9e9-3ef5c1024192\") " pod="kserve/odh-model-controller-696fc77849-8pt7p" Apr 24 19:14:04.946293 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:04.946247 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-8pt7p" Apr 24 19:14:05.109202 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:05.109067 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-8pt7p"] Apr 24 19:14:05.111563 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:14:05.111535 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod922f7ac3_263a_4a81_b9e9_3ef5c1024192.slice/crio-6fc63baa9c8e673015bdf76d53af538c9674b5efb0a9cc80d4bd7e232cb847a7 WatchSource:0}: Error finding container 6fc63baa9c8e673015bdf76d53af538c9674b5efb0a9cc80d4bd7e232cb847a7: Status 404 returned error can't find the container with id 6fc63baa9c8e673015bdf76d53af538c9674b5efb0a9cc80d4bd7e232cb847a7 Apr 24 19:14:05.250878 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:05.250835 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-8pt7p" event={"ID":"922f7ac3-263a-4a81-b9e9-3ef5c1024192","Type":"ContainerStarted","Data":"6fc63baa9c8e673015bdf76d53af538c9674b5efb0a9cc80d4bd7e232cb847a7"} Apr 24 19:14:05.252171 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:05.252133 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-khsnm" event={"ID":"e9006ada-2f28-45d7-8189-31eb6ecc099e","Type":"ContainerStarted","Data":"1feece17641d0da7f006c2c7d1f865bc6fc9201428c4958a6634c75387e65692"} Apr 24 19:14:06.258539 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:06.258493 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-khsnm" event={"ID":"e9006ada-2f28-45d7-8189-31eb6ecc099e","Type":"ContainerStarted","Data":"8623fdea60cd0d0b8df3d422b41db791be2efd82947554e1850652614cf12700"} Apr 24 19:14:06.258985 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:06.258898 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/model-serving-api-86f7b4b499-khsnm" Apr 24 19:14:06.279646 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:06.279580 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-khsnm" podStartSLOduration=1.9993417230000001 podStartE2EDuration="3.279557626s" podCreationTimestamp="2026-04-24 19:14:03 +0000 UTC" firstStartedPulling="2026-04-24 19:14:04.457413166 +0000 UTC m=+447.569249489" lastFinishedPulling="2026-04-24 19:14:05.73762905 +0000 UTC m=+448.849465392" observedRunningTime="2026-04-24 19:14:06.276922111 +0000 UTC m=+449.388758457" watchObservedRunningTime="2026-04-24 19:14:06.279557626 +0000 UTC m=+449.391393974" Apr 24 19:14:08.267330 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:08.267230 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-8pt7p" event={"ID":"922f7ac3-263a-4a81-b9e9-3ef5c1024192","Type":"ContainerStarted","Data":"f8004cf9da804da7207217982da122d6631892accd2a7d8c20b4badca5a5ef04"} Apr 24 19:14:08.267719 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:08.267353 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-8pt7p" Apr 24 19:14:08.286766 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:08.286682 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-8pt7p" podStartSLOduration=1.397969261 podStartE2EDuration="4.286661591s" podCreationTimestamp="2026-04-24 19:14:04 +0000 UTC" firstStartedPulling="2026-04-24 19:14:05.11316422 +0000 UTC m=+448.225000544" lastFinishedPulling="2026-04-24 19:14:08.001856536 +0000 UTC m=+451.113692874" observedRunningTime="2026-04-24 19:14:08.285323701 +0000 UTC m=+451.397160047" watchObservedRunningTime="2026-04-24 19:14:08.286661591 +0000 UTC m=+451.398497938" Apr 24 19:14:17.266582 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:17.266544 2568 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-khsnm" Apr 24 19:14:19.273677 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:19.273640 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-8pt7p" Apr 24 19:14:27.406249 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.406203 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9c8765b65-dhrt8"] Apr 24 19:14:27.410162 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.410129 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.421941 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.421910 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9c8765b65-dhrt8"] Apr 24 19:14:27.529784 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.529745 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49aa9000-7dea-4c2e-aeae-918f0ad7936b-trusted-ca-bundle\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.529964 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.529795 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49aa9000-7dea-4c2e-aeae-918f0ad7936b-console-serving-cert\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.529964 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.529816 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/49aa9000-7dea-4c2e-aeae-918f0ad7936b-console-oauth-config\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.529964 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.529886 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49aa9000-7dea-4c2e-aeae-918f0ad7936b-service-ca\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.529964 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.529915 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49aa9000-7dea-4c2e-aeae-918f0ad7936b-oauth-serving-cert\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.529964 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.529956 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g678q\" (UniqueName: \"kubernetes.io/projected/49aa9000-7dea-4c2e-aeae-918f0ad7936b-kube-api-access-g678q\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.530165 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.530019 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49aa9000-7dea-4c2e-aeae-918f0ad7936b-console-config\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.630718 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:14:27.630674 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49aa9000-7dea-4c2e-aeae-918f0ad7936b-console-config\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.630894 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.630765 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49aa9000-7dea-4c2e-aeae-918f0ad7936b-trusted-ca-bundle\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.630894 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.630806 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49aa9000-7dea-4c2e-aeae-918f0ad7936b-console-serving-cert\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.630894 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.630832 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49aa9000-7dea-4c2e-aeae-918f0ad7936b-console-oauth-config\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.630894 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.630875 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49aa9000-7dea-4c2e-aeae-918f0ad7936b-service-ca\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.631070 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.630896 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49aa9000-7dea-4c2e-aeae-918f0ad7936b-oauth-serving-cert\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.631070 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.630932 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g678q\" (UniqueName: \"kubernetes.io/projected/49aa9000-7dea-4c2e-aeae-918f0ad7936b-kube-api-access-g678q\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.631627 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.631598 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49aa9000-7dea-4c2e-aeae-918f0ad7936b-console-config\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.631742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.631623 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49aa9000-7dea-4c2e-aeae-918f0ad7936b-service-ca\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.631782 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.631761 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49aa9000-7dea-4c2e-aeae-918f0ad7936b-oauth-serving-cert\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" 
Apr 24 19:14:27.631817 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.631800 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49aa9000-7dea-4c2e-aeae-918f0ad7936b-trusted-ca-bundle\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.633440 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.633420 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49aa9000-7dea-4c2e-aeae-918f0ad7936b-console-serving-cert\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.633534 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.633444 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49aa9000-7dea-4c2e-aeae-918f0ad7936b-console-oauth-config\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.639556 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.639533 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g678q\" (UniqueName: \"kubernetes.io/projected/49aa9000-7dea-4c2e-aeae-918f0ad7936b-kube-api-access-g678q\") pod \"console-9c8765b65-dhrt8\" (UID: \"49aa9000-7dea-4c2e-aeae-918f0ad7936b\") " pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.722646 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.722557 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:27.857647 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:27.857613 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9c8765b65-dhrt8"] Apr 24 19:14:27.861752 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:14:27.861719 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49aa9000_7dea_4c2e_aeae_918f0ad7936b.slice/crio-7a59b1a5b777300b790f23033c8448b106f9c08492eee146f1422145c1569439 WatchSource:0}: Error finding container 7a59b1a5b777300b790f23033c8448b106f9c08492eee146f1422145c1569439: Status 404 returned error can't find the container with id 7a59b1a5b777300b790f23033c8448b106f9c08492eee146f1422145c1569439 Apr 24 19:14:28.342123 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:28.342057 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c8765b65-dhrt8" event={"ID":"49aa9000-7dea-4c2e-aeae-918f0ad7936b","Type":"ContainerStarted","Data":"f4ee52e61a74386cd4774fc06ef1ee46aeb329470193cb7af46ca0c0dfbb16c3"} Apr 24 19:14:28.342123 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:28.342095 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c8765b65-dhrt8" event={"ID":"49aa9000-7dea-4c2e-aeae-918f0ad7936b","Type":"ContainerStarted","Data":"7a59b1a5b777300b790f23033c8448b106f9c08492eee146f1422145c1569439"} Apr 24 19:14:37.723078 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:37.723036 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:37.723078 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:37.723086 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:37.727773 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:37.727747 2568 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:37.745984 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:37.745939 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9c8765b65-dhrt8" podStartSLOduration=10.745922315 podStartE2EDuration="10.745922315s" podCreationTimestamp="2026-04-24 19:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:14:28.374909568 +0000 UTC m=+471.486745914" watchObservedRunningTime="2026-04-24 19:14:37.745922315 +0000 UTC m=+480.857758661" Apr 24 19:14:38.382456 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:38.382424 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9c8765b65-dhrt8" Apr 24 19:14:38.433276 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:38.433234 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64575dd578-ptvc5"] Apr 24 19:14:39.677083 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.677040 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w"] Apr 24 19:14:39.685271 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.685243 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.688008 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.687979 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-56c17-predictor-serving-cert\"" Apr 24 19:14:39.688193 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.688011 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 19:14:39.688193 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.688011 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 19:14:39.688193 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.688060 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config\"" Apr 24 19:14:39.689287 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.689268 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pjv9w\"" Apr 24 19:14:39.692442 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.692420 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w"] Apr 24 19:14:39.740680 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.740642 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.740863 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.740702 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.740863 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.740757 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhwf2\" (UniqueName: \"kubernetes.io/projected/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-kube-api-access-bhwf2\") pod \"isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.740964 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.740867 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-proxy-tls\") pod \"isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.841798 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.841761 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-proxy-tls\") pod \"isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.842004 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.841824 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.842004 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.841856 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.842004 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.841886 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhwf2\" (UniqueName: \"kubernetes.io/projected/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-kube-api-access-bhwf2\") pod \"isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.842385 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.842361 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w\" (UID: 
\"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.842610 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.842584 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.844397 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.844378 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-proxy-tls\") pod \"isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.850295 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.850269 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhwf2\" (UniqueName: \"kubernetes.io/projected/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-kube-api-access-bhwf2\") pod \"isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:39.997535 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:39.997441 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:14:40.135575 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:40.135547 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w"] Apr 24 19:14:40.137413 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:14:40.137385 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc7a9a4e_23ea_4bc8_a30e_7d1f81a0d82a.slice/crio-61e42e455d6cba635d048ee53f648427526e4ce81b31dfb86b63e8cbcbc3bf87 WatchSource:0}: Error finding container 61e42e455d6cba635d048ee53f648427526e4ce81b31dfb86b63e8cbcbc3bf87: Status 404 returned error can't find the container with id 61e42e455d6cba635d048ee53f648427526e4ce81b31dfb86b63e8cbcbc3bf87 Apr 24 19:14:40.391741 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:40.391706 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" event={"ID":"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a","Type":"ContainerStarted","Data":"61e42e455d6cba635d048ee53f648427526e4ce81b31dfb86b63e8cbcbc3bf87"} Apr 24 19:14:44.406721 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:44.406683 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" event={"ID":"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a","Type":"ContainerStarted","Data":"b3c054a827b3ce3c043a22f657bc43d283b3602bebc60aef5b97043623494db1"} Apr 24 19:14:47.419038 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:47.418998 2568 generic.go:358] "Generic (PLEG): container finished" podID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerID="b3c054a827b3ce3c043a22f657bc43d283b3602bebc60aef5b97043623494db1" exitCode=0 Apr 24 19:14:47.419431 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:14:47.419074 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" event={"ID":"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a","Type":"ContainerDied","Data":"b3c054a827b3ce3c043a22f657bc43d283b3602bebc60aef5b97043623494db1"} Apr 24 19:15:01.485438 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:01.485374 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" event={"ID":"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a","Type":"ContainerStarted","Data":"596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953"} Apr 24 19:15:03.459697 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.459591 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-64575dd578-ptvc5" podUID="1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" containerName="console" containerID="cri-o://3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5" gracePeriod=15 Apr 24 19:15:03.494032 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.493988 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" event={"ID":"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a","Type":"ContainerStarted","Data":"3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1"} Apr 24 19:15:03.713517 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.713454 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64575dd578-ptvc5_1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7/console/0.log" Apr 24 19:15:03.713517 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.713517 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64575dd578-ptvc5" Apr 24 19:15:03.892581 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.892534 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-oauth-serving-cert\") pod \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " Apr 24 19:15:03.892775 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.892613 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-config\") pod \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " Apr 24 19:15:03.892775 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.892664 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltpvh\" (UniqueName: \"kubernetes.io/projected/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-kube-api-access-ltpvh\") pod \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " Apr 24 19:15:03.892775 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.892700 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-oauth-config\") pod \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " Apr 24 19:15:03.892775 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.892738 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-service-ca\") pod \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " Apr 24 19:15:03.892982 ip-10-0-138-52 
kubenswrapper[2568]: I0424 19:15:03.892775 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-serving-cert\") pod \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " Apr 24 19:15:03.892982 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.892808 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-trusted-ca-bundle\") pod \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\" (UID: \"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7\") " Apr 24 19:15:03.893287 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.892967 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" (UID: "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:15:03.893287 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.893227 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" (UID: "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:15:03.893287 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.893153 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-service-ca" (OuterVolumeSpecName: "service-ca") pod "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" (UID: "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:15:03.893287 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.893177 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-config" (OuterVolumeSpecName: "console-config") pod "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" (UID: "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:15:03.893578 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.893434 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:15:03.893578 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.893458 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-service-ca\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:15:03.893578 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.893469 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-trusted-ca-bundle\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:15:03.893578 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.893479 2568 
reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-oauth-serving-cert\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:15:03.894865 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.894843 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" (UID: "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:15:03.894967 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.894877 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" (UID: "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:15:03.895029 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.894987 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-kube-api-access-ltpvh" (OuterVolumeSpecName: "kube-api-access-ltpvh") pod "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" (UID: "1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7"). InnerVolumeSpecName "kube-api-access-ltpvh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:15:03.994882 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.994790 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ltpvh\" (UniqueName: \"kubernetes.io/projected/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-kube-api-access-ltpvh\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:15:03.994882 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.994824 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-oauth-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:15:03.994882 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:03.994834 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7-console-serving-cert\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:15:04.499422 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:04.499387 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64575dd578-ptvc5_1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7/console/0.log" Apr 24 19:15:04.499861 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:04.499439 2568 generic.go:358] "Generic (PLEG): container finished" podID="1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" containerID="3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5" exitCode=2 Apr 24 19:15:04.499861 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:04.499512 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64575dd578-ptvc5" Apr 24 19:15:04.499861 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:04.499525 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64575dd578-ptvc5" event={"ID":"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7","Type":"ContainerDied","Data":"3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5"} Apr 24 19:15:04.499861 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:04.499566 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64575dd578-ptvc5" event={"ID":"1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7","Type":"ContainerDied","Data":"41912d178e74739f1eecd2bec449b40e6d9f0163dae445705545f0fe6de1bb30"} Apr 24 19:15:04.499861 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:04.499586 2568 scope.go:117] "RemoveContainer" containerID="3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5" Apr 24 19:15:04.510928 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:04.510900 2568 scope.go:117] "RemoveContainer" containerID="3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5" Apr 24 19:15:04.511312 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:15:04.511286 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5\": container with ID starting with 3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5 not found: ID does not exist" containerID="3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5" Apr 24 19:15:04.511419 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:04.511322 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5"} err="failed to get container status \"3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5\": rpc error: code = 
NotFound desc = could not find container \"3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5\": container with ID starting with 3289f2bff63420f4f33cd6587d06e265f4ca0923f0da4d7fcdc2e4d7db3d88f5 not found: ID does not exist" Apr 24 19:15:04.524618 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:04.524583 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64575dd578-ptvc5"] Apr 24 19:15:04.527855 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:04.527829 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64575dd578-ptvc5"] Apr 24 19:15:05.459225 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:05.459184 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" path="/var/lib/kubelet/pods/1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7/volumes" Apr 24 19:15:07.515317 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:07.515281 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" event={"ID":"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a","Type":"ContainerStarted","Data":"bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45"} Apr 24 19:15:07.515806 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:07.515429 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:15:07.544992 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:07.544937 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podStartSLOduration=2.117152912 podStartE2EDuration="28.54491944s" podCreationTimestamp="2026-04-24 19:14:39 +0000 UTC" firstStartedPulling="2026-04-24 19:14:40.139382967 +0000 UTC m=+483.251219294" lastFinishedPulling="2026-04-24 19:15:06.567149487 +0000 UTC 
m=+509.678985822" observedRunningTime="2026-04-24 19:15:07.542084711 +0000 UTC m=+510.653921058" watchObservedRunningTime="2026-04-24 19:15:07.54491944 +0000 UTC m=+510.656755785" Apr 24 19:15:08.519447 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:08.519377 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:15:08.519447 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:08.519420 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:15:08.521059 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:08.521019 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 19:15:08.521772 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:08.521745 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:15:08.524554 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:08.524535 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:15:09.523361 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:09.523313 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.38:8080: connect: connection refused" Apr 24 19:15:09.523786 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:09.523660 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:15:10.527556 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:10.527515 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 19:15:10.527965 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:10.527803 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:15:20.528284 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:20.528239 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 19:15:20.528754 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:20.528568 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:15:30.528329 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:30.528227 2568 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 19:15:30.528841 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:30.528619 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:15:40.528437 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:40.528385 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 19:15:40.528916 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:40.528829 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:15:50.528392 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:50.528335 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 19:15:50.528807 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:15:50.528777 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" 
podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:16:00.527563 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:00.527509 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 19:16:00.528002 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:00.527968 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:16:10.528347 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:10.528310 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:16:10.528812 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:10.528466 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:16:24.697983 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.697946 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w"] Apr 24 19:16:24.698485 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.698427 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" containerID="cri-o://596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953" gracePeriod=30 
Apr 24 19:16:24.698720 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.698460 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kube-rbac-proxy" containerID="cri-o://3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1" gracePeriod=30 Apr 24 19:16:24.698834 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.698466 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" containerID="cri-o://bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45" gracePeriod=30 Apr 24 19:16:24.765575 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.765541 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k"] Apr 24 19:16:24.765954 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.765941 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" containerName="console" Apr 24 19:16:24.766011 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.765956 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" containerName="console" Apr 24 19:16:24.766045 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.766037 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b5271b5-6b8d-4534-bd7b-cc21e9a9b1b7" containerName="console" Apr 24 19:16:24.769411 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.769388 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:24.772083 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.772060 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-1c30c-predictor-serving-cert\"" Apr 24 19:16:24.772205 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.772113 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config\"" Apr 24 19:16:24.779936 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.779905 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k"] Apr 24 19:16:24.816671 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.816554 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c"] Apr 24 19:16:24.820520 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.820498 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:24.823315 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.823291 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-1c30c-predictor-serving-cert\"" Apr 24 19:16:24.823447 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.823327 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config\"" Apr 24 19:16:24.827868 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.827838 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c"] Apr 24 19:16:24.937126 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.937065 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87de199e-12a0-4a99-a905-77533d44ca3b-isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:24.937330 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.937143 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s277\" (UniqueName: \"kubernetes.io/projected/87de199e-12a0-4a99-a905-77533d44ca3b-kube-api-access-5s277\") pod \"isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:24.937330 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.937185 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87de199e-12a0-4a99-a905-77533d44ca3b-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:24.937330 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.937204 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4821ffc-0241-4b44-9207-1e7f3b37cce3-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:24.937330 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.937224 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4821ffc-0241-4b44-9207-1e7f3b37cce3-proxy-tls\") pod \"isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:24.937548 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.937330 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87de199e-12a0-4a99-a905-77533d44ca3b-proxy-tls\") pod \"isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:24.937548 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.937373 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgn84\" (UniqueName: \"kubernetes.io/projected/f4821ffc-0241-4b44-9207-1e7f3b37cce3-kube-api-access-tgn84\") pod \"isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:24.937548 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:24.937426 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4821ffc-0241-4b44-9207-1e7f3b37cce3-isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:25.039059 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.038968 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87de199e-12a0-4a99-a905-77533d44ca3b-isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:25.039059 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.039006 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s277\" (UniqueName: \"kubernetes.io/projected/87de199e-12a0-4a99-a905-77533d44ca3b-kube-api-access-5s277\") pod \"isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 
24 19:16:25.039335 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.039128 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87de199e-12a0-4a99-a905-77533d44ca3b-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:25.039335 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.039163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4821ffc-0241-4b44-9207-1e7f3b37cce3-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:25.039335 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.039192 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4821ffc-0241-4b44-9207-1e7f3b37cce3-proxy-tls\") pod \"isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:25.039335 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.039277 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87de199e-12a0-4a99-a905-77533d44ca3b-proxy-tls\") pod \"isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:25.039335 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.039305 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgn84\" (UniqueName: \"kubernetes.io/projected/f4821ffc-0241-4b44-9207-1e7f3b37cce3-kube-api-access-tgn84\") pod \"isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:25.039594 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.039369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4821ffc-0241-4b44-9207-1e7f3b37cce3-isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:25.039650 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.039619 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87de199e-12a0-4a99-a905-77533d44ca3b-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:25.040002 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.039858 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87de199e-12a0-4a99-a905-77533d44ca3b-isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:25.040002 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.039926 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4821ffc-0241-4b44-9207-1e7f3b37cce3-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:25.040224 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.040202 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4821ffc-0241-4b44-9207-1e7f3b37cce3-isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:25.042352 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.042326 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4821ffc-0241-4b44-9207-1e7f3b37cce3-proxy-tls\") pod \"isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:25.042836 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.042813 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87de199e-12a0-4a99-a905-77533d44ca3b-proxy-tls\") pod \"isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:25.047836 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.047807 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s277\" (UniqueName: \"kubernetes.io/projected/87de199e-12a0-4a99-a905-77533d44ca3b-kube-api-access-5s277\") pod \"isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:25.048369 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.048349 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgn84\" (UniqueName: \"kubernetes.io/projected/f4821ffc-0241-4b44-9207-1e7f3b37cce3-kube-api-access-tgn84\") pod \"isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:25.081150 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.081089 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:25.133012 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.132974 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:25.218664 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.218638 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k"] Apr 24 19:16:25.220683 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:16:25.220653 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87de199e_12a0_4a99_a905_77533d44ca3b.slice/crio-e6fa3b5a0f44f6ba7d24f0956d2a90f79d582caad6f5aed0b0e0eff70661872c WatchSource:0}: Error finding container e6fa3b5a0f44f6ba7d24f0956d2a90f79d582caad6f5aed0b0e0eff70661872c: Status 404 returned error can't find the container with id e6fa3b5a0f44f6ba7d24f0956d2a90f79d582caad6f5aed0b0e0eff70661872c Apr 24 19:16:25.288352 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.288321 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c"] Apr 24 19:16:25.292168 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:16:25.292137 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4821ffc_0241_4b44_9207_1e7f3b37cce3.slice/crio-65c1205be52320d6db13578a3cf42dafa0203a01880f5ccb430b9a27e6eca95c WatchSource:0}: Error finding container 65c1205be52320d6db13578a3cf42dafa0203a01880f5ccb430b9a27e6eca95c: Status 404 returned error can't find the container with id 65c1205be52320d6db13578a3cf42dafa0203a01880f5ccb430b9a27e6eca95c Apr 24 19:16:25.796180 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.796137 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" 
event={"ID":"f4821ffc-0241-4b44-9207-1e7f3b37cce3","Type":"ContainerStarted","Data":"b8f3fd15106c77c72092873aa8c0c90b1dd51af862a0008045775a96596fb943"} Apr 24 19:16:25.796180 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.796185 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" event={"ID":"f4821ffc-0241-4b44-9207-1e7f3b37cce3","Type":"ContainerStarted","Data":"65c1205be52320d6db13578a3cf42dafa0203a01880f5ccb430b9a27e6eca95c"} Apr 24 19:16:25.797685 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.797657 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" event={"ID":"87de199e-12a0-4a99-a905-77533d44ca3b","Type":"ContainerStarted","Data":"51ecb257b820f5293348ad474524a903f03c1080230717a4d68e432d4731b7ce"} Apr 24 19:16:25.797828 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.797690 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" event={"ID":"87de199e-12a0-4a99-a905-77533d44ca3b","Type":"ContainerStarted","Data":"e6fa3b5a0f44f6ba7d24f0956d2a90f79d582caad6f5aed0b0e0eff70661872c"} Apr 24 19:16:25.799945 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.799916 2568 generic.go:358] "Generic (PLEG): container finished" podID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerID="3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1" exitCode=2 Apr 24 19:16:25.800084 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:25.799981 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" event={"ID":"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a","Type":"ContainerDied","Data":"3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1"} Apr 24 19:16:28.520213 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:28.520164 2568 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 24 19:16:29.817335 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:29.817296 2568 generic.go:358] "Generic (PLEG): container finished" podID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerID="b8f3fd15106c77c72092873aa8c0c90b1dd51af862a0008045775a96596fb943" exitCode=0 Apr 24 19:16:29.817825 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:29.817373 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" event={"ID":"f4821ffc-0241-4b44-9207-1e7f3b37cce3","Type":"ContainerDied","Data":"b8f3fd15106c77c72092873aa8c0c90b1dd51af862a0008045775a96596fb943"} Apr 24 19:16:29.818756 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:29.818732 2568 generic.go:358] "Generic (PLEG): container finished" podID="87de199e-12a0-4a99-a905-77533d44ca3b" containerID="51ecb257b820f5293348ad474524a903f03c1080230717a4d68e432d4731b7ce" exitCode=0 Apr 24 19:16:29.818833 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:29.818811 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" event={"ID":"87de199e-12a0-4a99-a905-77533d44ca3b","Type":"ContainerDied","Data":"51ecb257b820f5293348ad474524a903f03c1080230717a4d68e432d4731b7ce"} Apr 24 19:16:29.821115 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:29.821079 2568 generic.go:358] "Generic (PLEG): container finished" podID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerID="596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953" exitCode=0 Apr 24 19:16:29.821198 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:29.821163 2568 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" event={"ID":"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a","Type":"ContainerDied","Data":"596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953"} Apr 24 19:16:30.528140 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:30.528053 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 19:16:30.528537 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:30.528500 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:16:30.830077 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:30.829991 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" event={"ID":"87de199e-12a0-4a99-a905-77533d44ca3b","Type":"ContainerStarted","Data":"ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f"} Apr 24 19:16:30.830077 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:30.830041 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" event={"ID":"87de199e-12a0-4a99-a905-77533d44ca3b","Type":"ContainerStarted","Data":"c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b"} Apr 24 19:16:30.830735 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:30.830495 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:30.830735 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:30.830528 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:30.832350 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:30.832315 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 24 19:16:30.851606 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:30.851560 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podStartSLOduration=6.851542552 podStartE2EDuration="6.851542552s" podCreationTimestamp="2026-04-24 19:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:16:30.84967484 +0000 UTC m=+593.961511189" watchObservedRunningTime="2026-04-24 19:16:30.851542552 +0000 UTC m=+593.963378902" Apr 24 19:16:31.835225 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:31.835181 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 24 19:16:33.519928 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:33.519877 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: 
connect: connection refused" Apr 24 19:16:36.840822 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:36.840786 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:16:36.841563 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:36.841486 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 24 19:16:37.381839 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:37.381808 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/1.log" Apr 24 19:16:37.384586 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:37.384558 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/1.log" Apr 24 19:16:37.386942 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:37.386875 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log" Apr 24 19:16:37.389434 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:37.389414 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log" Apr 24 19:16:38.519766 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:38.519717 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kube-rbac-proxy" probeResult="failure" 
output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 24 19:16:38.520253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:38.519889 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:16:40.527919 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:40.527875 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 19:16:40.528409 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:40.528312 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:16:43.519822 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:43.519776 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 24 19:16:46.841845 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:46.841784 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 24 19:16:47.899868 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:47.899830 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" event={"ID":"f4821ffc-0241-4b44-9207-1e7f3b37cce3","Type":"ContainerStarted","Data":"80737b72386b452129c30ef6e7ea0c8fb94bf25cea6af4efbff45faa48a588f1"} Apr 24 19:16:47.899868 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:47.899874 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" event={"ID":"f4821ffc-0241-4b44-9207-1e7f3b37cce3","Type":"ContainerStarted","Data":"075b52bd8e6724e2f24316054720e2f6963bb6f670d70818f0653d3135f36d8d"} Apr 24 19:16:47.900367 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:47.900082 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:47.920748 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:47.920684 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" podStartSLOduration=6.521457519 podStartE2EDuration="23.920662756s" podCreationTimestamp="2026-04-24 19:16:24 +0000 UTC" firstStartedPulling="2026-04-24 19:16:29.818874143 +0000 UTC m=+592.930710470" lastFinishedPulling="2026-04-24 19:16:47.218079383 +0000 UTC m=+610.329915707" observedRunningTime="2026-04-24 19:16:47.919554968 +0000 UTC m=+611.031391314" watchObservedRunningTime="2026-04-24 19:16:47.920662756 +0000 UTC m=+611.032499109" Apr 24 19:16:48.519893 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:48.519846 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 
24 19:16:48.903660 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:48.903626 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:48.904924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:48.904884 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 19:16:49.907297 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:49.907255 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 19:16:50.527918 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:50.527868 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 19:16:50.528156 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:50.528014 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:16:50.528237 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:50.528215 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 
19:16:50.528317 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:50.528304 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:16:53.520693 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:53.520592 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 24 19:16:54.911614 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:54.911584 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:16:54.912246 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:54.912217 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 19:16:55.369349 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.369276 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:16:55.431631 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.431593 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-proxy-tls\") pod \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " Apr 24 19:16:55.431780 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.431666 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhwf2\" (UniqueName: \"kubernetes.io/projected/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-kube-api-access-bhwf2\") pod \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " Apr 24 19:16:55.431780 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.431717 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-kserve-provision-location\") pod \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " Apr 24 19:16:55.431780 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.431771 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config\") pod \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\" (UID: \"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a\") " Apr 24 19:16:55.432144 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.432089 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" (UID: "bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:16:55.432216 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.432181 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config") pod "bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" (UID: "bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:16:55.434024 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.433994 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" (UID: "bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:16:55.434221 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.433994 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-kube-api-access-bhwf2" (OuterVolumeSpecName: "kube-api-access-bhwf2") pod "bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" (UID: "bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a"). InnerVolumeSpecName "kube-api-access-bhwf2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:16:55.532651 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.532608 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:16:55.532651 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.532647 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bhwf2\" (UniqueName: \"kubernetes.io/projected/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-kube-api-access-bhwf2\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:16:55.532651 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.532657 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-kserve-provision-location\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:16:55.532894 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.532670 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a-isvc-raw-sklearn-batcher-56c17-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:16:55.931419 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.931379 2568 generic.go:358] "Generic (PLEG): container finished" podID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerID="bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45" exitCode=0 Apr 24 19:16:55.931898 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.931425 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" 
event={"ID":"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a","Type":"ContainerDied","Data":"bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45"} Apr 24 19:16:55.931898 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.931453 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" event={"ID":"bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a","Type":"ContainerDied","Data":"61e42e455d6cba635d048ee53f648427526e4ce81b31dfb86b63e8cbcbc3bf87"} Apr 24 19:16:55.931898 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.931470 2568 scope.go:117] "RemoveContainer" containerID="bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45" Apr 24 19:16:55.931898 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.931484 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w" Apr 24 19:16:55.943235 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.943206 2568 scope.go:117] "RemoveContainer" containerID="3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1" Apr 24 19:16:55.951060 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.951030 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w"] Apr 24 19:16:55.953281 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.953255 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-56c17-predictor-6548fdf654-2mx4w"] Apr 24 19:16:55.953742 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.953726 2568 scope.go:117] "RemoveContainer" containerID="596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953" Apr 24 19:16:55.961704 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.961684 2568 scope.go:117] "RemoveContainer" containerID="b3c054a827b3ce3c043a22f657bc43d283b3602bebc60aef5b97043623494db1" Apr 24 
19:16:55.969392 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.969366 2568 scope.go:117] "RemoveContainer" containerID="bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45" Apr 24 19:16:55.969672 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:16:55.969652 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45\": container with ID starting with bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45 not found: ID does not exist" containerID="bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45" Apr 24 19:16:55.969728 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.969682 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45"} err="failed to get container status \"bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45\": rpc error: code = NotFound desc = could not find container \"bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45\": container with ID starting with bd8bd3aa9e7ca7705e1c153ef1b9cf22f5dcbd7e89f41151deefc4fa53566a45 not found: ID does not exist" Apr 24 19:16:55.969728 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.969701 2568 scope.go:117] "RemoveContainer" containerID="3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1" Apr 24 19:16:55.969945 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:16:55.969927 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1\": container with ID starting with 3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1 not found: ID does not exist" containerID="3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1" Apr 24 19:16:55.969988 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.969952 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1"} err="failed to get container status \"3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1\": rpc error: code = NotFound desc = could not find container \"3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1\": container with ID starting with 3e548f665736a2e9a11ec322f7309c25478a681707badf834af87bee7f8c02d1 not found: ID does not exist" Apr 24 19:16:55.969988 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.969971 2568 scope.go:117] "RemoveContainer" containerID="596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953" Apr 24 19:16:55.970203 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:16:55.970188 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953\": container with ID starting with 596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953 not found: ID does not exist" containerID="596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953" Apr 24 19:16:55.970268 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.970206 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953"} err="failed to get container status \"596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953\": rpc error: code = NotFound desc = could not find container \"596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953\": container with ID starting with 596224e8ed00e0f225b55c35d0a6a4301c7336acd76e840b19ea028ea674c953 not found: ID does not exist" Apr 24 19:16:55.970268 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.970221 2568 scope.go:117] "RemoveContainer" 
containerID="b3c054a827b3ce3c043a22f657bc43d283b3602bebc60aef5b97043623494db1" Apr 24 19:16:55.970423 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:16:55.970406 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3c054a827b3ce3c043a22f657bc43d283b3602bebc60aef5b97043623494db1\": container with ID starting with b3c054a827b3ce3c043a22f657bc43d283b3602bebc60aef5b97043623494db1 not found: ID does not exist" containerID="b3c054a827b3ce3c043a22f657bc43d283b3602bebc60aef5b97043623494db1" Apr 24 19:16:55.970464 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:55.970429 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c054a827b3ce3c043a22f657bc43d283b3602bebc60aef5b97043623494db1"} err="failed to get container status \"b3c054a827b3ce3c043a22f657bc43d283b3602bebc60aef5b97043623494db1\": rpc error: code = NotFound desc = could not find container \"b3c054a827b3ce3c043a22f657bc43d283b3602bebc60aef5b97043623494db1\": container with ID starting with b3c054a827b3ce3c043a22f657bc43d283b3602bebc60aef5b97043623494db1 not found: ID does not exist" Apr 24 19:16:56.841494 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:56.841448 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 24 19:16:57.460692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:16:57.460653 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" path="/var/lib/kubelet/pods/bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a/volumes" Apr 24 19:17:04.912806 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:17:04.912767 2568 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 19:17:06.841516 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:17:06.841461 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 24 19:17:14.913032 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:17:14.912992 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 19:17:16.841385 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:17:16.841336 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 24 19:17:24.912400 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:17:24.912351 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 19:17:26.842169 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:17:26.842117 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" 
podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 24 19:17:34.912537 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:17:34.912495 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 19:17:36.841738 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:17:36.841693 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 24 19:17:44.912542 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:17:44.912501 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 19:17:46.842946 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:17:46.842912 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:17:54.913334 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:17:54.913301 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:18:04.803160 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.803114 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7"] Apr 24 
19:18:04.803614 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.803542 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kube-rbac-proxy" Apr 24 19:18:04.803614 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.803554 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kube-rbac-proxy" Apr 24 19:18:04.803614 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.803564 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" Apr 24 19:18:04.803614 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.803573 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" Apr 24 19:18:04.803614 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.803585 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="storage-initializer" Apr 24 19:18:04.803614 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.803594 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="storage-initializer" Apr 24 19:18:04.803800 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.803626 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" Apr 24 19:18:04.803800 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.803641 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" Apr 24 19:18:04.803800 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.803719 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kube-rbac-proxy" Apr 24 
19:18:04.803800 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.803733 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="agent" Apr 24 19:18:04.803800 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.803743 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc7a9a4e-23ea-4bc8-a30e-7d1f81a0d82a" containerName="kserve-container" Apr 24 19:18:04.806959 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.806935 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:04.809562 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.809538 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-1c30c-serving-cert\"" Apr 24 19:18:04.809683 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.809564 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-1c30c-kube-rbac-proxy-sar-config\"" Apr 24 19:18:04.813632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.813607 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7"] Apr 24 19:18:04.960245 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.960205 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/696cf363-91e9-4d72-9ce7-9f97ba73d285-openshift-service-ca-bundle\") pod \"model-chainer-raw-1c30c-5c79844df6-j74n7\" (UID: \"696cf363-91e9-4d72-9ce7-9f97ba73d285\") " pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:04.960245 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:04.960249 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/696cf363-91e9-4d72-9ce7-9f97ba73d285-proxy-tls\") pod \"model-chainer-raw-1c30c-5c79844df6-j74n7\" (UID: \"696cf363-91e9-4d72-9ce7-9f97ba73d285\") " pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:05.060923 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:05.060833 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/696cf363-91e9-4d72-9ce7-9f97ba73d285-openshift-service-ca-bundle\") pod \"model-chainer-raw-1c30c-5c79844df6-j74n7\" (UID: \"696cf363-91e9-4d72-9ce7-9f97ba73d285\") " pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:05.060923 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:05.060879 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/696cf363-91e9-4d72-9ce7-9f97ba73d285-proxy-tls\") pod \"model-chainer-raw-1c30c-5c79844df6-j74n7\" (UID: \"696cf363-91e9-4d72-9ce7-9f97ba73d285\") " pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:05.061127 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:18:05.060998 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-1c30c-serving-cert: secret "model-chainer-raw-1c30c-serving-cert" not found Apr 24 19:18:05.061127 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:18:05.061087 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/696cf363-91e9-4d72-9ce7-9f97ba73d285-proxy-tls podName:696cf363-91e9-4d72-9ce7-9f97ba73d285 nodeName:}" failed. No retries permitted until 2026-04-24 19:18:05.561068972 +0000 UTC m=+688.672905296 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/696cf363-91e9-4d72-9ce7-9f97ba73d285-proxy-tls") pod "model-chainer-raw-1c30c-5c79844df6-j74n7" (UID: "696cf363-91e9-4d72-9ce7-9f97ba73d285") : secret "model-chainer-raw-1c30c-serving-cert" not found Apr 24 19:18:05.061554 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:05.061533 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/696cf363-91e9-4d72-9ce7-9f97ba73d285-openshift-service-ca-bundle\") pod \"model-chainer-raw-1c30c-5c79844df6-j74n7\" (UID: \"696cf363-91e9-4d72-9ce7-9f97ba73d285\") " pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:05.570070 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:05.570023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/696cf363-91e9-4d72-9ce7-9f97ba73d285-proxy-tls\") pod \"model-chainer-raw-1c30c-5c79844df6-j74n7\" (UID: \"696cf363-91e9-4d72-9ce7-9f97ba73d285\") " pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:05.572579 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:05.572553 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/696cf363-91e9-4d72-9ce7-9f97ba73d285-proxy-tls\") pod \"model-chainer-raw-1c30c-5c79844df6-j74n7\" (UID: \"696cf363-91e9-4d72-9ce7-9f97ba73d285\") " pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:05.719375 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:05.719335 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:05.843072 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:05.843045 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7"] Apr 24 19:18:05.845778 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:18:05.845747 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod696cf363_91e9_4d72_9ce7_9f97ba73d285.slice/crio-4c8d98c28e0f289ed13c1041a79207f403c608669682447c64af6463c173134b WatchSource:0}: Error finding container 4c8d98c28e0f289ed13c1041a79207f403c608669682447c64af6463c173134b: Status 404 returned error can't find the container with id 4c8d98c28e0f289ed13c1041a79207f403c608669682447c64af6463c173134b Apr 24 19:18:05.847561 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:05.847539 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:18:06.193528 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:06.193484 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" event={"ID":"696cf363-91e9-4d72-9ce7-9f97ba73d285","Type":"ContainerStarted","Data":"4c8d98c28e0f289ed13c1041a79207f403c608669682447c64af6463c173134b"} Apr 24 19:18:08.202517 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:08.202425 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" event={"ID":"696cf363-91e9-4d72-9ce7-9f97ba73d285","Type":"ContainerStarted","Data":"344d30c8e77aa4460e91033d0e50447bb33c83261440678573c34838cb58d3da"} Apr 24 19:18:08.202899 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:08.202587 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:08.221581 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:08.221528 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" podStartSLOduration=2.197797891 podStartE2EDuration="4.221513079s" podCreationTimestamp="2026-04-24 19:18:04 +0000 UTC" firstStartedPulling="2026-04-24 19:18:05.847664898 +0000 UTC m=+688.959501222" lastFinishedPulling="2026-04-24 19:18:07.871380085 +0000 UTC m=+690.983216410" observedRunningTime="2026-04-24 19:18:08.219759913 +0000 UTC m=+691.331596260" watchObservedRunningTime="2026-04-24 19:18:08.221513079 +0000 UTC m=+691.333349475" Apr 24 19:18:14.213737 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:14.213707 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:14.868480 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:14.868443 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7"] Apr 24 19:18:14.868713 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:14.868667 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" podUID="696cf363-91e9-4d72-9ce7-9f97ba73d285" containerName="model-chainer-raw-1c30c" containerID="cri-o://344d30c8e77aa4460e91033d0e50447bb33c83261440678573c34838cb58d3da" gracePeriod=30 Apr 24 19:18:15.013658 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.013625 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k"] Apr 24 19:18:15.013980 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.013954 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" 
containerName="kserve-container" containerID="cri-o://c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b" gracePeriod=30 Apr 24 19:18:15.014068 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.014011 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kube-rbac-proxy" containerID="cri-o://ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f" gracePeriod=30 Apr 24 19:18:15.077254 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.077214 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f"] Apr 24 19:18:15.081817 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.081788 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.084436 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.084407 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-1911c-predictor-serving-cert\"" Apr 24 19:18:15.084614 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.084590 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\"" Apr 24 19:18:15.092085 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.092032 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f"] Apr 24 19:18:15.122074 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.121987 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c"] Apr 24 19:18:15.122353 ip-10-0-138-52 
kubenswrapper[2568]: I0424 19:18:15.122323 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kserve-container" containerID="cri-o://075b52bd8e6724e2f24316054720e2f6963bb6f670d70818f0653d3135f36d8d" gracePeriod=30 Apr 24 19:18:15.122497 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.122361 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kube-rbac-proxy" containerID="cri-o://80737b72386b452129c30ef6e7ea0c8fb94bf25cea6af4efbff45faa48a588f1" gracePeriod=30 Apr 24 19:18:15.153958 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.153916 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.154189 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.153983 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjxks\" (UniqueName: \"kubernetes.io/projected/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-kube-api-access-jjxks\") pod \"isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.154189 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.154076 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.154292 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.154217 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.190656 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.190617 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76"] Apr 24 19:18:15.194572 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.194554 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.197230 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.197209 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-1911c-predictor-serving-cert\"" Apr 24 19:18:15.197324 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.197207 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\"" Apr 24 19:18:15.201669 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.201641 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76"] Apr 24 19:18:15.231395 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.231364 2568 generic.go:358] "Generic (PLEG): container finished" podID="87de199e-12a0-4a99-a905-77533d44ca3b" containerID="ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f" exitCode=2 Apr 24 19:18:15.231722 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.231408 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" event={"ID":"87de199e-12a0-4a99-a905-77533d44ca3b","Type":"ContainerDied","Data":"ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f"} Apr 24 19:18:15.255465 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.255432 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.255642 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.255494 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/73108d1e-c23f-4757-b27f-857d1b472a6c-isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.255642 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.255558 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.255790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.255660 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-428l4\" (UniqueName: \"kubernetes.io/projected/73108d1e-c23f-4757-b27f-857d1b472a6c-kube-api-access-428l4\") pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.255790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.255739 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-kserve-provision-location\") pod 
\"isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.255897 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.255795 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73108d1e-c23f-4757-b27f-857d1b472a6c-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.255897 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.255819 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjxks\" (UniqueName: \"kubernetes.io/projected/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-kube-api-access-jjxks\") pod \"isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.255897 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.255841 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73108d1e-c23f-4757-b27f-857d1b472a6c-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.256150 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.256123 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-kserve-provision-location\") pod 
\"isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.256309 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.256288 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.258292 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.258269 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.263984 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.263960 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjxks\" (UniqueName: \"kubernetes.io/projected/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-kube-api-access-jjxks\") pod \"isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.357344 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.357302 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73108d1e-c23f-4757-b27f-857d1b472a6c-kserve-provision-location\") 
pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.357344 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.357353 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73108d1e-c23f-4757-b27f-857d1b472a6c-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.357611 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.357385 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/73108d1e-c23f-4757-b27f-857d1b472a6c-isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.357611 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.357432 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-428l4\" (UniqueName: \"kubernetes.io/projected/73108d1e-c23f-4757-b27f-857d1b472a6c-kube-api-access-428l4\") pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.357611 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:18:15.357545 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-serving-cert: secret "isvc-xgboost-graph-raw-hpa-1911c-predictor-serving-cert" not found 
Apr 24 19:18:15.357787 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:18:15.357621 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73108d1e-c23f-4757-b27f-857d1b472a6c-proxy-tls podName:73108d1e-c23f-4757-b27f-857d1b472a6c nodeName:}" failed. No retries permitted until 2026-04-24 19:18:15.85759954 +0000 UTC m=+698.969435864 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/73108d1e-c23f-4757-b27f-857d1b472a6c-proxy-tls") pod "isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" (UID: "73108d1e-c23f-4757-b27f-857d1b472a6c") : secret "isvc-xgboost-graph-raw-hpa-1911c-predictor-serving-cert" not found Apr 24 19:18:15.357842 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.357780 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73108d1e-c23f-4757-b27f-857d1b472a6c-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.358096 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.358075 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/73108d1e-c23f-4757-b27f-857d1b472a6c-isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.366456 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.366432 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-428l4\" (UniqueName: 
\"kubernetes.io/projected/73108d1e-c23f-4757-b27f-857d1b472a6c-kube-api-access-428l4\") pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.397396 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.397285 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:15.530040 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.530011 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f"] Apr 24 19:18:15.532647 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:18:15.532621 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ca8ad97_5405_4afe_8fc6_c4a1f0c545c9.slice/crio-93f3e59497dfc2f85b0824af1ca0bc9f404d1d90d2ffee6c05655ff4d3d805e1 WatchSource:0}: Error finding container 93f3e59497dfc2f85b0824af1ca0bc9f404d1d90d2ffee6c05655ff4d3d805e1: Status 404 returned error can't find the container with id 93f3e59497dfc2f85b0824af1ca0bc9f404d1d90d2ffee6c05655ff4d3d805e1 Apr 24 19:18:15.861285 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.861248 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73108d1e-c23f-4757-b27f-857d1b472a6c-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:15.863684 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:15.863662 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/73108d1e-c23f-4757-b27f-857d1b472a6c-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:16.106540 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:16.106499 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:16.243914 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:16.242904 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76"] Apr 24 19:18:16.251518 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:18:16.251485 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73108d1e_c23f_4757_b27f_857d1b472a6c.slice/crio-5a99868d139b1f435f8d8df38921c2b3fe3a141358af6294fe338aa1a925636b WatchSource:0}: Error finding container 5a99868d139b1f435f8d8df38921c2b3fe3a141358af6294fe338aa1a925636b: Status 404 returned error can't find the container with id 5a99868d139b1f435f8d8df38921c2b3fe3a141358af6294fe338aa1a925636b Apr 24 19:18:16.251653 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:16.251534 2568 generic.go:358] "Generic (PLEG): container finished" podID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerID="80737b72386b452129c30ef6e7ea0c8fb94bf25cea6af4efbff45faa48a588f1" exitCode=2 Apr 24 19:18:16.251653 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:16.251608 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" event={"ID":"f4821ffc-0241-4b44-9207-1e7f3b37cce3","Type":"ContainerDied","Data":"80737b72386b452129c30ef6e7ea0c8fb94bf25cea6af4efbff45faa48a588f1"} Apr 24 19:18:16.253366 ip-10-0-138-52 kubenswrapper[2568]: 
I0424 19:18:16.253343 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" event={"ID":"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9","Type":"ContainerStarted","Data":"48c71efff9a5b0cf50339374ebf298b45e6c94b7c4f343cf48a997887df701f0"} Apr 24 19:18:16.253472 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:16.253371 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" event={"ID":"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9","Type":"ContainerStarted","Data":"93f3e59497dfc2f85b0824af1ca0bc9f404d1d90d2ffee6c05655ff4d3d805e1"} Apr 24 19:18:16.835656 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:16.835610 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.39:8643/healthz\": dial tcp 10.132.0.39:8643: connect: connection refused" Apr 24 19:18:16.842194 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:16.842152 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 24 19:18:17.258632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:17.258595 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" event={"ID":"73108d1e-c23f-4757-b27f-857d1b472a6c","Type":"ContainerStarted","Data":"91387c78e2c1926bfb6a0339396b92a10a24c3b75a224946dc58b831fef43e7a"} Apr 24 19:18:17.258632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:17.258633 2568 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" event={"ID":"73108d1e-c23f-4757-b27f-857d1b472a6c","Type":"ContainerStarted","Data":"5a99868d139b1f435f8d8df38921c2b3fe3a141358af6294fe338aa1a925636b"} Apr 24 19:18:19.216270 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.216223 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" podUID="696cf363-91e9-4d72-9ce7-9f97ba73d285" containerName="model-chainer-raw-1c30c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:18:19.268986 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.268948 2568 generic.go:358] "Generic (PLEG): container finished" podID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerID="075b52bd8e6724e2f24316054720e2f6963bb6f670d70818f0653d3135f36d8d" exitCode=0 Apr 24 19:18:19.269157 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.269012 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" event={"ID":"f4821ffc-0241-4b44-9207-1e7f3b37cce3","Type":"ContainerDied","Data":"075b52bd8e6724e2f24316054720e2f6963bb6f670d70818f0653d3135f36d8d"} Apr 24 19:18:19.269157 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.269054 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" event={"ID":"f4821ffc-0241-4b44-9207-1e7f3b37cce3","Type":"ContainerDied","Data":"65c1205be52320d6db13578a3cf42dafa0203a01880f5ccb430b9a27e6eca95c"} Apr 24 19:18:19.269157 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.269069 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65c1205be52320d6db13578a3cf42dafa0203a01880f5ccb430b9a27e6eca95c" Apr 24 19:18:19.337139 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.337038 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:18:19.392020 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.391985 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4821ffc-0241-4b44-9207-1e7f3b37cce3-proxy-tls\") pod \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " Apr 24 19:18:19.392189 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.392031 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4821ffc-0241-4b44-9207-1e7f3b37cce3-kserve-provision-location\") pod \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " Apr 24 19:18:19.392189 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.392054 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgn84\" (UniqueName: \"kubernetes.io/projected/f4821ffc-0241-4b44-9207-1e7f3b37cce3-kube-api-access-tgn84\") pod \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " Apr 24 19:18:19.392189 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.392178 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4821ffc-0241-4b44-9207-1e7f3b37cce3-isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config\") pod \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\" (UID: \"f4821ffc-0241-4b44-9207-1e7f3b37cce3\") " Apr 24 19:18:19.392439 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.392409 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4821ffc-0241-4b44-9207-1e7f3b37cce3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") 
pod "f4821ffc-0241-4b44-9207-1e7f3b37cce3" (UID: "f4821ffc-0241-4b44-9207-1e7f3b37cce3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:18:19.392500 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.392482 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4821ffc-0241-4b44-9207-1e7f3b37cce3-isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config") pod "f4821ffc-0241-4b44-9207-1e7f3b37cce3" (UID: "f4821ffc-0241-4b44-9207-1e7f3b37cce3"). InnerVolumeSpecName "isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:18:19.394212 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.394187 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4821ffc-0241-4b44-9207-1e7f3b37cce3-kube-api-access-tgn84" (OuterVolumeSpecName: "kube-api-access-tgn84") pod "f4821ffc-0241-4b44-9207-1e7f3b37cce3" (UID: "f4821ffc-0241-4b44-9207-1e7f3b37cce3"). InnerVolumeSpecName "kube-api-access-tgn84". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:18:19.394212 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.394207 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4821ffc-0241-4b44-9207-1e7f3b37cce3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f4821ffc-0241-4b44-9207-1e7f3b37cce3" (UID: "f4821ffc-0241-4b44-9207-1e7f3b37cce3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:18:19.493859 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.493819 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4821ffc-0241-4b44-9207-1e7f3b37cce3-isvc-xgboost-graph-raw-1c30c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:18:19.493859 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.493858 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4821ffc-0241-4b44-9207-1e7f3b37cce3-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:18:19.494052 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.493874 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4821ffc-0241-4b44-9207-1e7f3b37cce3-kserve-provision-location\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:18:19.494052 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.493887 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tgn84\" (UniqueName: \"kubernetes.io/projected/f4821ffc-0241-4b44-9207-1e7f3b37cce3-kube-api-access-tgn84\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:18:19.862484 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.862461 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:18:19.896576 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.896495 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87de199e-12a0-4a99-a905-77533d44ca3b-kserve-provision-location\") pod \"87de199e-12a0-4a99-a905-77533d44ca3b\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " Apr 24 19:18:19.896718 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.896583 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s277\" (UniqueName: \"kubernetes.io/projected/87de199e-12a0-4a99-a905-77533d44ca3b-kube-api-access-5s277\") pod \"87de199e-12a0-4a99-a905-77533d44ca3b\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " Apr 24 19:18:19.896718 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.896615 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87de199e-12a0-4a99-a905-77533d44ca3b-isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config\") pod \"87de199e-12a0-4a99-a905-77533d44ca3b\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " Apr 24 19:18:19.896718 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.896641 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87de199e-12a0-4a99-a905-77533d44ca3b-proxy-tls\") pod \"87de199e-12a0-4a99-a905-77533d44ca3b\" (UID: \"87de199e-12a0-4a99-a905-77533d44ca3b\") " Apr 24 19:18:19.896901 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.896874 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87de199e-12a0-4a99-a905-77533d44ca3b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"87de199e-12a0-4a99-a905-77533d44ca3b" (UID: "87de199e-12a0-4a99-a905-77533d44ca3b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:18:19.897004 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.896979 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87de199e-12a0-4a99-a905-77533d44ca3b-isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config") pod "87de199e-12a0-4a99-a905-77533d44ca3b" (UID: "87de199e-12a0-4a99-a905-77533d44ca3b"). InnerVolumeSpecName "isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:18:19.898779 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.898745 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87de199e-12a0-4a99-a905-77533d44ca3b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "87de199e-12a0-4a99-a905-77533d44ca3b" (UID: "87de199e-12a0-4a99-a905-77533d44ca3b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:18:19.898876 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.898786 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87de199e-12a0-4a99-a905-77533d44ca3b-kube-api-access-5s277" (OuterVolumeSpecName: "kube-api-access-5s277") pod "87de199e-12a0-4a99-a905-77533d44ca3b" (UID: "87de199e-12a0-4a99-a905-77533d44ca3b"). InnerVolumeSpecName "kube-api-access-5s277". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:18:19.997580 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.997543 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87de199e-12a0-4a99-a905-77533d44ca3b-kserve-provision-location\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:18:19.997580 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.997573 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5s277\" (UniqueName: \"kubernetes.io/projected/87de199e-12a0-4a99-a905-77533d44ca3b-kube-api-access-5s277\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:18:19.997580 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.997585 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87de199e-12a0-4a99-a905-77533d44ca3b-isvc-sklearn-graph-raw-1c30c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:18:19.997829 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:19.997595 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87de199e-12a0-4a99-a905-77533d44ca3b-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:18:20.274719 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.274630 2568 generic.go:358] "Generic (PLEG): container finished" podID="87de199e-12a0-4a99-a905-77533d44ca3b" containerID="c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b" exitCode=0 Apr 24 19:18:20.275173 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.274709 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" 
event={"ID":"87de199e-12a0-4a99-a905-77533d44ca3b","Type":"ContainerDied","Data":"c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b"} Apr 24 19:18:20.275173 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.274761 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" event={"ID":"87de199e-12a0-4a99-a905-77533d44ca3b","Type":"ContainerDied","Data":"e6fa3b5a0f44f6ba7d24f0956d2a90f79d582caad6f5aed0b0e0eff70661872c"} Apr 24 19:18:20.275173 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.274784 2568 scope.go:117] "RemoveContainer" containerID="ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f" Apr 24 19:18:20.275173 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.274720 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k" Apr 24 19:18:20.276246 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.276221 2568 generic.go:358] "Generic (PLEG): container finished" podID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerID="48c71efff9a5b0cf50339374ebf298b45e6c94b7c4f343cf48a997887df701f0" exitCode=0 Apr 24 19:18:20.276362 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.276294 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" event={"ID":"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9","Type":"ContainerDied","Data":"48c71efff9a5b0cf50339374ebf298b45e6c94b7c4f343cf48a997887df701f0"} Apr 24 19:18:20.276487 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.276470 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c" Apr 24 19:18:20.284646 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.284619 2568 scope.go:117] "RemoveContainer" containerID="c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b" Apr 24 19:18:20.294253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.294155 2568 scope.go:117] "RemoveContainer" containerID="51ecb257b820f5293348ad474524a903f03c1080230717a4d68e432d4731b7ce" Apr 24 19:18:20.303836 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.303813 2568 scope.go:117] "RemoveContainer" containerID="ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f" Apr 24 19:18:20.304224 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:18:20.304192 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f\": container with ID starting with ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f not found: ID does not exist" containerID="ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f" Apr 24 19:18:20.304347 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.304231 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f"} err="failed to get container status \"ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f\": rpc error: code = NotFound desc = could not find container \"ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f\": container with ID starting with ffa0a906b4299b86df55ad73a8285e57e0515d43dce199590ecef89d4c37910f not found: ID does not exist" Apr 24 19:18:20.304347 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.304257 2568 scope.go:117] "RemoveContainer" containerID="c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b" Apr 24 
19:18:20.304537 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:18:20.304512 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b\": container with ID starting with c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b not found: ID does not exist" containerID="c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b" Apr 24 19:18:20.304620 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.304546 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b"} err="failed to get container status \"c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b\": rpc error: code = NotFound desc = could not find container \"c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b\": container with ID starting with c33df78c810215f44af5ca22a2283b8e24e9f2d49c02f5865668cde0a0cc130b not found: ID does not exist" Apr 24 19:18:20.304620 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.304564 2568 scope.go:117] "RemoveContainer" containerID="51ecb257b820f5293348ad474524a903f03c1080230717a4d68e432d4731b7ce" Apr 24 19:18:20.304803 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:18:20.304779 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51ecb257b820f5293348ad474524a903f03c1080230717a4d68e432d4731b7ce\": container with ID starting with 51ecb257b820f5293348ad474524a903f03c1080230717a4d68e432d4731b7ce not found: ID does not exist" containerID="51ecb257b820f5293348ad474524a903f03c1080230717a4d68e432d4731b7ce" Apr 24 19:18:20.305007 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.304818 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"51ecb257b820f5293348ad474524a903f03c1080230717a4d68e432d4731b7ce"} err="failed to get container status \"51ecb257b820f5293348ad474524a903f03c1080230717a4d68e432d4731b7ce\": rpc error: code = NotFound desc = could not find container \"51ecb257b820f5293348ad474524a903f03c1080230717a4d68e432d4731b7ce\": container with ID starting with 51ecb257b820f5293348ad474524a903f03c1080230717a4d68e432d4731b7ce not found: ID does not exist" Apr 24 19:18:20.318342 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.318311 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k"] Apr 24 19:18:20.324405 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.324377 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1c30c-predictor-bcf44fbc8-x9j4k"] Apr 24 19:18:20.336046 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.336021 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c"] Apr 24 19:18:20.342253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:20.342228 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-1c30c-predictor-6cb5798b77-k5p6c"] Apr 24 19:18:21.283454 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:21.283418 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" event={"ID":"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9","Type":"ContainerStarted","Data":"cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794"} Apr 24 19:18:21.283454 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:21.283458 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" 
event={"ID":"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9","Type":"ContainerStarted","Data":"37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990"} Apr 24 19:18:21.283963 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:21.283671 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:21.284721 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:21.284697 2568 generic.go:358] "Generic (PLEG): container finished" podID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerID="91387c78e2c1926bfb6a0339396b92a10a24c3b75a224946dc58b831fef43e7a" exitCode=0 Apr 24 19:18:21.284816 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:21.284742 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" event={"ID":"73108d1e-c23f-4757-b27f-857d1b472a6c","Type":"ContainerDied","Data":"91387c78e2c1926bfb6a0339396b92a10a24c3b75a224946dc58b831fef43e7a"} Apr 24 19:18:21.308056 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:21.308008 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" podStartSLOduration=6.307994079 podStartE2EDuration="6.307994079s" podCreationTimestamp="2026-04-24 19:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:18:21.305618338 +0000 UTC m=+704.417454695" watchObservedRunningTime="2026-04-24 19:18:21.307994079 +0000 UTC m=+704.419830424" Apr 24 19:18:21.461134 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:21.461079 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" path="/var/lib/kubelet/pods/87de199e-12a0-4a99-a905-77533d44ca3b/volumes" Apr 24 19:18:21.461609 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:18:21.461594 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" path="/var/lib/kubelet/pods/f4821ffc-0241-4b44-9207-1e7f3b37cce3/volumes" Apr 24 19:18:22.290514 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:22.290473 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" event={"ID":"73108d1e-c23f-4757-b27f-857d1b472a6c","Type":"ContainerStarted","Data":"c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac"} Apr 24 19:18:22.290939 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:22.290521 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" event={"ID":"73108d1e-c23f-4757-b27f-857d1b472a6c","Type":"ContainerStarted","Data":"74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a"} Apr 24 19:18:22.290939 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:22.290738 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:22.291046 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:22.291009 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:22.291152 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:22.291136 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:22.292059 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:22.292034 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:18:22.292183 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:22.292059 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:18:22.310686 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:22.310640 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podStartSLOduration=7.310626883 podStartE2EDuration="7.310626883s" podCreationTimestamp="2026-04-24 19:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:18:22.31005438 +0000 UTC m=+705.421890726" watchObservedRunningTime="2026-04-24 19:18:22.310626883 +0000 UTC m=+705.422463228" Apr 24 19:18:23.295177 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:23.295064 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:18:23.295177 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:23.295131 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:18:24.211009 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:24.210962 2568 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" podUID="696cf363-91e9-4d72-9ce7-9f97ba73d285" containerName="model-chainer-raw-1c30c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:18:28.302025 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:28.301995 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:18:28.302500 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:28.302073 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:18:28.302576 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:28.302544 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:18:28.302636 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:28.302565 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:18:29.210219 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:29.210179 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" podUID="696cf363-91e9-4d72-9ce7-9f97ba73d285" containerName="model-chainer-raw-1c30c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:18:29.210411 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:29.210285 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:34.211453 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:34.211412 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" podUID="696cf363-91e9-4d72-9ce7-9f97ba73d285" containerName="model-chainer-raw-1c30c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:18:38.302956 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:38.302914 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:18:38.303402 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:38.302914 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:18:39.210512 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:39.210472 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" podUID="696cf363-91e9-4d72-9ce7-9f97ba73d285" containerName="model-chainer-raw-1c30c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:18:44.210828 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:44.210780 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" podUID="696cf363-91e9-4d72-9ce7-9f97ba73d285" containerName="model-chainer-raw-1c30c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:18:44.902794 ip-10-0-138-52 kubenswrapper[2568]: 
E0424 19:18:44.902756 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod696cf363_91e9_4d72_9ce7_9f97ba73d285.slice/crio-conmon-344d30c8e77aa4460e91033d0e50447bb33c83261440678573c34838cb58d3da.scope\": RecentStats: unable to find data in memory cache]" Apr 24 19:18:44.902962 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:18:44.902808 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod696cf363_91e9_4d72_9ce7_9f97ba73d285.slice/crio-conmon-344d30c8e77aa4460e91033d0e50447bb33c83261440678573c34838cb58d3da.scope\": RecentStats: unable to find data in memory cache]" Apr 24 19:18:45.384791 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:45.384756 2568 generic.go:358] "Generic (PLEG): container finished" podID="696cf363-91e9-4d72-9ce7-9f97ba73d285" containerID="344d30c8e77aa4460e91033d0e50447bb33c83261440678573c34838cb58d3da" exitCode=0 Apr 24 19:18:45.385179 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:45.384827 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" event={"ID":"696cf363-91e9-4d72-9ce7-9f97ba73d285","Type":"ContainerDied","Data":"344d30c8e77aa4460e91033d0e50447bb33c83261440678573c34838cb58d3da"} Apr 24 19:18:45.519383 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:45.519354 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:45.633495 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:45.633439 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/696cf363-91e9-4d72-9ce7-9f97ba73d285-openshift-service-ca-bundle\") pod \"696cf363-91e9-4d72-9ce7-9f97ba73d285\" (UID: \"696cf363-91e9-4d72-9ce7-9f97ba73d285\") " Apr 24 19:18:45.633703 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:45.633576 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/696cf363-91e9-4d72-9ce7-9f97ba73d285-proxy-tls\") pod \"696cf363-91e9-4d72-9ce7-9f97ba73d285\" (UID: \"696cf363-91e9-4d72-9ce7-9f97ba73d285\") " Apr 24 19:18:45.633880 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:45.633853 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/696cf363-91e9-4d72-9ce7-9f97ba73d285-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "696cf363-91e9-4d72-9ce7-9f97ba73d285" (UID: "696cf363-91e9-4d72-9ce7-9f97ba73d285"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:18:45.635623 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:45.635571 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696cf363-91e9-4d72-9ce7-9f97ba73d285-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "696cf363-91e9-4d72-9ce7-9f97ba73d285" (UID: "696cf363-91e9-4d72-9ce7-9f97ba73d285"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:18:45.734355 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:45.734319 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/696cf363-91e9-4d72-9ce7-9f97ba73d285-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:18:45.734355 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:45.734355 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/696cf363-91e9-4d72-9ce7-9f97ba73d285-openshift-service-ca-bundle\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:18:46.389592 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:46.389559 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" Apr 24 19:18:46.390018 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:46.389593 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7" event={"ID":"696cf363-91e9-4d72-9ce7-9f97ba73d285","Type":"ContainerDied","Data":"4c8d98c28e0f289ed13c1041a79207f403c608669682447c64af6463c173134b"} Apr 24 19:18:46.390018 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:46.389639 2568 scope.go:117] "RemoveContainer" containerID="344d30c8e77aa4460e91033d0e50447bb33c83261440678573c34838cb58d3da" Apr 24 19:18:46.410699 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:46.410668 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7"] Apr 24 19:18:46.413879 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:46.413854 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-1c30c-5c79844df6-j74n7"] Apr 24 19:18:47.465114 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:47.465064 2568 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="696cf363-91e9-4d72-9ce7-9f97ba73d285" path="/var/lib/kubelet/pods/696cf363-91e9-4d72-9ce7-9f97ba73d285/volumes" Apr 24 19:18:48.302822 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:48.302778 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:18:48.303007 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:48.302777 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:18:58.303256 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:58.303203 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:18:58.303654 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:18:58.303202 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:19:08.302511 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:08.302468 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:19:08.303053 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:08.302464 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:19:18.302474 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:18.302429 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:19:18.302928 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:18.302582 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:19:28.303278 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:28.303245 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:19:28.303785 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:28.303325 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:19:55.300329 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.300239 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f"] Apr 24 19:19:55.300883 ip-10-0-138-52 
kubenswrapper[2568]: I0424 19:19:55.300609 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" containerID="cri-o://37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990" gracePeriod=30 Apr 24 19:19:55.300883 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.300644 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kube-rbac-proxy" containerID="cri-o://cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794" gracePeriod=30 Apr 24 19:19:55.333090 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333035 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv"] Apr 24 19:19:55.333549 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333515 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="storage-initializer" Apr 24 19:19:55.333655 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333568 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="storage-initializer" Apr 24 19:19:55.333655 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333582 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" Apr 24 19:19:55.333655 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333591 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" Apr 24 19:19:55.333655 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:19:55.333603 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kube-rbac-proxy" Apr 24 19:19:55.333655 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333615 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kube-rbac-proxy" Apr 24 19:19:55.333655 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333650 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kube-rbac-proxy" Apr 24 19:19:55.333655 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333655 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kube-rbac-proxy" Apr 24 19:19:55.333924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333662 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="696cf363-91e9-4d72-9ce7-9f97ba73d285" containerName="model-chainer-raw-1c30c" Apr 24 19:19:55.333924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333667 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="696cf363-91e9-4d72-9ce7-9f97ba73d285" containerName="model-chainer-raw-1c30c" Apr 24 19:19:55.333924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333684 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="storage-initializer" Apr 24 19:19:55.333924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333689 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="storage-initializer" Apr 24 19:19:55.333924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333700 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kserve-container" Apr 24 
19:19:55.333924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333706 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kserve-container" Apr 24 19:19:55.333924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333775 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="696cf363-91e9-4d72-9ce7-9f97ba73d285" containerName="model-chainer-raw-1c30c" Apr 24 19:19:55.333924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333786 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kserve-container" Apr 24 19:19:55.333924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333792 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="87de199e-12a0-4a99-a905-77533d44ca3b" containerName="kube-rbac-proxy" Apr 24 19:19:55.333924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333802 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kube-rbac-proxy" Apr 24 19:19:55.333924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.333810 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4821ffc-0241-4b44-9207-1e7f3b37cce3" containerName="kserve-container" Apr 24 19:19:55.338713 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.338693 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:55.341621 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.341591 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-2ea81-predictor-serving-cert\"" Apr 24 19:19:55.341621 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.341613 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-2ea81-kube-rbac-proxy-sar-config\"" Apr 24 19:19:55.346649 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.346617 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv"] Apr 24 19:19:55.367867 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.367598 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-proxy-tls\") pod \"message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv\" (UID: \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\") " pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:55.367867 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.367679 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-2ea81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-message-dumper-raw-2ea81-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv\" (UID: \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\") " pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:55.367867 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.367740 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9w2d2\" (UniqueName: \"kubernetes.io/projected/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-kube-api-access-9w2d2\") pod \"message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv\" (UID: \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\") " pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:55.392405 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.392367 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76"] Apr 24 19:19:55.392723 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.392681 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" containerID="cri-o://74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a" gracePeriod=30 Apr 24 19:19:55.392813 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.392727 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kube-rbac-proxy" containerID="cri-o://c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac" gracePeriod=30 Apr 24 19:19:55.468799 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.468756 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9w2d2\" (UniqueName: \"kubernetes.io/projected/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-kube-api-access-9w2d2\") pod \"message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv\" (UID: \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\") " pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:55.468951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.468897 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-proxy-tls\") pod \"message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv\" (UID: \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\") " pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:55.468951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.468938 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-2ea81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-message-dumper-raw-2ea81-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv\" (UID: \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\") " pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:55.469576 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.469550 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-2ea81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-message-dumper-raw-2ea81-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv\" (UID: \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\") " pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:55.471353 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.471330 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-proxy-tls\") pod \"message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv\" (UID: \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\") " pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:55.476781 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.476746 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w2d2\" 
(UniqueName: \"kubernetes.io/projected/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-kube-api-access-9w2d2\") pod \"message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv\" (UID: \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\") " pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:55.646181 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.646141 2568 generic.go:358] "Generic (PLEG): container finished" podID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerID="cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794" exitCode=2 Apr 24 19:19:55.646377 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.646220 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" event={"ID":"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9","Type":"ContainerDied","Data":"cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794"} Apr 24 19:19:55.648398 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.648364 2568 generic.go:358] "Generic (PLEG): container finished" podID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerID="c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac" exitCode=2 Apr 24 19:19:55.648506 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.648410 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" event={"ID":"73108d1e-c23f-4757-b27f-857d1b472a6c","Type":"ContainerDied","Data":"c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac"} Apr 24 19:19:55.650673 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.650649 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:55.779714 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:55.779688 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv"] Apr 24 19:19:55.785206 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:19:55.785175 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc364dfc2_55dc_4e24_b4b8_27eee8cc3660.slice/crio-a18a5d01522573a6293ab65df51691160c4cb0b3577a02eb929ed75de7c0deed WatchSource:0}: Error finding container a18a5d01522573a6293ab65df51691160c4cb0b3577a02eb929ed75de7c0deed: Status 404 returned error can't find the container with id a18a5d01522573a6293ab65df51691160c4cb0b3577a02eb929ed75de7c0deed Apr 24 19:19:56.652791 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:56.652750 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" event={"ID":"c364dfc2-55dc-4e24-b4b8-27eee8cc3660","Type":"ContainerStarted","Data":"a18a5d01522573a6293ab65df51691160c4cb0b3577a02eb929ed75de7c0deed"} Apr 24 19:19:57.657711 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:57.657668 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" event={"ID":"c364dfc2-55dc-4e24-b4b8-27eee8cc3660","Type":"ContainerStarted","Data":"6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8"} Apr 24 19:19:57.658141 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:57.657719 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" event={"ID":"c364dfc2-55dc-4e24-b4b8-27eee8cc3660","Type":"ContainerStarted","Data":"a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345"} Apr 24 19:19:57.658141 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:57.657843 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:57.675915 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:57.675861 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" podStartSLOduration=1.60404284 podStartE2EDuration="2.675845064s" podCreationTimestamp="2026-04-24 19:19:55 +0000 UTC" firstStartedPulling="2026-04-24 19:19:55.786968517 +0000 UTC m=+798.898804842" lastFinishedPulling="2026-04-24 19:19:56.858770743 +0000 UTC m=+799.970607066" observedRunningTime="2026-04-24 19:19:57.67483726 +0000 UTC m=+800.786673606" watchObservedRunningTime="2026-04-24 19:19:57.675845064 +0000 UTC m=+800.787681410" Apr 24 19:19:58.295845 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:58.295795 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.43:8643/healthz\": dial tcp 10.132.0.43:8643: connect: connection refused" Apr 24 19:19:58.296056 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:58.295795 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused" Apr 24 19:19:58.302557 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:58.302519 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" 
podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:19:58.302660 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:58.302522 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:19:58.661749 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:58.661719 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:58.663637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:58.663614 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:19:59.242283 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.242257 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:19:59.306197 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.306087 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73108d1e-c23f-4757-b27f-857d1b472a6c-proxy-tls\") pod \"73108d1e-c23f-4757-b27f-857d1b472a6c\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " Apr 24 19:19:59.306197 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.306161 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73108d1e-c23f-4757-b27f-857d1b472a6c-kserve-provision-location\") pod \"73108d1e-c23f-4757-b27f-857d1b472a6c\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " Apr 24 19:19:59.306439 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.306203 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/73108d1e-c23f-4757-b27f-857d1b472a6c-isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\") pod \"73108d1e-c23f-4757-b27f-857d1b472a6c\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " Apr 24 19:19:59.306439 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.306276 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-428l4\" (UniqueName: \"kubernetes.io/projected/73108d1e-c23f-4757-b27f-857d1b472a6c-kube-api-access-428l4\") pod \"73108d1e-c23f-4757-b27f-857d1b472a6c\" (UID: \"73108d1e-c23f-4757-b27f-857d1b472a6c\") " Apr 24 19:19:59.306572 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.306511 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73108d1e-c23f-4757-b27f-857d1b472a6c-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "73108d1e-c23f-4757-b27f-857d1b472a6c" (UID: "73108d1e-c23f-4757-b27f-857d1b472a6c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:19:59.306660 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.306635 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73108d1e-c23f-4757-b27f-857d1b472a6c-isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config") pod "73108d1e-c23f-4757-b27f-857d1b472a6c" (UID: "73108d1e-c23f-4757-b27f-857d1b472a6c"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:19:59.308375 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.308344 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73108d1e-c23f-4757-b27f-857d1b472a6c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "73108d1e-c23f-4757-b27f-857d1b472a6c" (UID: "73108d1e-c23f-4757-b27f-857d1b472a6c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:19:59.308375 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.308358 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73108d1e-c23f-4757-b27f-857d1b472a6c-kube-api-access-428l4" (OuterVolumeSpecName: "kube-api-access-428l4") pod "73108d1e-c23f-4757-b27f-857d1b472a6c" (UID: "73108d1e-c23f-4757-b27f-857d1b472a6c"). InnerVolumeSpecName "kube-api-access-428l4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:19:59.407064 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.407027 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73108d1e-c23f-4757-b27f-857d1b472a6c-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:19:59.407064 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.407061 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73108d1e-c23f-4757-b27f-857d1b472a6c-kserve-provision-location\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:19:59.407064 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.407072 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/73108d1e-c23f-4757-b27f-857d1b472a6c-isvc-xgboost-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:19:59.407329 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.407082 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-428l4\" (UniqueName: \"kubernetes.io/projected/73108d1e-c23f-4757-b27f-857d1b472a6c-kube-api-access-428l4\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:19:59.667193 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.667151 2568 generic.go:358] "Generic (PLEG): container finished" podID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerID="74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a" exitCode=0 Apr 24 19:19:59.667635 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.667227 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" Apr 24 19:19:59.667635 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.667243 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" event={"ID":"73108d1e-c23f-4757-b27f-857d1b472a6c","Type":"ContainerDied","Data":"74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a"} Apr 24 19:19:59.667635 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.667283 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76" event={"ID":"73108d1e-c23f-4757-b27f-857d1b472a6c","Type":"ContainerDied","Data":"5a99868d139b1f435f8d8df38921c2b3fe3a141358af6294fe338aa1a925636b"} Apr 24 19:19:59.667635 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.667301 2568 scope.go:117] "RemoveContainer" containerID="c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac" Apr 24 19:19:59.676037 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.676015 2568 scope.go:117] "RemoveContainer" containerID="74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a" Apr 24 19:19:59.688415 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.688382 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76"] Apr 24 19:19:59.691673 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.691654 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1911c-predictor-86b5f986fc-gqc76"] Apr 24 19:19:59.691741 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.691684 2568 scope.go:117] "RemoveContainer" containerID="91387c78e2c1926bfb6a0339396b92a10a24c3b75a224946dc58b831fef43e7a" Apr 24 19:19:59.699132 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.699093 2568 scope.go:117] 
"RemoveContainer" containerID="c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac" Apr 24 19:19:59.699392 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:19:59.699371 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac\": container with ID starting with c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac not found: ID does not exist" containerID="c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac" Apr 24 19:19:59.699476 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.699405 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac"} err="failed to get container status \"c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac\": rpc error: code = NotFound desc = could not find container \"c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac\": container with ID starting with c4f6664b08939af280be68b29dacbb1ed4a05dcc818e1d9dc4ea128d44b545ac not found: ID does not exist" Apr 24 19:19:59.699476 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.699434 2568 scope.go:117] "RemoveContainer" containerID="74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a" Apr 24 19:19:59.699720 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:19:59.699701 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a\": container with ID starting with 74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a not found: ID does not exist" containerID="74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a" Apr 24 19:19:59.699761 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.699729 2568 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a"} err="failed to get container status \"74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a\": rpc error: code = NotFound desc = could not find container \"74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a\": container with ID starting with 74e8fb8ea788da8ce27878f524037e55c5a34e054c942d4b9f7f66110edd819a not found: ID does not exist" Apr 24 19:19:59.699761 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.699745 2568 scope.go:117] "RemoveContainer" containerID="91387c78e2c1926bfb6a0339396b92a10a24c3b75a224946dc58b831fef43e7a" Apr 24 19:19:59.699956 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:19:59.699938 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91387c78e2c1926bfb6a0339396b92a10a24c3b75a224946dc58b831fef43e7a\": container with ID starting with 91387c78e2c1926bfb6a0339396b92a10a24c3b75a224946dc58b831fef43e7a not found: ID does not exist" containerID="91387c78e2c1926bfb6a0339396b92a10a24c3b75a224946dc58b831fef43e7a" Apr 24 19:19:59.700004 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:19:59.699961 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91387c78e2c1926bfb6a0339396b92a10a24c3b75a224946dc58b831fef43e7a"} err="failed to get container status \"91387c78e2c1926bfb6a0339396b92a10a24c3b75a224946dc58b831fef43e7a\": rpc error: code = NotFound desc = could not find container \"91387c78e2c1926bfb6a0339396b92a10a24c3b75a224946dc58b831fef43e7a\": container with ID starting with 91387c78e2c1926bfb6a0339396b92a10a24c3b75a224946dc58b831fef43e7a not found: ID does not exist" Apr 24 19:20:00.030801 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.030776 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:20:00.114421 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.114385 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-proxy-tls\") pod \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " Apr 24 19:20:00.114421 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.114428 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjxks\" (UniqueName: \"kubernetes.io/projected/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-kube-api-access-jjxks\") pod \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " Apr 24 19:20:00.114662 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.114467 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-kserve-provision-location\") pod \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " Apr 24 19:20:00.114662 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.114498 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\") pod \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\" (UID: \"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9\") " Apr 24 19:20:00.114795 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.114762 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" (UID: "7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:20:00.114866 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.114818 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config") pod "7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" (UID: "7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:20:00.114924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.114894 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-kserve-provision-location\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:20:00.114924 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.114911 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-isvc-sklearn-graph-raw-hpa-1911c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:20:00.116575 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.116552 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" (UID: "7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:20:00.116704 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.116682 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-kube-api-access-jjxks" (OuterVolumeSpecName: "kube-api-access-jjxks") pod "7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" (UID: "7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9"). InnerVolumeSpecName "kube-api-access-jjxks". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:20:00.216027 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.215941 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:20:00.216027 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.215971 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jjxks\" (UniqueName: \"kubernetes.io/projected/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9-kube-api-access-jjxks\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:20:00.672966 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.672931 2568 generic.go:358] "Generic (PLEG): container finished" podID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerID="37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990" exitCode=0 Apr 24 19:20:00.673421 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.673010 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" Apr 24 19:20:00.673421 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.673009 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" event={"ID":"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9","Type":"ContainerDied","Data":"37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990"} Apr 24 19:20:00.673421 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.673128 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f" event={"ID":"7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9","Type":"ContainerDied","Data":"93f3e59497dfc2f85b0824af1ca0bc9f404d1d90d2ffee6c05655ff4d3d805e1"} Apr 24 19:20:00.673421 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.673144 2568 scope.go:117] "RemoveContainer" containerID="cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794" Apr 24 19:20:00.683702 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.683679 2568 scope.go:117] "RemoveContainer" containerID="37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990" Apr 24 19:20:00.692645 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.692619 2568 scope.go:117] "RemoveContainer" containerID="48c71efff9a5b0cf50339374ebf298b45e6c94b7c4f343cf48a997887df701f0" Apr 24 19:20:00.695977 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.695951 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f"] Apr 24 19:20:00.699623 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.699597 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1911c-predictor-5f544595fd-fr92f"] Apr 24 19:20:00.701328 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.701308 2568 scope.go:117] 
"RemoveContainer" containerID="cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794" Apr 24 19:20:00.701582 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:20:00.701564 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794\": container with ID starting with cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794 not found: ID does not exist" containerID="cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794" Apr 24 19:20:00.701638 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.701594 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794"} err="failed to get container status \"cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794\": rpc error: code = NotFound desc = could not find container \"cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794\": container with ID starting with cefe46aca70e3536ce0e464b6af4d76f0df2c992088c65df3225309bde637794 not found: ID does not exist" Apr 24 19:20:00.701638 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.701619 2568 scope.go:117] "RemoveContainer" containerID="37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990" Apr 24 19:20:00.701877 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:20:00.701862 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990\": container with ID starting with 37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990 not found: ID does not exist" containerID="37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990" Apr 24 19:20:00.701923 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.701883 2568 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990"} err="failed to get container status \"37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990\": rpc error: code = NotFound desc = could not find container \"37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990\": container with ID starting with 37476be68d375bc20034a21e8ecc872a9bbef84662aff6e60473fb14a9f24990 not found: ID does not exist" Apr 24 19:20:00.701923 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.701898 2568 scope.go:117] "RemoveContainer" containerID="48c71efff9a5b0cf50339374ebf298b45e6c94b7c4f343cf48a997887df701f0" Apr 24 19:20:00.702151 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:20:00.702128 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48c71efff9a5b0cf50339374ebf298b45e6c94b7c4f343cf48a997887df701f0\": container with ID starting with 48c71efff9a5b0cf50339374ebf298b45e6c94b7c4f343cf48a997887df701f0 not found: ID does not exist" containerID="48c71efff9a5b0cf50339374ebf298b45e6c94b7c4f343cf48a997887df701f0" Apr 24 19:20:00.702210 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:00.702160 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48c71efff9a5b0cf50339374ebf298b45e6c94b7c4f343cf48a997887df701f0"} err="failed to get container status \"48c71efff9a5b0cf50339374ebf298b45e6c94b7c4f343cf48a997887df701f0\": rpc error: code = NotFound desc = could not find container \"48c71efff9a5b0cf50339374ebf298b45e6c94b7c4f343cf48a997887df701f0\": container with ID starting with 48c71efff9a5b0cf50339374ebf298b45e6c94b7c4f343cf48a997887df701f0 not found: ID does not exist" Apr 24 19:20:01.459828 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:01.459792 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" 
path="/var/lib/kubelet/pods/73108d1e-c23f-4757-b27f-857d1b472a6c/volumes" Apr 24 19:20:01.460316 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:01.460302 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" path="/var/lib/kubelet/pods/7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9/volumes" Apr 24 19:20:05.677236 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:05.677204 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:20:15.382894 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.382862 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4"] Apr 24 19:20:15.383338 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383321 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="storage-initializer" Apr 24 19:20:15.383338 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383339 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="storage-initializer" Apr 24 19:20:15.383416 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383354 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" Apr 24 19:20:15.383416 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383361 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" Apr 24 19:20:15.383416 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383369 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="storage-initializer" Apr 24 19:20:15.383416 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:20:15.383374 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="storage-initializer" Apr 24 19:20:15.383416 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383384 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" Apr 24 19:20:15.383416 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383389 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" Apr 24 19:20:15.383416 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383396 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kube-rbac-proxy" Apr 24 19:20:15.383416 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383401 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kube-rbac-proxy" Apr 24 19:20:15.383659 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383421 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kube-rbac-proxy" Apr 24 19:20:15.383659 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383427 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kube-rbac-proxy" Apr 24 19:20:15.383659 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383492 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kube-rbac-proxy" Apr 24 19:20:15.383659 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383500 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="73108d1e-c23f-4757-b27f-857d1b472a6c" containerName="kserve-container" Apr 24 19:20:15.383659 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:20:15.383507 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kserve-container" Apr 24 19:20:15.383659 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.383519 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ca8ad97-5405-4afe-8fc6-c4a1f0c545c9" containerName="kube-rbac-proxy" Apr 24 19:20:15.388698 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.388674 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.391254 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.391219 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-2ea81-predictor-serving-cert\"" Apr 24 19:20:15.391404 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.391376 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config\"" Apr 24 19:20:15.398702 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.398672 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4"] Apr 24 19:20:15.564017 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.563973 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79dffc7a-0681-4abb-ba23-01470dfb0cec-isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.564236 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.564032 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-g5xkx\" (UniqueName: \"kubernetes.io/projected/79dffc7a-0681-4abb-ba23-01470dfb0cec-kube-api-access-g5xkx\") pod \"isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.564236 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.564089 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79dffc7a-0681-4abb-ba23-01470dfb0cec-proxy-tls\") pod \"isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.564236 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.564159 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79dffc7a-0681-4abb-ba23-01470dfb0cec-kserve-provision-location\") pod \"isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.665764 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.665668 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5xkx\" (UniqueName: \"kubernetes.io/projected/79dffc7a-0681-4abb-ba23-01470dfb0cec-kube-api-access-g5xkx\") pod \"isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.665764 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.665715 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/79dffc7a-0681-4abb-ba23-01470dfb0cec-proxy-tls\") pod \"isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.665764 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.665742 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79dffc7a-0681-4abb-ba23-01470dfb0cec-kserve-provision-location\") pod \"isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.666050 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.665861 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79dffc7a-0681-4abb-ba23-01470dfb0cec-isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.666303 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.666274 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79dffc7a-0681-4abb-ba23-01470dfb0cec-kserve-provision-location\") pod \"isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.666691 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.666666 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/79dffc7a-0681-4abb-ba23-01470dfb0cec-isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.668504 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.668478 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79dffc7a-0681-4abb-ba23-01470dfb0cec-proxy-tls\") pod \"isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.673708 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.673660 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5xkx\" (UniqueName: \"kubernetes.io/projected/79dffc7a-0681-4abb-ba23-01470dfb0cec-kube-api-access-g5xkx\") pod \"isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.703163 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.703129 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:15.834814 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:15.834789 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4"] Apr 24 19:20:15.837288 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:20:15.837256 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79dffc7a_0681_4abb_ba23_01470dfb0cec.slice/crio-dcf8e3de76d0d8638ead7e5af246081531059c736ec42a4ba16940a3bd403381 WatchSource:0}: Error finding container dcf8e3de76d0d8638ead7e5af246081531059c736ec42a4ba16940a3bd403381: Status 404 returned error can't find the container with id dcf8e3de76d0d8638ead7e5af246081531059c736ec42a4ba16940a3bd403381 Apr 24 19:20:16.732964 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:16.732925 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" event={"ID":"79dffc7a-0681-4abb-ba23-01470dfb0cec","Type":"ContainerStarted","Data":"a8398797343b0ccb29359ec638eb8a07757c52c8448449445a0719fbfdcafdc8"} Apr 24 19:20:16.732964 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:16.732965 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" event={"ID":"79dffc7a-0681-4abb-ba23-01470dfb0cec","Type":"ContainerStarted","Data":"dcf8e3de76d0d8638ead7e5af246081531059c736ec42a4ba16940a3bd403381"} Apr 24 19:20:19.744503 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:19.744463 2568 generic.go:358] "Generic (PLEG): container finished" podID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerID="a8398797343b0ccb29359ec638eb8a07757c52c8448449445a0719fbfdcafdc8" exitCode=0 Apr 24 19:20:19.744882 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:19.744506 2568 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" event={"ID":"79dffc7a-0681-4abb-ba23-01470dfb0cec","Type":"ContainerDied","Data":"a8398797343b0ccb29359ec638eb8a07757c52c8448449445a0719fbfdcafdc8"} Apr 24 19:20:20.750898 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:20.750862 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" event={"ID":"79dffc7a-0681-4abb-ba23-01470dfb0cec","Type":"ContainerStarted","Data":"5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1"} Apr 24 19:20:20.750898 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:20.750900 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" event={"ID":"79dffc7a-0681-4abb-ba23-01470dfb0cec","Type":"ContainerStarted","Data":"1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99"} Apr 24 19:20:20.751375 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:20.750910 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" event={"ID":"79dffc7a-0681-4abb-ba23-01470dfb0cec","Type":"ContainerStarted","Data":"6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac"} Apr 24 19:20:20.751375 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:20.751206 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:20.751375 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:20.751358 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:20.752691 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:20.752665 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" 
podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 19:20:20.773390 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:20.773343 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podStartSLOduration=5.77332651 podStartE2EDuration="5.77332651s" podCreationTimestamp="2026-04-24 19:20:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:20:20.772122582 +0000 UTC m=+823.883958924" watchObservedRunningTime="2026-04-24 19:20:20.77332651 +0000 UTC m=+823.885162855" Apr 24 19:20:21.754511 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:21.754474 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:21.754954 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:21.754512 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 19:20:21.755510 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:21.755484 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:22.758015 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:22.757958 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 19:20:22.758435 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:22.758320 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:27.763078 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:27.763047 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:20:27.763741 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:27.763697 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 19:20:27.764069 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:27.764048 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:37.764311 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:37.764208 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 19:20:37.764685 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:37.764641 2568 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:47.764209 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:47.764155 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 19:20:47.764651 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:47.764618 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:57.764081 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:57.764037 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 19:20:57.764538 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:20:57.764453 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:21:07.763928 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:07.763879 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.45:8080: connect: connection refused" Apr 24 19:21:07.764446 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:07.764331 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:21:17.764161 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:17.764110 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 19:21:17.764644 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:17.764592 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:21:27.765035 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:27.764952 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:21:27.765457 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:27.765175 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:21:37.415343 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:37.415310 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/1.log" Apr 24 19:21:37.419188 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:37.419165 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log" Apr 24 19:21:37.419791 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:37.419765 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/1.log" Apr 24 19:21:37.423383 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:37.423364 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log" Apr 24 19:21:40.420642 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.420609 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv_c364dfc2-55dc-4e24-b4b8-27eee8cc3660/kserve-container/0.log" Apr 24 19:21:40.591743 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.591706 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4"] Apr 24 19:21:40.592158 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.592046 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container" containerID="cri-o://6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac" gracePeriod=30 Apr 24 19:21:40.592415 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.592065 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent" containerID="cri-o://5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1" gracePeriod=30 Apr 24 19:21:40.592522 ip-10-0-138-52 kubenswrapper[2568]: 
I0424 19:21:40.592142 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kube-rbac-proxy" containerID="cri-o://1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99" gracePeriod=30 Apr 24 19:21:40.631466 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.631437 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm"] Apr 24 19:21:40.635383 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.635360 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:40.638199 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.638174 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-dd3f9-predictor-serving-cert\"" Apr 24 19:21:40.638330 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.638206 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config\"" Apr 24 19:21:40.645269 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.645239 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm"] Apr 24 19:21:40.687487 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.687397 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv"] Apr 24 19:21:40.687772 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.687744 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" podUID="c364dfc2-55dc-4e24-b4b8-27eee8cc3660" 
containerName="kserve-container" containerID="cri-o://a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345" gracePeriod=30 Apr 24 19:21:40.687891 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.687840 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" podUID="c364dfc2-55dc-4e24-b4b8-27eee8cc3660" containerName="kube-rbac-proxy" containerID="cri-o://6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8" gracePeriod=30 Apr 24 19:21:40.755534 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.755495 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8c51954e-999a-4851-9c45-16f29af69591-isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:40.755701 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.755559 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tk27\" (UniqueName: \"kubernetes.io/projected/8c51954e-999a-4851-9c45-16f29af69591-kube-api-access-7tk27\") pod \"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:40.755701 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.755596 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c51954e-999a-4851-9c45-16f29af69591-kserve-provision-location\") pod 
\"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:40.755701 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.755633 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c51954e-999a-4851-9c45-16f29af69591-proxy-tls\") pod \"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:40.856738 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.856695 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8c51954e-999a-4851-9c45-16f29af69591-isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:40.856886 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.856765 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tk27\" (UniqueName: \"kubernetes.io/projected/8c51954e-999a-4851-9c45-16f29af69591-kube-api-access-7tk27\") pod \"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:40.856886 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.856795 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c51954e-999a-4851-9c45-16f29af69591-kserve-provision-location\") 
pod \"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:40.856886 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.856830 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c51954e-999a-4851-9c45-16f29af69591-proxy-tls\") pod \"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:40.857077 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:21:40.857040 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-serving-cert: secret "isvc-sklearn-scale-raw-dd3f9-predictor-serving-cert" not found Apr 24 19:21:40.857162 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:21:40.857130 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c51954e-999a-4851-9c45-16f29af69591-proxy-tls podName:8c51954e-999a-4851-9c45-16f29af69591 nodeName:}" failed. No retries permitted until 2026-04-24 19:21:41.357088774 +0000 UTC m=+904.468925112 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8c51954e-999a-4851-9c45-16f29af69591-proxy-tls") pod "isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" (UID: "8c51954e-999a-4851-9c45-16f29af69591") : secret "isvc-sklearn-scale-raw-dd3f9-predictor-serving-cert" not found Apr 24 19:21:40.857271 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.857245 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c51954e-999a-4851-9c45-16f29af69591-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:40.857434 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.857415 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8c51954e-999a-4851-9c45-16f29af69591-isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:40.866845 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.866809 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tk27\" (UniqueName: \"kubernetes.io/projected/8c51954e-999a-4851-9c45-16f29af69591-kube-api-access-7tk27\") pod \"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:40.924518 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:40.924490 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:21:41.052234 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.052133 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" event={"ID":"79dffc7a-0681-4abb-ba23-01470dfb0cec","Type":"ContainerDied","Data":"1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99"} Apr 24 19:21:41.052234 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.052135 2568 generic.go:358] "Generic (PLEG): container finished" podID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerID="1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99" exitCode=2 Apr 24 19:21:41.053587 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.053564 2568 generic.go:358] "Generic (PLEG): container finished" podID="c364dfc2-55dc-4e24-b4b8-27eee8cc3660" containerID="6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8" exitCode=2 Apr 24 19:21:41.053587 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.053583 2568 generic.go:358] "Generic (PLEG): container finished" podID="c364dfc2-55dc-4e24-b4b8-27eee8cc3660" containerID="a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345" exitCode=2 Apr 24 19:21:41.053744 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.053592 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" event={"ID":"c364dfc2-55dc-4e24-b4b8-27eee8cc3660","Type":"ContainerDied","Data":"6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8"} Apr 24 19:21:41.053744 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.053630 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" 
event={"ID":"c364dfc2-55dc-4e24-b4b8-27eee8cc3660","Type":"ContainerDied","Data":"a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345"} Apr 24 19:21:41.053744 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.053640 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" Apr 24 19:21:41.053744 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.053647 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv" event={"ID":"c364dfc2-55dc-4e24-b4b8-27eee8cc3660","Type":"ContainerDied","Data":"a18a5d01522573a6293ab65df51691160c4cb0b3577a02eb929ed75de7c0deed"} Apr 24 19:21:41.053744 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.053665 2568 scope.go:117] "RemoveContainer" containerID="6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8" Apr 24 19:21:41.058616 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.058592 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-proxy-tls\") pod \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\" (UID: \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\") " Apr 24 19:21:41.058741 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.058629 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-2ea81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-message-dumper-raw-2ea81-kube-rbac-proxy-sar-config\") pod \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\" (UID: \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\") " Apr 24 19:21:41.058741 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.058670 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w2d2\" (UniqueName: 
\"kubernetes.io/projected/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-kube-api-access-9w2d2\") pod \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\" (UID: \"c364dfc2-55dc-4e24-b4b8-27eee8cc3660\") " Apr 24 19:21:41.059017 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.058996 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-message-dumper-raw-2ea81-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-2ea81-kube-rbac-proxy-sar-config") pod "c364dfc2-55dc-4e24-b4b8-27eee8cc3660" (UID: "c364dfc2-55dc-4e24-b4b8-27eee8cc3660"). InnerVolumeSpecName "message-dumper-raw-2ea81-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:21:41.060606 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.060579 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c364dfc2-55dc-4e24-b4b8-27eee8cc3660" (UID: "c364dfc2-55dc-4e24-b4b8-27eee8cc3660"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:21:41.060722 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.060701 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-kube-api-access-9w2d2" (OuterVolumeSpecName: "kube-api-access-9w2d2") pod "c364dfc2-55dc-4e24-b4b8-27eee8cc3660" (UID: "c364dfc2-55dc-4e24-b4b8-27eee8cc3660"). InnerVolumeSpecName "kube-api-access-9w2d2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:21:41.062352 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.062339 2568 scope.go:117] "RemoveContainer" containerID="a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345" Apr 24 19:21:41.070001 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.069979 2568 scope.go:117] "RemoveContainer" containerID="6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8" Apr 24 19:21:41.070306 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:21:41.070289 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8\": container with ID starting with 6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8 not found: ID does not exist" containerID="6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8" Apr 24 19:21:41.070363 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.070315 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8"} err="failed to get container status \"6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8\": rpc error: code = NotFound desc = could not find container \"6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8\": container with ID starting with 6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8 not found: ID does not exist" Apr 24 19:21:41.070363 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.070332 2568 scope.go:117] "RemoveContainer" containerID="a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345" Apr 24 19:21:41.070588 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:21:41.070569 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345\": container with ID starting with a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345 not found: ID does not exist" containerID="a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345" Apr 24 19:21:41.070638 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.070596 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345"} err="failed to get container status \"a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345\": rpc error: code = NotFound desc = could not find container \"a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345\": container with ID starting with a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345 not found: ID does not exist" Apr 24 19:21:41.070638 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.070615 2568 scope.go:117] "RemoveContainer" containerID="6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8" Apr 24 19:21:41.070828 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.070808 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8"} err="failed to get container status \"6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8\": rpc error: code = NotFound desc = could not find container \"6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8\": container with ID starting with 6ddd242a6335d1220e15f0f62601222b218fb1554f3491d49c0ba883bd231fe8 not found: ID does not exist" Apr 24 19:21:41.070872 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.070828 2568 scope.go:117] "RemoveContainer" containerID="a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345" Apr 24 19:21:41.071024 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.071001 2568 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345"} err="failed to get container status \"a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345\": rpc error: code = NotFound desc = could not find container \"a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345\": container with ID starting with a00a5c588af255c728769f468b552a925195530dac611704e8b8da1a0ec5a345 not found: ID does not exist" Apr 24 19:21:41.159672 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.159621 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:21:41.159672 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.159666 2568 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-2ea81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-message-dumper-raw-2ea81-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:21:41.159672 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.159678 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9w2d2\" (UniqueName: \"kubernetes.io/projected/c364dfc2-55dc-4e24-b4b8-27eee8cc3660-kube-api-access-9w2d2\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:21:41.362561 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.362516 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c51954e-999a-4851-9c45-16f29af69591-proxy-tls\") pod \"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 
19:21:41.365225 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.365200 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c51954e-999a-4851-9c45-16f29af69591-proxy-tls\") pod \"isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:41.380232 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.380192 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv"] Apr 24 19:21:41.381933 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.381905 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-2ea81-predictor-6b67b99d4c-rw8sv"] Apr 24 19:21:41.459982 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.459947 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c364dfc2-55dc-4e24-b4b8-27eee8cc3660" path="/var/lib/kubelet/pods/c364dfc2-55dc-4e24-b4b8-27eee8cc3660/volumes" Apr 24 19:21:41.548373 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.548330 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:41.687925 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:41.687897 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm"] Apr 24 19:21:41.690397 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:21:41.690369 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c51954e_999a_4851_9c45_16f29af69591.slice/crio-03e3bb1a8857fc5d9e1c7647c1c626178f86422b239110ce2a5c7d42b13b0703 WatchSource:0}: Error finding container 03e3bb1a8857fc5d9e1c7647c1c626178f86422b239110ce2a5c7d42b13b0703: Status 404 returned error can't find the container with id 03e3bb1a8857fc5d9e1c7647c1c626178f86422b239110ce2a5c7d42b13b0703 Apr 24 19:21:42.059709 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:42.059620 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" event={"ID":"8c51954e-999a-4851-9c45-16f29af69591","Type":"ContainerStarted","Data":"8884c3a4b5d8b25f522a2b04034b8f710512e1cc04e1afe36b137202d7e09900"} Apr 24 19:21:42.059709 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:42.059656 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" event={"ID":"8c51954e-999a-4851-9c45-16f29af69591","Type":"ContainerStarted","Data":"03e3bb1a8857fc5d9e1c7647c1c626178f86422b239110ce2a5c7d42b13b0703"} Apr 24 19:21:42.758806 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:42.758761 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.45:8643/healthz\": dial tcp 
10.132.0.45:8643: connect: connection refused" Apr 24 19:21:46.076207 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:46.076172 2568 generic.go:358] "Generic (PLEG): container finished" podID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerID="6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac" exitCode=0 Apr 24 19:21:46.076673 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:46.076248 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" event={"ID":"79dffc7a-0681-4abb-ba23-01470dfb0cec","Type":"ContainerDied","Data":"6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac"} Apr 24 19:21:46.077516 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:46.077496 2568 generic.go:358] "Generic (PLEG): container finished" podID="8c51954e-999a-4851-9c45-16f29af69591" containerID="8884c3a4b5d8b25f522a2b04034b8f710512e1cc04e1afe36b137202d7e09900" exitCode=0 Apr 24 19:21:46.077610 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:46.077566 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" event={"ID":"8c51954e-999a-4851-9c45-16f29af69591","Type":"ContainerDied","Data":"8884c3a4b5d8b25f522a2b04034b8f710512e1cc04e1afe36b137202d7e09900"} Apr 24 19:21:47.083195 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:47.083156 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" event={"ID":"8c51954e-999a-4851-9c45-16f29af69591","Type":"ContainerStarted","Data":"e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a"} Apr 24 19:21:47.083195 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:47.083199 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" 
event={"ID":"8c51954e-999a-4851-9c45-16f29af69591","Type":"ContainerStarted","Data":"6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b"} Apr 24 19:21:47.083660 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:47.083516 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:47.083660 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:47.083648 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:47.084970 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:47.084946 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 19:21:47.109410 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:47.109358 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podStartSLOduration=7.109344933 podStartE2EDuration="7.109344933s" podCreationTimestamp="2026-04-24 19:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:21:47.104469016 +0000 UTC m=+910.216305361" watchObservedRunningTime="2026-04-24 19:21:47.109344933 +0000 UTC m=+910.221181279" Apr 24 19:21:47.758237 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:47.758194 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.45:8643/healthz\": 
dial tcp 10.132.0.45:8643: connect: connection refused" Apr 24 19:21:47.763588 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:47.763559 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 19:21:47.763928 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:47.763908 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:21:48.087135 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:48.087009 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 19:21:52.758764 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:52.758709 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.45:8643/healthz\": dial tcp 10.132.0.45:8643: connect: connection refused" Apr 24 19:21:52.759205 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:52.758846 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:21:53.092128 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:53.092032 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:21:53.092632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:53.092606 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 19:21:57.758301 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:57.758254 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.45:8643/healthz\": dial tcp 10.132.0.45:8643: connect: connection refused" Apr 24 19:21:57.763719 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:57.763681 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 19:21:57.764008 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:21:57.763990 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:22:02.758669 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:02.758618 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.45:8643/healthz\": dial tcp 10.132.0.45:8643: 
connect: connection refused" Apr 24 19:22:03.092973 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:03.092881 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 19:22:07.758790 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:07.758737 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.45:8643/healthz\": dial tcp 10.132.0.45:8643: connect: connection refused" Apr 24 19:22:07.764218 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:07.764166 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 19:22:07.764382 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:07.764361 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:22:07.764543 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:07.764525 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:22:07.764607 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:07.764599 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:22:10.742353 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.742325 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" Apr 24 19:22:10.819855 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.819814 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79dffc7a-0681-4abb-ba23-01470dfb0cec-kserve-provision-location\") pod \"79dffc7a-0681-4abb-ba23-01470dfb0cec\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " Apr 24 19:22:10.820059 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.819870 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5xkx\" (UniqueName: \"kubernetes.io/projected/79dffc7a-0681-4abb-ba23-01470dfb0cec-kube-api-access-g5xkx\") pod \"79dffc7a-0681-4abb-ba23-01470dfb0cec\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " Apr 24 19:22:10.820059 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.819903 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79dffc7a-0681-4abb-ba23-01470dfb0cec-proxy-tls\") pod \"79dffc7a-0681-4abb-ba23-01470dfb0cec\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " Apr 24 19:22:10.820059 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.819949 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79dffc7a-0681-4abb-ba23-01470dfb0cec-isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config\") pod \"79dffc7a-0681-4abb-ba23-01470dfb0cec\" (UID: \"79dffc7a-0681-4abb-ba23-01470dfb0cec\") " Apr 24 19:22:10.820275 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.820191 2568 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79dffc7a-0681-4abb-ba23-01470dfb0cec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "79dffc7a-0681-4abb-ba23-01470dfb0cec" (UID: "79dffc7a-0681-4abb-ba23-01470dfb0cec"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:22:10.820444 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.820418 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79dffc7a-0681-4abb-ba23-01470dfb0cec-kserve-provision-location\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\""
Apr 24 19:22:10.820579 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.820465 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79dffc7a-0681-4abb-ba23-01470dfb0cec-isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config") pod "79dffc7a-0681-4abb-ba23-01470dfb0cec" (UID: "79dffc7a-0681-4abb-ba23-01470dfb0cec"). InnerVolumeSpecName "isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:22:10.822050 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.822026 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79dffc7a-0681-4abb-ba23-01470dfb0cec-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "79dffc7a-0681-4abb-ba23-01470dfb0cec" (UID: "79dffc7a-0681-4abb-ba23-01470dfb0cec"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:22:10.822165 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.822080 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79dffc7a-0681-4abb-ba23-01470dfb0cec-kube-api-access-g5xkx" (OuterVolumeSpecName: "kube-api-access-g5xkx") pod "79dffc7a-0681-4abb-ba23-01470dfb0cec" (UID: "79dffc7a-0681-4abb-ba23-01470dfb0cec"). InnerVolumeSpecName "kube-api-access-g5xkx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:22:10.921738 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.921702 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79dffc7a-0681-4abb-ba23-01470dfb0cec-isvc-logger-raw-2ea81-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\""
Apr 24 19:22:10.921738 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.921734 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5xkx\" (UniqueName: \"kubernetes.io/projected/79dffc7a-0681-4abb-ba23-01470dfb0cec-kube-api-access-g5xkx\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\""
Apr 24 19:22:10.921738 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:10.921746 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79dffc7a-0681-4abb-ba23-01470dfb0cec-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\""
Apr 24 19:22:11.171689 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.171647 2568 generic.go:358] "Generic (PLEG): container finished" podID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerID="5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1" exitCode=0
Apr 24 19:22:11.171893 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.171732 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" event={"ID":"79dffc7a-0681-4abb-ba23-01470dfb0cec","Type":"ContainerDied","Data":"5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1"}
Apr 24 19:22:11.171893 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.171769 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4" event={"ID":"79dffc7a-0681-4abb-ba23-01470dfb0cec","Type":"ContainerDied","Data":"dcf8e3de76d0d8638ead7e5af246081531059c736ec42a4ba16940a3bd403381"}
Apr 24 19:22:11.171893 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.171785 2568 scope.go:117] "RemoveContainer" containerID="5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1"
Apr 24 19:22:11.171893 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.171748 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4"
Apr 24 19:22:11.180865 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.180846 2568 scope.go:117] "RemoveContainer" containerID="1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99"
Apr 24 19:22:11.189076 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.189057 2568 scope.go:117] "RemoveContainer" containerID="6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac"
Apr 24 19:22:11.195858 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.195826 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4"]
Apr 24 19:22:11.198042 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.198016 2568 scope.go:117] "RemoveContainer" containerID="a8398797343b0ccb29359ec638eb8a07757c52c8448449445a0719fbfdcafdc8"
Apr 24 19:22:11.199515 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.199492 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-2ea81-predictor-664ffb7f86-k5mr4"]
Apr 24 19:22:11.208314 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.206863 2568 scope.go:117] "RemoveContainer" containerID="5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1"
Apr 24 19:22:11.208314 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:22:11.207491 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1\": container with ID starting with 5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1 not found: ID does not exist" containerID="5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1"
Apr 24 19:22:11.208314 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.207528 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1"} err="failed to get container status \"5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1\": rpc error: code = NotFound desc = could not find container \"5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1\": container with ID starting with 5eca91150807486de1222ce1e1b403fbc5381369a8ed4e0d7da73b641990dcb1 not found: ID does not exist"
Apr 24 19:22:11.208314 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.207555 2568 scope.go:117] "RemoveContainer" containerID="1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99"
Apr 24 19:22:11.208314 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:22:11.207846 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99\": container with ID starting with 1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99 not found: ID does not exist" containerID="1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99"
Apr 24 19:22:11.208314 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.207877 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99"} err="failed to get container status \"1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99\": rpc error: code = NotFound desc = could not find container \"1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99\": container with ID starting with 1bd3616e2369524bc81f1d8a70ef1b825cc67aa66dcd04242f36359dc39e2c99 not found: ID does not exist"
Apr 24 19:22:11.208314 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.207896 2568 scope.go:117] "RemoveContainer" containerID="6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac"
Apr 24 19:22:11.208314 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:22:11.208284 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac\": container with ID starting with 6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac not found: ID does not exist" containerID="6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac"
Apr 24 19:22:11.208644 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.208314 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac"} err="failed to get container status \"6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac\": rpc error: code = NotFound desc = could not find container \"6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac\": container with ID starting with 6b1f8e1e96b6c2c8cbfd556505c7a3cc5de3a377a0047439332ea9aff38f0bac not found: ID does not exist"
Apr 24 19:22:11.208644 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.208334 2568 scope.go:117] "RemoveContainer" containerID="a8398797343b0ccb29359ec638eb8a07757c52c8448449445a0719fbfdcafdc8"
Apr 24 19:22:11.208802 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:22:11.208785 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8398797343b0ccb29359ec638eb8a07757c52c8448449445a0719fbfdcafdc8\": container with ID starting with a8398797343b0ccb29359ec638eb8a07757c52c8448449445a0719fbfdcafdc8 not found: ID does not exist" containerID="a8398797343b0ccb29359ec638eb8a07757c52c8448449445a0719fbfdcafdc8"
Apr 24 19:22:11.208859 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.208808 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8398797343b0ccb29359ec638eb8a07757c52c8448449445a0719fbfdcafdc8"} err="failed to get container status \"a8398797343b0ccb29359ec638eb8a07757c52c8448449445a0719fbfdcafdc8\": rpc error: code = NotFound desc = could not find container \"a8398797343b0ccb29359ec638eb8a07757c52c8448449445a0719fbfdcafdc8\": container with ID starting with a8398797343b0ccb29359ec638eb8a07757c52c8448449445a0719fbfdcafdc8 not found: ID does not exist"
Apr 24 19:22:11.465307 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:11.465208 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" path="/var/lib/kubelet/pods/79dffc7a-0681-4abb-ba23-01470dfb0cec/volumes"
Apr 24 19:22:13.093632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:13.093585 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:22:23.093506 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:23.093463 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:22:33.092749 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:33.092707 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:22:37.454396 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:37.454358 2568 scope.go:117] "RemoveContainer" containerID="b8f3fd15106c77c72092873aa8c0c90b1dd51af862a0008045775a96596fb943"
Apr 24 19:22:43.093546 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:43.093503 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:22:53.092567 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:22:53.092525 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:23:03.093256 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:23:03.093215 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:23:13.092728 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:23:13.092681 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:23:13.455263 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:23:13.455218 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:23:23.455648 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:23:23.455607 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:23:33.455721 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:23:33.455675 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:23:37.476615 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:23:37.476585 2568 scope.go:117] "RemoveContainer" containerID="075b52bd8e6724e2f24316054720e2f6963bb6f670d70818f0653d3135f36d8d"
Apr 24 19:23:37.484908 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:23:37.484883 2568 scope.go:117] "RemoveContainer" containerID="80737b72386b452129c30ef6e7ea0c8fb94bf25cea6af4efbff45faa48a588f1"
Apr 24 19:23:43.455987 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:23:43.455944 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:23:53.456214 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:23:53.456170 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:24:03.459395 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:03.459361 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm"
Apr 24 19:24:10.813072 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.813030 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm"]
Apr 24 19:24:10.813682 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.813389 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" containerID="cri-o://6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b" gracePeriod=30
Apr 24 19:24:10.813682 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.813434 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kube-rbac-proxy" containerID="cri-o://e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a" gracePeriod=30
Apr 24 19:24:10.892370 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892333 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"]
Apr 24 19:24:10.892764 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892747 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c364dfc2-55dc-4e24-b4b8-27eee8cc3660" containerName="kserve-container"
Apr 24 19:24:10.892764 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892766 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c364dfc2-55dc-4e24-b4b8-27eee8cc3660" containerName="kserve-container"
Apr 24 19:24:10.892887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892775 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="storage-initializer"
Apr 24 19:24:10.892887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892781 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="storage-initializer"
Apr 24 19:24:10.892887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892789 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container"
Apr 24 19:24:10.892887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892796 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container"
Apr 24 19:24:10.892887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892810 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c364dfc2-55dc-4e24-b4b8-27eee8cc3660" containerName="kube-rbac-proxy"
Apr 24 19:24:10.892887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892820 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c364dfc2-55dc-4e24-b4b8-27eee8cc3660" containerName="kube-rbac-proxy"
Apr 24 19:24:10.892887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892839 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kube-rbac-proxy"
Apr 24 19:24:10.892887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892845 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kube-rbac-proxy"
Apr 24 19:24:10.892887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892852 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent"
Apr 24 19:24:10.892887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892857 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent"
Apr 24 19:24:10.893323 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892915 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kserve-container"
Apr 24 19:24:10.893323 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892924 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="kube-rbac-proxy"
Apr 24 19:24:10.893323 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892930 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c364dfc2-55dc-4e24-b4b8-27eee8cc3660" containerName="kserve-container"
Apr 24 19:24:10.893323 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892942 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="79dffc7a-0681-4abb-ba23-01470dfb0cec" containerName="agent"
Apr 24 19:24:10.893323 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.892950 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c364dfc2-55dc-4e24-b4b8-27eee8cc3660" containerName="kube-rbac-proxy"
Apr 24 19:24:10.896599 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.896578 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:10.899364 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.899335 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-b4ed0b-predictor-serving-cert\""
Apr 24 19:24:10.899486 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.899348 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-b4ed0b-kube-rbac-proxy-sar-config\""
Apr 24 19:24:10.908233 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.908205 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"]
Apr 24 19:24:10.991244 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.991194 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-b4ed0b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-isvc-primary-b4ed0b-kube-rbac-proxy-sar-config\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:10.991460 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.991358 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-kserve-provision-location\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:10.991460 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.991410 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-proxy-tls\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:10.991567 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:10.991464 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkgm7\" (UniqueName: \"kubernetes.io/projected/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-kube-api-access-bkgm7\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:11.092752 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.092651 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-kserve-provision-location\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:11.092752 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.092704 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-proxy-tls\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:11.092752 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.092738 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkgm7\" (UniqueName: \"kubernetes.io/projected/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-kube-api-access-bkgm7\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:11.093058 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.092763 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-b4ed0b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-isvc-primary-b4ed0b-kube-rbac-proxy-sar-config\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:11.093058 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:24:11.092867 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-serving-cert: secret "isvc-primary-b4ed0b-predictor-serving-cert" not found
Apr 24 19:24:11.093058 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:24:11.092949 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-proxy-tls podName:a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86 nodeName:}" failed. No retries permitted until 2026-04-24 19:24:11.592927526 +0000 UTC m=+1054.704763864 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-proxy-tls") pod "isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" (UID: "a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86") : secret "isvc-primary-b4ed0b-predictor-serving-cert" not found
Apr 24 19:24:11.093251 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.093144 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-kserve-provision-location\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:11.093477 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.093457 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-b4ed0b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-isvc-primary-b4ed0b-kube-rbac-proxy-sar-config\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:11.101819 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.101790 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkgm7\" (UniqueName: \"kubernetes.io/projected/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-kube-api-access-bkgm7\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:11.596964 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.596923 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-proxy-tls\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:11.599535 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.599508 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-proxy-tls\") pod \"isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:11.619704 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.619671 2568 generic.go:358] "Generic (PLEG): container finished" podID="8c51954e-999a-4851-9c45-16f29af69591" containerID="e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a" exitCode=2
Apr 24 19:24:11.619844 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.619739 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" event={"ID":"8c51954e-999a-4851-9c45-16f29af69591","Type":"ContainerDied","Data":"e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a"}
Apr 24 19:24:11.808039 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.807994 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:11.938334 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.938307 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"]
Apr 24 19:24:11.940818 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:24:11.940790 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda60bdf07_5eab_4ed3_9e8a_3f0f42af0a86.slice/crio-1940c777f1ac08b77df76025b18231250595d24b0c0d8aa8662b7c5de3634d68 WatchSource:0}: Error finding container 1940c777f1ac08b77df76025b18231250595d24b0c0d8aa8662b7c5de3634d68: Status 404 returned error can't find the container with id 1940c777f1ac08b77df76025b18231250595d24b0c0d8aa8662b7c5de3634d68
Apr 24 19:24:11.942874 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:11.942856 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 19:24:12.624925 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:12.624885 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" event={"ID":"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86","Type":"ContainerStarted","Data":"aa00b04ef1c893d35a7c05ab397fd4f48d983621585069187f61faa4709bc1fb"}
Apr 24 19:24:12.624925 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:12.624924 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" event={"ID":"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86","Type":"ContainerStarted","Data":"1940c777f1ac08b77df76025b18231250595d24b0c0d8aa8662b7c5de3634d68"}
Apr 24 19:24:13.087925 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:13.087826 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused"
Apr 24 19:24:13.455771 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:13.455725 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:24:16.642817 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:16.642779 2568 generic.go:358] "Generic (PLEG): container finished" podID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerID="aa00b04ef1c893d35a7c05ab397fd4f48d983621585069187f61faa4709bc1fb" exitCode=0
Apr 24 19:24:16.643297 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:16.642852 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" event={"ID":"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86","Type":"ContainerDied","Data":"aa00b04ef1c893d35a7c05ab397fd4f48d983621585069187f61faa4709bc1fb"}
Apr 24 19:24:17.647667 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:17.647621 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" event={"ID":"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86","Type":"ContainerStarted","Data":"094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8"}
Apr 24 19:24:17.647667 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:17.647668 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" event={"ID":"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86","Type":"ContainerStarted","Data":"8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1"}
Apr 24 19:24:17.648210 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:17.647867 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:17.667966 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:17.667925 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" podStartSLOduration=7.667912346 podStartE2EDuration="7.667912346s" podCreationTimestamp="2026-04-24 19:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:24:17.666364983 +0000 UTC m=+1060.778201329" watchObservedRunningTime="2026-04-24 19:24:17.667912346 +0000 UTC m=+1060.779748728"
Apr 24 19:24:18.087964 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:18.087860 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused"
Apr 24 19:24:18.651074 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:18.651039 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"
Apr 24 19:24:18.652540 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:18.652512 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 24 19:24:19.654256 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:19.654221 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 24 19:24:20.268480 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.268456 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm"
Apr 24 19:24:20.380033 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.379996 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tk27\" (UniqueName: \"kubernetes.io/projected/8c51954e-999a-4851-9c45-16f29af69591-kube-api-access-7tk27\") pod \"8c51954e-999a-4851-9c45-16f29af69591\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") "
Apr 24 19:24:20.380033 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.380034 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c51954e-999a-4851-9c45-16f29af69591-proxy-tls\") pod \"8c51954e-999a-4851-9c45-16f29af69591\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") "
Apr 24 19:24:20.380294 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.380083 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c51954e-999a-4851-9c45-16f29af69591-kserve-provision-location\") pod \"8c51954e-999a-4851-9c45-16f29af69591\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") "
Apr 24 19:24:20.380294 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.380145 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8c51954e-999a-4851-9c45-16f29af69591-isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config\") pod \"8c51954e-999a-4851-9c45-16f29af69591\" (UID: \"8c51954e-999a-4851-9c45-16f29af69591\") "
Apr 24 19:24:20.380510 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.380482 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c51954e-999a-4851-9c45-16f29af69591-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8c51954e-999a-4851-9c45-16f29af69591" (UID: "8c51954e-999a-4851-9c45-16f29af69591"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:24:20.380579 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.380503 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c51954e-999a-4851-9c45-16f29af69591-isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config") pod "8c51954e-999a-4851-9c45-16f29af69591" (UID: "8c51954e-999a-4851-9c45-16f29af69591"). InnerVolumeSpecName "isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:24:20.382141 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.382118 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c51954e-999a-4851-9c45-16f29af69591-kube-api-access-7tk27" (OuterVolumeSpecName: "kube-api-access-7tk27") pod "8c51954e-999a-4851-9c45-16f29af69591" (UID: "8c51954e-999a-4851-9c45-16f29af69591"). InnerVolumeSpecName "kube-api-access-7tk27".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:24:20.382276 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.382257 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c51954e-999a-4851-9c45-16f29af69591-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8c51954e-999a-4851-9c45-16f29af69591" (UID: "8c51954e-999a-4851-9c45-16f29af69591"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:24:20.481226 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.481187 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7tk27\" (UniqueName: \"kubernetes.io/projected/8c51954e-999a-4851-9c45-16f29af69591-kube-api-access-7tk27\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:24:20.481226 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.481223 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c51954e-999a-4851-9c45-16f29af69591-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:24:20.481226 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.481234 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c51954e-999a-4851-9c45-16f29af69591-kserve-provision-location\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:24:20.481472 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.481244 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8c51954e-999a-4851-9c45-16f29af69591-isvc-sklearn-scale-raw-dd3f9-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:24:20.659829 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.659736 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="8c51954e-999a-4851-9c45-16f29af69591" containerID="6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b" exitCode=0 Apr 24 19:24:20.659829 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.659822 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" Apr 24 19:24:20.660324 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.659830 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" event={"ID":"8c51954e-999a-4851-9c45-16f29af69591","Type":"ContainerDied","Data":"6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b"} Apr 24 19:24:20.660324 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.659864 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm" event={"ID":"8c51954e-999a-4851-9c45-16f29af69591","Type":"ContainerDied","Data":"03e3bb1a8857fc5d9e1c7647c1c626178f86422b239110ce2a5c7d42b13b0703"} Apr 24 19:24:20.660324 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.659879 2568 scope.go:117] "RemoveContainer" containerID="e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a" Apr 24 19:24:20.668994 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.668975 2568 scope.go:117] "RemoveContainer" containerID="6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b" Apr 24 19:24:20.677268 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.677245 2568 scope.go:117] "RemoveContainer" containerID="8884c3a4b5d8b25f522a2b04034b8f710512e1cc04e1afe36b137202d7e09900" Apr 24 19:24:20.682335 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.682305 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm"] Apr 24 19:24:20.685974 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:24:20.685954 2568 scope.go:117] "RemoveContainer" containerID="e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a" Apr 24 19:24:20.685974 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.685966 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-dd3f9-predictor-7c6474bc68-4rdgm"] Apr 24 19:24:20.686327 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:24:20.686302 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a\": container with ID starting with e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a not found: ID does not exist" containerID="e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a" Apr 24 19:24:20.686391 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.686334 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a"} err="failed to get container status \"e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a\": rpc error: code = NotFound desc = could not find container \"e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a\": container with ID starting with e8cb00ac6e3dda4421d6137d8a57e4f9fd0ef288577cfee87b292d2b2cc58a4a not found: ID does not exist" Apr 24 19:24:20.686391 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.686353 2568 scope.go:117] "RemoveContainer" containerID="6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b" Apr 24 19:24:20.686595 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:24:20.686578 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b\": container with ID starting with 
6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b not found: ID does not exist" containerID="6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b" Apr 24 19:24:20.686645 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.686600 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b"} err="failed to get container status \"6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b\": rpc error: code = NotFound desc = could not find container \"6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b\": container with ID starting with 6f5c2838e639c4ea298b386cdde123b5e3e07e5a478452974565deac08489b0b not found: ID does not exist" Apr 24 19:24:20.686645 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.686618 2568 scope.go:117] "RemoveContainer" containerID="8884c3a4b5d8b25f522a2b04034b8f710512e1cc04e1afe36b137202d7e09900" Apr 24 19:24:20.686882 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:24:20.686865 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8884c3a4b5d8b25f522a2b04034b8f710512e1cc04e1afe36b137202d7e09900\": container with ID starting with 8884c3a4b5d8b25f522a2b04034b8f710512e1cc04e1afe36b137202d7e09900 not found: ID does not exist" containerID="8884c3a4b5d8b25f522a2b04034b8f710512e1cc04e1afe36b137202d7e09900" Apr 24 19:24:20.686938 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:20.686886 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8884c3a4b5d8b25f522a2b04034b8f710512e1cc04e1afe36b137202d7e09900"} err="failed to get container status \"8884c3a4b5d8b25f522a2b04034b8f710512e1cc04e1afe36b137202d7e09900\": rpc error: code = NotFound desc = could not find container \"8884c3a4b5d8b25f522a2b04034b8f710512e1cc04e1afe36b137202d7e09900\": container with ID starting with 
8884c3a4b5d8b25f522a2b04034b8f710512e1cc04e1afe36b137202d7e09900 not found: ID does not exist" Apr 24 19:24:21.459185 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:21.459146 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c51954e-999a-4851-9c45-16f29af69591" path="/var/lib/kubelet/pods/8c51954e-999a-4851-9c45-16f29af69591/volumes" Apr 24 19:24:24.658927 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:24.658843 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" Apr 24 19:24:24.659478 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:24.659447 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 24 19:24:34.659470 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:34.659431 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 24 19:24:44.659712 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:44.659666 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 24 19:24:54.659955 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:24:54.659911 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 24 19:25:04.660334 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:04.660288 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 24 19:25:14.659872 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:14.659831 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 24 19:25:24.660271 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:24.660234 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" Apr 24 19:25:31.084792 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.084750 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85"] Apr 24 19:25:31.085250 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.085234 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="storage-initializer" Apr 24 19:25:31.085310 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.085252 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="storage-initializer" Apr 24 19:25:31.085310 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.085264 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" Apr 
24 19:25:31.085310 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.085270 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" Apr 24 19:25:31.085310 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.085292 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kube-rbac-proxy" Apr 24 19:25:31.085310 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.085300 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kube-rbac-proxy" Apr 24 19:25:31.085470 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.085363 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kserve-container" Apr 24 19:25:31.085470 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.085383 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c51954e-999a-4851-9c45-16f29af69591" containerName="kube-rbac-proxy" Apr 24 19:25:31.088777 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.088760 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.091483 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.091461 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-b4ed0b-predictor-serving-cert\"" Apr 24 19:25:31.091483 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.091479 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 19:25:31.091708 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.091494 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-b4ed0b\"" Apr 24 19:25:31.091708 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.091566 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-b4ed0b-dockercfg-wqg6q\"" Apr 24 19:25:31.091708 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.091479 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config\"" Apr 24 19:25:31.097342 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.097319 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85"] Apr 24 19:25:31.212775 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.212739 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/633f22e8-63a8-478d-bd5f-44155c31cdc7-kserve-provision-location\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.212775 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:25:31.212778 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/633f22e8-63a8-478d-bd5f-44155c31cdc7-cabundle-cert\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.213050 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.212833 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/633f22e8-63a8-478d-bd5f-44155c31cdc7-isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.213050 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.212881 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpv2l\" (UniqueName: \"kubernetes.io/projected/633f22e8-63a8-478d-bd5f-44155c31cdc7-kube-api-access-bpv2l\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.213050 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.212914 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/633f22e8-63a8-478d-bd5f-44155c31cdc7-proxy-tls\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.314203 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.314154 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/633f22e8-63a8-478d-bd5f-44155c31cdc7-isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.314399 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.314214 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpv2l\" (UniqueName: \"kubernetes.io/projected/633f22e8-63a8-478d-bd5f-44155c31cdc7-kube-api-access-bpv2l\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.314399 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.314250 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/633f22e8-63a8-478d-bd5f-44155c31cdc7-proxy-tls\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.314399 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.314279 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/633f22e8-63a8-478d-bd5f-44155c31cdc7-kserve-provision-location\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.314399 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.314301 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/633f22e8-63a8-478d-bd5f-44155c31cdc7-cabundle-cert\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.314748 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.314720 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/633f22e8-63a8-478d-bd5f-44155c31cdc7-kserve-provision-location\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.314893 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.314874 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/633f22e8-63a8-478d-bd5f-44155c31cdc7-isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.314937 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.314920 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/633f22e8-63a8-478d-bd5f-44155c31cdc7-cabundle-cert\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.316867 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.316845 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/633f22e8-63a8-478d-bd5f-44155c31cdc7-proxy-tls\") pod 
\"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.323059 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.323032 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpv2l\" (UniqueName: \"kubernetes.io/projected/633f22e8-63a8-478d-bd5f-44155c31cdc7-kube-api-access-bpv2l\") pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.400125 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.400063 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:31.536260 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.536227 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85"] Apr 24 19:25:31.538270 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:25:31.538239 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod633f22e8_63a8_478d_bd5f_44155c31cdc7.slice/crio-fcfa2f4e4d3ddd0f91b48e2a4a5e13723dbdb181e34802b1529b8632ab7723c4 WatchSource:0}: Error finding container fcfa2f4e4d3ddd0f91b48e2a4a5e13723dbdb181e34802b1529b8632ab7723c4: Status 404 returned error can't find the container with id fcfa2f4e4d3ddd0f91b48e2a4a5e13723dbdb181e34802b1529b8632ab7723c4 Apr 24 19:25:31.909637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.909600 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" event={"ID":"633f22e8-63a8-478d-bd5f-44155c31cdc7","Type":"ContainerStarted","Data":"2f7fa3a3be80a058b83b412753dec276496c58a8fbf522292432fcc3d7d67ca2"} Apr 24 
19:25:31.909637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:31.909638 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" event={"ID":"633f22e8-63a8-478d-bd5f-44155c31cdc7","Type":"ContainerStarted","Data":"fcfa2f4e4d3ddd0f91b48e2a4a5e13723dbdb181e34802b1529b8632ab7723c4"} Apr 24 19:25:35.924537 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:35.924501 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b4ed0b-predictor-58497c968-9lk85_633f22e8-63a8-478d-bd5f-44155c31cdc7/storage-initializer/0.log" Apr 24 19:25:35.924933 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:35.924544 2568 generic.go:358] "Generic (PLEG): container finished" podID="633f22e8-63a8-478d-bd5f-44155c31cdc7" containerID="2f7fa3a3be80a058b83b412753dec276496c58a8fbf522292432fcc3d7d67ca2" exitCode=1 Apr 24 19:25:35.924933 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:35.924621 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" event={"ID":"633f22e8-63a8-478d-bd5f-44155c31cdc7","Type":"ContainerDied","Data":"2f7fa3a3be80a058b83b412753dec276496c58a8fbf522292432fcc3d7d67ca2"} Apr 24 19:25:36.929591 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:36.929560 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b4ed0b-predictor-58497c968-9lk85_633f22e8-63a8-478d-bd5f-44155c31cdc7/storage-initializer/0.log" Apr 24 19:25:36.929993 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:36.929682 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" event={"ID":"633f22e8-63a8-478d-bd5f-44155c31cdc7","Type":"ContainerStarted","Data":"27eae2d377f4fb1cb383193224c7310db23ff4c9f47e89e39cdebe760a387563"} Apr 24 19:25:42.951971 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:42.951943 2568 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b4ed0b-predictor-58497c968-9lk85_633f22e8-63a8-478d-bd5f-44155c31cdc7/storage-initializer/1.log" Apr 24 19:25:42.952404 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:42.952331 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b4ed0b-predictor-58497c968-9lk85_633f22e8-63a8-478d-bd5f-44155c31cdc7/storage-initializer/0.log" Apr 24 19:25:42.952404 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:42.952363 2568 generic.go:358] "Generic (PLEG): container finished" podID="633f22e8-63a8-478d-bd5f-44155c31cdc7" containerID="27eae2d377f4fb1cb383193224c7310db23ff4c9f47e89e39cdebe760a387563" exitCode=1 Apr 24 19:25:42.952404 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:42.952390 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" event={"ID":"633f22e8-63a8-478d-bd5f-44155c31cdc7","Type":"ContainerDied","Data":"27eae2d377f4fb1cb383193224c7310db23ff4c9f47e89e39cdebe760a387563"} Apr 24 19:25:42.952517 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:42.952418 2568 scope.go:117] "RemoveContainer" containerID="2f7fa3a3be80a058b83b412753dec276496c58a8fbf522292432fcc3d7d67ca2" Apr 24 19:25:42.952874 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:42.952848 2568 scope.go:117] "RemoveContainer" containerID="2f7fa3a3be80a058b83b412753dec276496c58a8fbf522292432fcc3d7d67ca2" Apr 24 19:25:42.963633 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:25:42.963603 2568 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-b4ed0b-predictor-58497c968-9lk85_kserve-ci-e2e-test_633f22e8-63a8-478d-bd5f-44155c31cdc7_0 in pod sandbox fcfa2f4e4d3ddd0f91b48e2a4a5e13723dbdb181e34802b1529b8632ab7723c4 from index: no such id: '2f7fa3a3be80a058b83b412753dec276496c58a8fbf522292432fcc3d7d67ca2'" 
containerID="2f7fa3a3be80a058b83b412753dec276496c58a8fbf522292432fcc3d7d67ca2" Apr 24 19:25:42.963754 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:25:42.963655 2568 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-b4ed0b-predictor-58497c968-9lk85_kserve-ci-e2e-test_633f22e8-63a8-478d-bd5f-44155c31cdc7_0 in pod sandbox fcfa2f4e4d3ddd0f91b48e2a4a5e13723dbdb181e34802b1529b8632ab7723c4 from index: no such id: '2f7fa3a3be80a058b83b412753dec276496c58a8fbf522292432fcc3d7d67ca2'; Skipping pod \"isvc-secondary-b4ed0b-predictor-58497c968-9lk85_kserve-ci-e2e-test(633f22e8-63a8-478d-bd5f-44155c31cdc7)\"" logger="UnhandledError" Apr 24 19:25:42.965301 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:25:42.965277 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-b4ed0b-predictor-58497c968-9lk85_kserve-ci-e2e-test(633f22e8-63a8-478d-bd5f-44155c31cdc7)\"" pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" podUID="633f22e8-63a8-478d-bd5f-44155c31cdc7" Apr 24 19:25:43.957347 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:43.957317 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b4ed0b-predictor-58497c968-9lk85_633f22e8-63a8-478d-bd5f-44155c31cdc7/storage-initializer/1.log" Apr 24 19:25:47.112152 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.112090 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85"] Apr 24 19:25:47.218866 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.218829 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"] 
Apr 24 19:25:47.219188 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.219160 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kserve-container" containerID="cri-o://8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1" gracePeriod=30 Apr 24 19:25:47.219308 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.219201 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kube-rbac-proxy" containerID="cri-o://094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8" gracePeriod=30 Apr 24 19:25:47.302308 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.302271 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x"] Apr 24 19:25:47.307425 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.307400 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.311394 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.311366 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-bf4c24-predictor-serving-cert\"" Apr 24 19:25:47.311394 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.311375 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config\"" Apr 24 19:25:47.311394 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.311378 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-bf4c24\"" Apr 24 19:25:47.311639 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.311406 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-bf4c24-dockercfg-cczs2\"" Apr 24 19:25:47.319069 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.319045 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x"] Apr 24 19:25:47.349063 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.349040 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b4ed0b-predictor-58497c968-9lk85_633f22e8-63a8-478d-bd5f-44155c31cdc7/storage-initializer/1.log" Apr 24 19:25:47.349233 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.349121 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:47.461696 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.461623 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/633f22e8-63a8-478d-bd5f-44155c31cdc7-proxy-tls\") pod \"633f22e8-63a8-478d-bd5f-44155c31cdc7\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " Apr 24 19:25:47.461848 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.461694 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/633f22e8-63a8-478d-bd5f-44155c31cdc7-kserve-provision-location\") pod \"633f22e8-63a8-478d-bd5f-44155c31cdc7\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " Apr 24 19:25:47.461848 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.461745 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/633f22e8-63a8-478d-bd5f-44155c31cdc7-cabundle-cert\") pod \"633f22e8-63a8-478d-bd5f-44155c31cdc7\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " Apr 24 19:25:47.461848 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.461828 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpv2l\" (UniqueName: \"kubernetes.io/projected/633f22e8-63a8-478d-bd5f-44155c31cdc7-kube-api-access-bpv2l\") pod \"633f22e8-63a8-478d-bd5f-44155c31cdc7\" (UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " Apr 24 19:25:47.462023 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.461882 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/633f22e8-63a8-478d-bd5f-44155c31cdc7-isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config\") pod \"633f22e8-63a8-478d-bd5f-44155c31cdc7\" 
(UID: \"633f22e8-63a8-478d-bd5f-44155c31cdc7\") " Apr 24 19:25:47.462023 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.462005 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/633f22e8-63a8-478d-bd5f-44155c31cdc7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "633f22e8-63a8-478d-bd5f-44155c31cdc7" (UID: "633f22e8-63a8-478d-bd5f-44155c31cdc7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:25:47.462175 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.462015 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-kserve-provision-location\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.462175 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.462145 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/633f22e8-63a8-478d-bd5f-44155c31cdc7-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "633f22e8-63a8-478d-bd5f-44155c31cdc7" (UID: "633f22e8-63a8-478d-bd5f-44155c31cdc7"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:25:47.462175 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.462154 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-proxy-tls\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.462309 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.462257 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-cabundle-cert\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.462352 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.462322 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/633f22e8-63a8-478d-bd5f-44155c31cdc7-isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config") pod "633f22e8-63a8-478d-bd5f-44155c31cdc7" (UID: "633f22e8-63a8-478d-bd5f-44155c31cdc7"). InnerVolumeSpecName "isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:25:47.462395 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.462380 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.462435 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.462401 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zqcn\" (UniqueName: \"kubernetes.io/projected/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-kube-api-access-8zqcn\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.462487 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.462475 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/633f22e8-63a8-478d-bd5f-44155c31cdc7-kserve-provision-location\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:47.462529 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.462490 2568 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/633f22e8-63a8-478d-bd5f-44155c31cdc7-cabundle-cert\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:47.462529 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.462502 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/633f22e8-63a8-478d-bd5f-44155c31cdc7-isvc-secondary-b4ed0b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:47.463797 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.463780 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/633f22e8-63a8-478d-bd5f-44155c31cdc7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "633f22e8-63a8-478d-bd5f-44155c31cdc7" (UID: "633f22e8-63a8-478d-bd5f-44155c31cdc7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:25:47.463907 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.463890 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/633f22e8-63a8-478d-bd5f-44155c31cdc7-kube-api-access-bpv2l" (OuterVolumeSpecName: "kube-api-access-bpv2l") pod "633f22e8-63a8-478d-bd5f-44155c31cdc7" (UID: "633f22e8-63a8-478d-bd5f-44155c31cdc7"). InnerVolumeSpecName "kube-api-access-bpv2l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:25:47.563253 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.563195 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-kserve-provision-location\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.563461 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.563290 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-proxy-tls\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.563461 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.563322 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-cabundle-cert\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.563461 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.563394 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.563461 ip-10-0-138-52 
kubenswrapper[2568]: I0424 19:25:47.563422 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zqcn\" (UniqueName: \"kubernetes.io/projected/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-kube-api-access-8zqcn\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.563685 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.563476 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/633f22e8-63a8-478d-bd5f-44155c31cdc7-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:47.563685 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.563494 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpv2l\" (UniqueName: \"kubernetes.io/projected/633f22e8-63a8-478d-bd5f-44155c31cdc7-kube-api-access-bpv2l\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:47.563685 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.563648 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-kserve-provision-location\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.564028 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.564007 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-cabundle-cert\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.564077 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.564046 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.565800 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.565780 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-proxy-tls\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.571574 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.571552 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zqcn\" (UniqueName: \"kubernetes.io/projected/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-kube-api-access-8zqcn\") pod \"isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.618308 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.618262 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:47.753396 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.753367 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x"] Apr 24 19:25:47.756028 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:25:47.756000 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe8bc7bd_935c_4317_9d0c_acdbc9d2f5de.slice/crio-3778c5c13e80468b5f35558def2f0ffb38246ca3ff28f081b2f18e3532756693 WatchSource:0}: Error finding container 3778c5c13e80468b5f35558def2f0ffb38246ca3ff28f081b2f18e3532756693: Status 404 returned error can't find the container with id 3778c5c13e80468b5f35558def2f0ffb38246ca3ff28f081b2f18e3532756693 Apr 24 19:25:47.981926 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.981881 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" event={"ID":"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de","Type":"ContainerStarted","Data":"4eb795a7d60dab929c1a57831c557e64d2bc87a2aba1b9a767e0f16ba0adeb3a"} Apr 24 19:25:47.981926 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.981929 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" event={"ID":"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de","Type":"ContainerStarted","Data":"3778c5c13e80468b5f35558def2f0ffb38246ca3ff28f081b2f18e3532756693"} Apr 24 19:25:47.983226 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.983206 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b4ed0b-predictor-58497c968-9lk85_633f22e8-63a8-478d-bd5f-44155c31cdc7/storage-initializer/1.log" Apr 24 19:25:47.983360 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.983334 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" event={"ID":"633f22e8-63a8-478d-bd5f-44155c31cdc7","Type":"ContainerDied","Data":"fcfa2f4e4d3ddd0f91b48e2a4a5e13723dbdb181e34802b1529b8632ab7723c4"} Apr 24 19:25:47.983422 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.983350 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85" Apr 24 19:25:47.983422 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.983382 2568 scope.go:117] "RemoveContainer" containerID="27eae2d377f4fb1cb383193224c7310db23ff4c9f47e89e39cdebe760a387563" Apr 24 19:25:47.985519 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.985487 2568 generic.go:358] "Generic (PLEG): container finished" podID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerID="094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8" exitCode=2 Apr 24 19:25:47.985635 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:47.985522 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" event={"ID":"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86","Type":"ContainerDied","Data":"094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8"} Apr 24 19:25:48.039116 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:48.039001 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85"] Apr 24 19:25:48.041868 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:48.041840 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b4ed0b-predictor-58497c968-9lk85"] Apr 24 19:25:49.460055 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:49.460019 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="633f22e8-63a8-478d-bd5f-44155c31cdc7" path="/var/lib/kubelet/pods/633f22e8-63a8-478d-bd5f-44155c31cdc7/volumes" Apr 24 19:25:49.655340 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:49.655292 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.47:8643/healthz\": dial tcp 10.132.0.47:8643: connect: connection refused" Apr 24 19:25:51.765177 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:51.765154 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" Apr 24 19:25:51.904196 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:51.904071 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-b4ed0b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-isvc-primary-b4ed0b-kube-rbac-proxy-sar-config\") pod \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " Apr 24 19:25:51.904196 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:51.904176 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkgm7\" (UniqueName: \"kubernetes.io/projected/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-kube-api-access-bkgm7\") pod \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " Apr 24 19:25:51.904414 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:51.904253 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-proxy-tls\") pod \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " Apr 24 19:25:51.904414 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:51.904283 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-kserve-provision-location\") pod \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\" (UID: \"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86\") " Apr 24 19:25:51.904611 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:51.904529 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-isvc-primary-b4ed0b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-b4ed0b-kube-rbac-proxy-sar-config") pod "a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" (UID: "a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86"). InnerVolumeSpecName "isvc-primary-b4ed0b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:25:51.904686 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:51.904659 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" (UID: "a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:25:51.906355 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:51.906329 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" (UID: "a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:25:51.906454 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:51.906369 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-kube-api-access-bkgm7" (OuterVolumeSpecName: "kube-api-access-bkgm7") pod "a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" (UID: "a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86"). InnerVolumeSpecName "kube-api-access-bkgm7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:25:52.002554 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.002524 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x_be8bc7bd-935c-4317-9d0c-acdbc9d2f5de/storage-initializer/0.log" Apr 24 19:25:52.002730 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.002563 2568 generic.go:358] "Generic (PLEG): container finished" podID="be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" containerID="4eb795a7d60dab929c1a57831c557e64d2bc87a2aba1b9a767e0f16ba0adeb3a" exitCode=1 Apr 24 19:25:52.002730 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.002641 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" event={"ID":"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de","Type":"ContainerDied","Data":"4eb795a7d60dab929c1a57831c557e64d2bc87a2aba1b9a767e0f16ba0adeb3a"} Apr 24 19:25:52.004406 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.004383 2568 generic.go:358] "Generic (PLEG): container finished" podID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerID="8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1" exitCode=0 Apr 24 19:25:52.004505 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.004449 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" 
event={"ID":"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86","Type":"ContainerDied","Data":"8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1"} Apr 24 19:25:52.004505 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.004457 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" Apr 24 19:25:52.004505 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.004472 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b" event={"ID":"a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86","Type":"ContainerDied","Data":"1940c777f1ac08b77df76025b18231250595d24b0c0d8aa8662b7c5de3634d68"} Apr 24 19:25:52.004505 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.004487 2568 scope.go:117] "RemoveContainer" containerID="094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8" Apr 24 19:25:52.004997 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.004974 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:52.005119 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.005000 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-kserve-provision-location\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:52.005119 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.005016 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-b4ed0b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-isvc-primary-b4ed0b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:52.005119 ip-10-0-138-52 kubenswrapper[2568]: 
I0424 19:25:52.005031 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bkgm7\" (UniqueName: \"kubernetes.io/projected/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86-kube-api-access-bkgm7\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:52.013296 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.013271 2568 scope.go:117] "RemoveContainer" containerID="8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1" Apr 24 19:25:52.021721 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.021347 2568 scope.go:117] "RemoveContainer" containerID="aa00b04ef1c893d35a7c05ab397fd4f48d983621585069187f61faa4709bc1fb" Apr 24 19:25:52.031285 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.031259 2568 scope.go:117] "RemoveContainer" containerID="094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8" Apr 24 19:25:52.031580 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:25:52.031555 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8\": container with ID starting with 094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8 not found: ID does not exist" containerID="094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8" Apr 24 19:25:52.031639 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.031591 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8"} err="failed to get container status \"094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8\": rpc error: code = NotFound desc = could not find container \"094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8\": container with ID starting with 094ad88e3ae844d0db6fbff8221d68f58deac3b267e2ed4ce8ed856937a8e5c8 not found: ID does not exist" Apr 24 19:25:52.031639 ip-10-0-138-52 
kubenswrapper[2568]: I0424 19:25:52.031610 2568 scope.go:117] "RemoveContainer" containerID="8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1" Apr 24 19:25:52.031908 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:25:52.031888 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1\": container with ID starting with 8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1 not found: ID does not exist" containerID="8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1" Apr 24 19:25:52.031992 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.031915 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1"} err="failed to get container status \"8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1\": rpc error: code = NotFound desc = could not find container \"8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1\": container with ID starting with 8290848e24de6d65781f0ee65b428befe65f35606bfb4a76ef18d3747d5afae1 not found: ID does not exist" Apr 24 19:25:52.031992 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.031930 2568 scope.go:117] "RemoveContainer" containerID="aa00b04ef1c893d35a7c05ab397fd4f48d983621585069187f61faa4709bc1fb" Apr 24 19:25:52.032353 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:25:52.032321 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa00b04ef1c893d35a7c05ab397fd4f48d983621585069187f61faa4709bc1fb\": container with ID starting with aa00b04ef1c893d35a7c05ab397fd4f48d983621585069187f61faa4709bc1fb not found: ID does not exist" containerID="aa00b04ef1c893d35a7c05ab397fd4f48d983621585069187f61faa4709bc1fb" Apr 24 19:25:52.032480 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:25:52.032363 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa00b04ef1c893d35a7c05ab397fd4f48d983621585069187f61faa4709bc1fb"} err="failed to get container status \"aa00b04ef1c893d35a7c05ab397fd4f48d983621585069187f61faa4709bc1fb\": rpc error: code = NotFound desc = could not find container \"aa00b04ef1c893d35a7c05ab397fd4f48d983621585069187f61faa4709bc1fb\": container with ID starting with aa00b04ef1c893d35a7c05ab397fd4f48d983621585069187f61faa4709bc1fb not found: ID does not exist" Apr 24 19:25:52.035345 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.035322 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"] Apr 24 19:25:52.040722 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.040699 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b4ed0b-predictor-5fdfb76759-zh94b"] Apr 24 19:25:52.226659 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.226575 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x"] Apr 24 19:25:52.347207 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.347158 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6"] Apr 24 19:25:52.347820 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.347796 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="storage-initializer" Apr 24 19:25:52.347820 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.347824 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="storage-initializer" Apr 24 19:25:52.347994 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.347852 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="633f22e8-63a8-478d-bd5f-44155c31cdc7" containerName="storage-initializer" Apr 24 19:25:52.347994 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.347863 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="633f22e8-63a8-478d-bd5f-44155c31cdc7" containerName="storage-initializer" Apr 24 19:25:52.347994 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.347885 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="633f22e8-63a8-478d-bd5f-44155c31cdc7" containerName="storage-initializer" Apr 24 19:25:52.347994 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.347895 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="633f22e8-63a8-478d-bd5f-44155c31cdc7" containerName="storage-initializer" Apr 24 19:25:52.347994 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.347904 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kserve-container" Apr 24 19:25:52.347994 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.347914 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kserve-container" Apr 24 19:25:52.347994 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.347930 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kube-rbac-proxy" Apr 24 19:25:52.347994 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.347939 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kube-rbac-proxy" Apr 24 19:25:52.348344 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.348052 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="633f22e8-63a8-478d-bd5f-44155c31cdc7" containerName="storage-initializer" Apr 24 19:25:52.348344 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.348065 2568 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kube-rbac-proxy" Apr 24 19:25:52.348344 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.348083 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" containerName="kserve-container" Apr 24 19:25:52.348344 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.348295 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="633f22e8-63a8-478d-bd5f-44155c31cdc7" containerName="storage-initializer" Apr 24 19:25:52.352318 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.352286 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:52.355454 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.355072 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-eaa35-predictor-serving-cert\"" Apr 24 19:25:52.355454 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.355219 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pjv9w\"" Apr 24 19:25:52.355454 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.355083 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-eaa35-kube-rbac-proxy-sar-config\"" Apr 24 19:25:52.359992 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.359963 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6"] Apr 24 19:25:52.409026 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.408990 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-eaa35-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/144f8023-7c54-4bff-bb03-2c1bf181a4c6-raw-sklearn-eaa35-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:52.409026 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.409036 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/144f8023-7c54-4bff-bb03-2c1bf181a4c6-kserve-provision-location\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:52.409329 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.409133 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/144f8023-7c54-4bff-bb03-2c1bf181a4c6-proxy-tls\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:52.409329 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.409181 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmfl6\" (UniqueName: \"kubernetes.io/projected/144f8023-7c54-4bff-bb03-2c1bf181a4c6-kube-api-access-tmfl6\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:52.510010 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.509899 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/144f8023-7c54-4bff-bb03-2c1bf181a4c6-kserve-provision-location\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:52.510010 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.510000 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/144f8023-7c54-4bff-bb03-2c1bf181a4c6-proxy-tls\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:52.510279 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.510034 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmfl6\" (UniqueName: \"kubernetes.io/projected/144f8023-7c54-4bff-bb03-2c1bf181a4c6-kube-api-access-tmfl6\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:52.510279 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.510097 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-eaa35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/144f8023-7c54-4bff-bb03-2c1bf181a4c6-raw-sklearn-eaa35-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:52.510279 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:25:52.510165 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-serving-cert: secret "raw-sklearn-eaa35-predictor-serving-cert" not found Apr 24 19:25:52.510279 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:25:52.510232 2568 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/144f8023-7c54-4bff-bb03-2c1bf181a4c6-proxy-tls podName:144f8023-7c54-4bff-bb03-2c1bf181a4c6 nodeName:}" failed. No retries permitted until 2026-04-24 19:25:53.010212105 +0000 UTC m=+1156.122048435 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/144f8023-7c54-4bff-bb03-2c1bf181a4c6-proxy-tls") pod "raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" (UID: "144f8023-7c54-4bff-bb03-2c1bf181a4c6") : secret "raw-sklearn-eaa35-predictor-serving-cert" not found Apr 24 19:25:52.510664 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.510644 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/144f8023-7c54-4bff-bb03-2c1bf181a4c6-kserve-provision-location\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:52.510922 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.510898 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-eaa35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/144f8023-7c54-4bff-bb03-2c1bf181a4c6-raw-sklearn-eaa35-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:52.522065 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:52.522028 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmfl6\" (UniqueName: \"kubernetes.io/projected/144f8023-7c54-4bff-bb03-2c1bf181a4c6-kube-api-access-tmfl6\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " 
pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:53.010036 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:53.010005 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x_be8bc7bd-935c-4317-9d0c-acdbc9d2f5de/storage-initializer/0.log" Apr 24 19:25:53.010506 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:53.010147 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" event={"ID":"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de","Type":"ContainerStarted","Data":"925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91"} Apr 24 19:25:53.010506 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:53.010326 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" podUID="be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" containerName="storage-initializer" containerID="cri-o://925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91" gracePeriod=30 Apr 24 19:25:53.014947 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:53.014921 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/144f8023-7c54-4bff-bb03-2c1bf181a4c6-proxy-tls\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:53.017292 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:53.017274 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/144f8023-7c54-4bff-bb03-2c1bf181a4c6-proxy-tls\") pod \"raw-sklearn-eaa35-predictor-5f65c47b68-v94x6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 
19:25:53.268703 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:53.268590 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:53.396552 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:53.396521 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6"] Apr 24 19:25:53.398254 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:25:53.398224 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod144f8023_7c54_4bff_bb03_2c1bf181a4c6.slice/crio-fa3e04a43173bfca7653b76a374daa043e728b9c6c2c98a6ee8187f51b479e3a WatchSource:0}: Error finding container fa3e04a43173bfca7653b76a374daa043e728b9c6c2c98a6ee8187f51b479e3a: Status 404 returned error can't find the container with id fa3e04a43173bfca7653b76a374daa043e728b9c6c2c98a6ee8187f51b479e3a Apr 24 19:25:53.459646 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:53.459617 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86" path="/var/lib/kubelet/pods/a60bdf07-5eab-4ed3-9e8a-3f0f42af0a86/volumes" Apr 24 19:25:54.016841 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:54.016803 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" event={"ID":"144f8023-7c54-4bff-bb03-2c1bf181a4c6","Type":"ContainerStarted","Data":"cce054963ea0c8dcf949bda4d0b8d8e6c089c88d812c0bcdd76abe70678b5481"} Apr 24 19:25:54.016841 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:54.016841 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" event={"ID":"144f8023-7c54-4bff-bb03-2c1bf181a4c6","Type":"ContainerStarted","Data":"fa3e04a43173bfca7653b76a374daa043e728b9c6c2c98a6ee8187f51b479e3a"} Apr 24 19:25:57.560218 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.560191 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x_be8bc7bd-935c-4317-9d0c-acdbc9d2f5de/storage-initializer/1.log" Apr 24 19:25:57.560615 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.560542 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x_be8bc7bd-935c-4317-9d0c-acdbc9d2f5de/storage-initializer/0.log" Apr 24 19:25:57.560615 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.560608 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:57.655878 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.655840 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-kserve-provision-location\") pod \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " Apr 24 19:25:57.656061 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.655929 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config\") pod \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " Apr 24 19:25:57.656061 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.655986 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-proxy-tls\") pod \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " Apr 24 19:25:57.656061 ip-10-0-138-52 
kubenswrapper[2568]: I0424 19:25:57.656041 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zqcn\" (UniqueName: \"kubernetes.io/projected/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-kube-api-access-8zqcn\") pod \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " Apr 24 19:25:57.656286 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.656070 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-cabundle-cert\") pod \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\" (UID: \"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de\") " Apr 24 19:25:57.656286 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.656173 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" (UID: "be8bc7bd-935c-4317-9d0c-acdbc9d2f5de"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:25:57.656397 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.656339 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config") pod "be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" (UID: "be8bc7bd-935c-4317-9d0c-acdbc9d2f5de"). InnerVolumeSpecName "isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:25:57.656469 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.656439 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-kserve-provision-location\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:57.656469 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.656458 2568 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-isvc-init-fail-bf4c24-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:57.656576 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.656537 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" (UID: "be8bc7bd-935c-4317-9d0c-acdbc9d2f5de"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:25:57.658430 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.658403 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-kube-api-access-8zqcn" (OuterVolumeSpecName: "kube-api-access-8zqcn") pod "be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" (UID: "be8bc7bd-935c-4317-9d0c-acdbc9d2f5de"). InnerVolumeSpecName "kube-api-access-8zqcn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:25:57.658562 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.658479 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" (UID: "be8bc7bd-935c-4317-9d0c-acdbc9d2f5de"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:25:57.757189 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.757152 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:57.757189 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.757186 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8zqcn\" (UniqueName: \"kubernetes.io/projected/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-kube-api-access-8zqcn\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:57.757189 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:57.757197 2568 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de-cabundle-cert\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:25:58.031446 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.031417 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x_be8bc7bd-935c-4317-9d0c-acdbc9d2f5de/storage-initializer/1.log" Apr 24 19:25:58.031836 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.031819 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x_be8bc7bd-935c-4317-9d0c-acdbc9d2f5de/storage-initializer/0.log" Apr 24 19:25:58.031899 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.031855 2568 generic.go:358] "Generic (PLEG): container finished" podID="be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" containerID="925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91" exitCode=1 Apr 24 19:25:58.031955 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.031928 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" event={"ID":"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de","Type":"ContainerDied","Data":"925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91"} Apr 24 19:25:58.031999 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.031974 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" event={"ID":"be8bc7bd-935c-4317-9d0c-acdbc9d2f5de","Type":"ContainerDied","Data":"3778c5c13e80468b5f35558def2f0ffb38246ca3ff28f081b2f18e3532756693"} Apr 24 19:25:58.031999 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.031976 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x" Apr 24 19:25:58.031999 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.031991 2568 scope.go:117] "RemoveContainer" containerID="925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91" Apr 24 19:25:58.033542 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.033517 2568 generic.go:358] "Generic (PLEG): container finished" podID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerID="cce054963ea0c8dcf949bda4d0b8d8e6c089c88d812c0bcdd76abe70678b5481" exitCode=0 Apr 24 19:25:58.033632 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.033554 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" event={"ID":"144f8023-7c54-4bff-bb03-2c1bf181a4c6","Type":"ContainerDied","Data":"cce054963ea0c8dcf949bda4d0b8d8e6c089c88d812c0bcdd76abe70678b5481"} Apr 24 19:25:58.041896 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.041875 2568 scope.go:117] "RemoveContainer" containerID="4eb795a7d60dab929c1a57831c557e64d2bc87a2aba1b9a767e0f16ba0adeb3a" Apr 24 19:25:58.050637 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.050614 2568 scope.go:117] "RemoveContainer" containerID="925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91" Apr 24 19:25:58.050900 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:25:58.050877 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91\": container with ID starting with 925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91 not found: ID does not exist" containerID="925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91" Apr 24 19:25:58.050970 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.050907 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91"} err="failed to get container status \"925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91\": rpc error: code = NotFound desc = could not find container \"925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91\": container with ID starting with 925c37b4a60521f36dca42c4f97e8a6bea541105b4dcec6a8495797324833f91 not found: ID does not exist" Apr 24 19:25:58.050970 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.050925 2568 scope.go:117] "RemoveContainer" containerID="4eb795a7d60dab929c1a57831c557e64d2bc87a2aba1b9a767e0f16ba0adeb3a" Apr 24 19:25:58.051204 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:25:58.051186 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eb795a7d60dab929c1a57831c557e64d2bc87a2aba1b9a767e0f16ba0adeb3a\": container with ID starting with 4eb795a7d60dab929c1a57831c557e64d2bc87a2aba1b9a767e0f16ba0adeb3a not found: ID does not exist" containerID="4eb795a7d60dab929c1a57831c557e64d2bc87a2aba1b9a767e0f16ba0adeb3a" Apr 24 19:25:58.051260 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.051210 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eb795a7d60dab929c1a57831c557e64d2bc87a2aba1b9a767e0f16ba0adeb3a"} err="failed to get container status \"4eb795a7d60dab929c1a57831c557e64d2bc87a2aba1b9a767e0f16ba0adeb3a\": rpc error: code = NotFound desc = could not find container \"4eb795a7d60dab929c1a57831c557e64d2bc87a2aba1b9a767e0f16ba0adeb3a\": container with ID starting with 4eb795a7d60dab929c1a57831c557e64d2bc87a2aba1b9a767e0f16ba0adeb3a not found: ID does not exist" Apr 24 19:25:58.085884 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.085846 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x"] Apr 24 19:25:58.091633 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:58.091603 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-bf4c24-predictor-7d468db94-rhw7x"] Apr 24 19:25:59.039723 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:59.039685 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" event={"ID":"144f8023-7c54-4bff-bb03-2c1bf181a4c6","Type":"ContainerStarted","Data":"7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e"} Apr 24 19:25:59.040373 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:59.039737 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" event={"ID":"144f8023-7c54-4bff-bb03-2c1bf181a4c6","Type":"ContainerStarted","Data":"ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba"} Apr 24 19:25:59.040373 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:59.040000 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:25:59.060954 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:59.060904 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podStartSLOduration=7.060890401 podStartE2EDuration="7.060890401s" podCreationTimestamp="2026-04-24 19:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:25:59.0602947 +0000 UTC m=+1162.172131059" watchObservedRunningTime="2026-04-24 19:25:59.060890401 +0000 UTC m=+1162.172726747" Apr 24 19:25:59.459616 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:25:59.459575 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" 
path="/var/lib/kubelet/pods/be8bc7bd-935c-4317-9d0c-acdbc9d2f5de/volumes" Apr 24 19:26:00.044629 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:00.044593 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:26:00.045962 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:00.045930 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:26:01.048812 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:01.048774 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:26:06.052861 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:06.052827 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:26:06.053458 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:06.053428 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:26:16.054138 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:16.054070 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: 
connection refused" Apr 24 19:26:26.053727 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:26.053683 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:26:36.053562 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:36.053519 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:26:37.444490 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:37.444454 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/1.log" Apr 24 19:26:37.448506 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:37.448478 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/1.log" Apr 24 19:26:37.449010 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:37.448992 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log" Apr 24 19:26:37.452460 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:37.452441 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log" Apr 24 19:26:46.053556 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:46.053511 2568 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:26:56.053368 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:26:56.053329 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:27:06.054460 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:06.054430 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:27:12.470261 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.470227 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6"] Apr 24 19:27:12.470741 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.470549 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" containerID="cri-o://ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba" gracePeriod=30 Apr 24 19:27:12.470741 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.470584 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kube-rbac-proxy" containerID="cri-o://7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e" gracePeriod=30 Apr 24 19:27:12.570027 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.569987 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8"] Apr 24 19:27:12.570413 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.570398 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" containerName="storage-initializer" Apr 24 19:27:12.570413 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.570414 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" containerName="storage-initializer" Apr 24 19:27:12.570556 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.570516 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" containerName="storage-initializer" Apr 24 19:27:12.570556 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.570527 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" containerName="storage-initializer" Apr 24 19:27:12.570623 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.570596 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" containerName="storage-initializer" Apr 24 19:27:12.570623 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.570602 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="be8bc7bd-935c-4317-9d0c-acdbc9d2f5de" containerName="storage-initializer" Apr 24 19:27:12.574009 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.573989 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.576696 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.576672 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-63372-predictor-serving-cert\"" Apr 24 19:27:12.576827 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.576677 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config\"" Apr 24 19:27:12.584942 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.584916 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8"] Apr 24 19:27:12.753123 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.753011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bb62ecb-cff8-41d5-aafb-38182389d863-kserve-provision-location\") pod \"raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.753288 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.753130 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4bb62ecb-cff8-41d5-aafb-38182389d863-raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.753288 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.753187 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn7fc\" (UniqueName: \"kubernetes.io/projected/4bb62ecb-cff8-41d5-aafb-38182389d863-kube-api-access-zn7fc\") pod \"raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.753288 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.753227 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4bb62ecb-cff8-41d5-aafb-38182389d863-proxy-tls\") pod \"raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.854555 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.854518 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4bb62ecb-cff8-41d5-aafb-38182389d863-raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.854734 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.854567 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zn7fc\" (UniqueName: \"kubernetes.io/projected/4bb62ecb-cff8-41d5-aafb-38182389d863-kube-api-access-zn7fc\") pod \"raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.854734 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.854597 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4bb62ecb-cff8-41d5-aafb-38182389d863-proxy-tls\") pod \"raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.854734 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.854634 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bb62ecb-cff8-41d5-aafb-38182389d863-kserve-provision-location\") pod \"raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.855164 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.855139 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bb62ecb-cff8-41d5-aafb-38182389d863-kserve-provision-location\") pod \"raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.855395 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.855365 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4bb62ecb-cff8-41d5-aafb-38182389d863-raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.857151 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.857129 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4bb62ecb-cff8-41d5-aafb-38182389d863-proxy-tls\") pod \"raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.863853 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.863831 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn7fc\" (UniqueName: \"kubernetes.io/projected/4bb62ecb-cff8-41d5-aafb-38182389d863-kube-api-access-zn7fc\") pod \"raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:12.886839 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:12.886800 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:13.045388 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:13.045298 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8"] Apr 24 19:27:13.049405 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:27:13.049372 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bb62ecb_cff8_41d5_aafb_38182389d863.slice/crio-7b6705aa8027d30a7b8322f377a0fe573f42e8a63c56f4498b68bb839b9a60b2 WatchSource:0}: Error finding container 7b6705aa8027d30a7b8322f377a0fe573f42e8a63c56f4498b68bb839b9a60b2: Status 404 returned error can't find the container with id 7b6705aa8027d30a7b8322f377a0fe573f42e8a63c56f4498b68bb839b9a60b2 Apr 24 19:27:13.321603 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:13.321494 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" 
event={"ID":"4bb62ecb-cff8-41d5-aafb-38182389d863","Type":"ContainerStarted","Data":"358f1d579b6d9c87f808147d9f97536de78d96a9c7cb6bc37e9aca29b9f75950"} Apr 24 19:27:13.321603 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:13.321542 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" event={"ID":"4bb62ecb-cff8-41d5-aafb-38182389d863","Type":"ContainerStarted","Data":"7b6705aa8027d30a7b8322f377a0fe573f42e8a63c56f4498b68bb839b9a60b2"} Apr 24 19:27:13.323495 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:13.323462 2568 generic.go:358] "Generic (PLEG): container finished" podID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerID="7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e" exitCode=2 Apr 24 19:27:13.323626 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:13.323533 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" event={"ID":"144f8023-7c54-4bff-bb03-2c1bf181a4c6","Type":"ContainerDied","Data":"7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e"} Apr 24 19:27:16.049429 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:16.049385 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.50:8643/healthz\": dial tcp 10.132.0.50:8643: connect: connection refused" Apr 24 19:27:16.054023 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:16.053994 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:27:17.027346 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:27:17.027320 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:27:17.099310 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.099282 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-eaa35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/144f8023-7c54-4bff-bb03-2c1bf181a4c6-raw-sklearn-eaa35-kube-rbac-proxy-sar-config\") pod \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " Apr 24 19:27:17.099879 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.099341 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/144f8023-7c54-4bff-bb03-2c1bf181a4c6-proxy-tls\") pod \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " Apr 24 19:27:17.099879 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.099381 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmfl6\" (UniqueName: \"kubernetes.io/projected/144f8023-7c54-4bff-bb03-2c1bf181a4c6-kube-api-access-tmfl6\") pod \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " Apr 24 19:27:17.099879 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.099469 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/144f8023-7c54-4bff-bb03-2c1bf181a4c6-kserve-provision-location\") pod \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\" (UID: \"144f8023-7c54-4bff-bb03-2c1bf181a4c6\") " Apr 24 19:27:17.099879 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.099739 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/144f8023-7c54-4bff-bb03-2c1bf181a4c6-raw-sklearn-eaa35-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-eaa35-kube-rbac-proxy-sar-config") pod "144f8023-7c54-4bff-bb03-2c1bf181a4c6" (UID: "144f8023-7c54-4bff-bb03-2c1bf181a4c6"). InnerVolumeSpecName "raw-sklearn-eaa35-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:27:17.099879 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.099775 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/144f8023-7c54-4bff-bb03-2c1bf181a4c6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "144f8023-7c54-4bff-bb03-2c1bf181a4c6" (UID: "144f8023-7c54-4bff-bb03-2c1bf181a4c6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:27:17.101449 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.101429 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144f8023-7c54-4bff-bb03-2c1bf181a4c6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "144f8023-7c54-4bff-bb03-2c1bf181a4c6" (UID: "144f8023-7c54-4bff-bb03-2c1bf181a4c6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:27:17.101505 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.101449 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/144f8023-7c54-4bff-bb03-2c1bf181a4c6-kube-api-access-tmfl6" (OuterVolumeSpecName: "kube-api-access-tmfl6") pod "144f8023-7c54-4bff-bb03-2c1bf181a4c6" (UID: "144f8023-7c54-4bff-bb03-2c1bf181a4c6"). InnerVolumeSpecName "kube-api-access-tmfl6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:27:17.200354 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.200318 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/144f8023-7c54-4bff-bb03-2c1bf181a4c6-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:27:17.200354 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.200348 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tmfl6\" (UniqueName: \"kubernetes.io/projected/144f8023-7c54-4bff-bb03-2c1bf181a4c6-kube-api-access-tmfl6\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:27:17.200354 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.200358 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/144f8023-7c54-4bff-bb03-2c1bf181a4c6-kserve-provision-location\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:27:17.200600 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.200369 2568 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-eaa35-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/144f8023-7c54-4bff-bb03-2c1bf181a4c6-raw-sklearn-eaa35-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:27:17.342874 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.342840 2568 generic.go:358] "Generic (PLEG): container finished" podID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerID="ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba" exitCode=0 Apr 24 19:27:17.343075 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.342918 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" Apr 24 19:27:17.343075 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.342918 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" event={"ID":"144f8023-7c54-4bff-bb03-2c1bf181a4c6","Type":"ContainerDied","Data":"ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba"} Apr 24 19:27:17.343075 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.342960 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6" event={"ID":"144f8023-7c54-4bff-bb03-2c1bf181a4c6","Type":"ContainerDied","Data":"fa3e04a43173bfca7653b76a374daa043e728b9c6c2c98a6ee8187f51b479e3a"} Apr 24 19:27:17.343075 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.342983 2568 scope.go:117] "RemoveContainer" containerID="7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e" Apr 24 19:27:17.344382 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.344363 2568 generic.go:358] "Generic (PLEG): container finished" podID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerID="358f1d579b6d9c87f808147d9f97536de78d96a9c7cb6bc37e9aca29b9f75950" exitCode=0 Apr 24 19:27:17.344475 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.344436 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" event={"ID":"4bb62ecb-cff8-41d5-aafb-38182389d863","Type":"ContainerDied","Data":"358f1d579b6d9c87f808147d9f97536de78d96a9c7cb6bc37e9aca29b9f75950"} Apr 24 19:27:17.352671 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.352650 2568 scope.go:117] "RemoveContainer" containerID="ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba" Apr 24 19:27:17.360304 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.360284 2568 scope.go:117] "RemoveContainer" 
containerID="cce054963ea0c8dcf949bda4d0b8d8e6c089c88d812c0bcdd76abe70678b5481" Apr 24 19:27:17.369539 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.369513 2568 scope.go:117] "RemoveContainer" containerID="7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e" Apr 24 19:27:17.369871 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:27:17.369845 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e\": container with ID starting with 7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e not found: ID does not exist" containerID="7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e" Apr 24 19:27:17.369961 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.369880 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e"} err="failed to get container status \"7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e\": rpc error: code = NotFound desc = could not find container \"7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e\": container with ID starting with 7bb0d8fc3afa7afeeaded0933c0293b48182017d022eab440ae641765daf936e not found: ID does not exist" Apr 24 19:27:17.369961 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.369901 2568 scope.go:117] "RemoveContainer" containerID="ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba" Apr 24 19:27:17.370225 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:27:17.370201 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba\": container with ID starting with ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba not found: ID does not exist" 
containerID="ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba" Apr 24 19:27:17.370317 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.370240 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba"} err="failed to get container status \"ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba\": rpc error: code = NotFound desc = could not find container \"ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba\": container with ID starting with ae3cfa12759c714c551669037d9de07d962d0fd2a18e41b94665012eb1174bba not found: ID does not exist" Apr 24 19:27:17.370317 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.370265 2568 scope.go:117] "RemoveContainer" containerID="cce054963ea0c8dcf949bda4d0b8d8e6c089c88d812c0bcdd76abe70678b5481" Apr 24 19:27:17.370542 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:27:17.370525 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cce054963ea0c8dcf949bda4d0b8d8e6c089c88d812c0bcdd76abe70678b5481\": container with ID starting with cce054963ea0c8dcf949bda4d0b8d8e6c089c88d812c0bcdd76abe70678b5481 not found: ID does not exist" containerID="cce054963ea0c8dcf949bda4d0b8d8e6c089c88d812c0bcdd76abe70678b5481" Apr 24 19:27:17.370586 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.370549 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cce054963ea0c8dcf949bda4d0b8d8e6c089c88d812c0bcdd76abe70678b5481"} err="failed to get container status \"cce054963ea0c8dcf949bda4d0b8d8e6c089c88d812c0bcdd76abe70678b5481\": rpc error: code = NotFound desc = could not find container \"cce054963ea0c8dcf949bda4d0b8d8e6c089c88d812c0bcdd76abe70678b5481\": container with ID starting with cce054963ea0c8dcf949bda4d0b8d8e6c089c88d812c0bcdd76abe70678b5481 not found: ID does not exist" Apr 24 
19:27:17.383078 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.383049 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6"] Apr 24 19:27:17.386532 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.386499 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-eaa35-predictor-5f65c47b68-v94x6"] Apr 24 19:27:17.460369 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:17.460339 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" path="/var/lib/kubelet/pods/144f8023-7c54-4bff-bb03-2c1bf181a4c6/volumes" Apr 24 19:27:18.350560 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:18.350521 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" event={"ID":"4bb62ecb-cff8-41d5-aafb-38182389d863","Type":"ContainerStarted","Data":"8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab"} Apr 24 19:27:18.350560 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:18.350567 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" event={"ID":"4bb62ecb-cff8-41d5-aafb-38182389d863","Type":"ContainerStarted","Data":"66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4"} Apr 24 19:27:18.351115 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:18.350789 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:18.370598 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:18.370539 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" podStartSLOduration=6.3705214980000004 podStartE2EDuration="6.370521498s" podCreationTimestamp="2026-04-24 19:27:12 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:27:18.368774294 +0000 UTC m=+1241.480610650" watchObservedRunningTime="2026-04-24 19:27:18.370521498 +0000 UTC m=+1241.482357845" Apr 24 19:27:19.354684 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:19.354644 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:19.356143 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:19.356087 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 24 19:27:20.357746 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:20.357701 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 24 19:27:25.362130 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:25.362020 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:27:25.364547 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:25.362646 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 24 19:27:35.362732 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:35.362685 2568 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 24 19:27:45.363271 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:45.363218 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 24 19:27:55.362583 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:27:55.362541 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 24 19:28:05.362975 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:05.362928 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 24 19:28:15.363559 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:15.363510 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 24 19:28:25.364003 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:25.363971 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:28:32.649848 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:32.649809 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8"] Apr 24 19:28:32.650386 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:32.650146 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" containerID="cri-o://66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4" gracePeriod=30 Apr 24 19:28:32.650386 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:32.650205 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kube-rbac-proxy" containerID="cri-o://8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab" gracePeriod=30 Apr 24 19:28:33.614887 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:33.614845 2568 generic.go:358] "Generic (PLEG): container finished" podID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerID="8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab" exitCode=2 Apr 24 19:28:33.615054 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:33.614910 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" event={"ID":"4bb62ecb-cff8-41d5-aafb-38182389d863","Type":"ContainerDied","Data":"8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab"} Apr 24 19:28:35.358519 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:35.358476 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" 
podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.51:8643/healthz\": dial tcp 10.132.0.51:8643: connect: connection refused" Apr 24 19:28:35.362756 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:35.362728 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 24 19:28:37.103593 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.103566 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:28:37.235180 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.235046 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4bb62ecb-cff8-41d5-aafb-38182389d863-raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config\") pod \"4bb62ecb-cff8-41d5-aafb-38182389d863\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " Apr 24 19:28:37.235180 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.235128 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bb62ecb-cff8-41d5-aafb-38182389d863-kserve-provision-location\") pod \"4bb62ecb-cff8-41d5-aafb-38182389d863\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " Apr 24 19:28:37.235426 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.235196 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4bb62ecb-cff8-41d5-aafb-38182389d863-proxy-tls\") pod \"4bb62ecb-cff8-41d5-aafb-38182389d863\" (UID: 
\"4bb62ecb-cff8-41d5-aafb-38182389d863\") " Apr 24 19:28:37.235426 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.235234 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn7fc\" (UniqueName: \"kubernetes.io/projected/4bb62ecb-cff8-41d5-aafb-38182389d863-kube-api-access-zn7fc\") pod \"4bb62ecb-cff8-41d5-aafb-38182389d863\" (UID: \"4bb62ecb-cff8-41d5-aafb-38182389d863\") " Apr 24 19:28:37.235531 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.235445 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb62ecb-cff8-41d5-aafb-38182389d863-raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config") pod "4bb62ecb-cff8-41d5-aafb-38182389d863" (UID: "4bb62ecb-cff8-41d5-aafb-38182389d863"). InnerVolumeSpecName "raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:28:37.235531 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.235449 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bb62ecb-cff8-41d5-aafb-38182389d863-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4bb62ecb-cff8-41d5-aafb-38182389d863" (UID: "4bb62ecb-cff8-41d5-aafb-38182389d863"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:28:37.235625 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.235612 2568 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4bb62ecb-cff8-41d5-aafb-38182389d863-raw-sklearn-runtime-63372-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:28:37.235664 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.235629 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bb62ecb-cff8-41d5-aafb-38182389d863-kserve-provision-location\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:28:37.237406 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.237372 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb62ecb-cff8-41d5-aafb-38182389d863-kube-api-access-zn7fc" (OuterVolumeSpecName: "kube-api-access-zn7fc") pod "4bb62ecb-cff8-41d5-aafb-38182389d863" (UID: "4bb62ecb-cff8-41d5-aafb-38182389d863"). InnerVolumeSpecName "kube-api-access-zn7fc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:28:37.237515 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.237434 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb62ecb-cff8-41d5-aafb-38182389d863-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4bb62ecb-cff8-41d5-aafb-38182389d863" (UID: "4bb62ecb-cff8-41d5-aafb-38182389d863"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:28:37.337049 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.337008 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4bb62ecb-cff8-41d5-aafb-38182389d863-proxy-tls\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:28:37.337049 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.337043 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zn7fc\" (UniqueName: \"kubernetes.io/projected/4bb62ecb-cff8-41d5-aafb-38182389d863-kube-api-access-zn7fc\") on node \"ip-10-0-138-52.ec2.internal\" DevicePath \"\"" Apr 24 19:28:37.630314 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.630278 2568 generic.go:358] "Generic (PLEG): container finished" podID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerID="66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4" exitCode=0 Apr 24 19:28:37.630482 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.630331 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" event={"ID":"4bb62ecb-cff8-41d5-aafb-38182389d863","Type":"ContainerDied","Data":"66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4"} Apr 24 19:28:37.630482 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.630360 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" event={"ID":"4bb62ecb-cff8-41d5-aafb-38182389d863","Type":"ContainerDied","Data":"7b6705aa8027d30a7b8322f377a0fe573f42e8a63c56f4498b68bb839b9a60b2"} Apr 24 19:28:37.630482 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.630376 2568 scope.go:117] "RemoveContainer" containerID="8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab" Apr 24 19:28:37.630482 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.630379 2568 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8" Apr 24 19:28:37.639482 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.639461 2568 scope.go:117] "RemoveContainer" containerID="66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4" Apr 24 19:28:37.648914 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.648890 2568 scope.go:117] "RemoveContainer" containerID="358f1d579b6d9c87f808147d9f97536de78d96a9c7cb6bc37e9aca29b9f75950" Apr 24 19:28:37.650520 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.650497 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8"] Apr 24 19:28:37.656384 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.656357 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-63372-predictor-5c59d6df78-zvnw8"] Apr 24 19:28:37.657337 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.657313 2568 scope.go:117] "RemoveContainer" containerID="8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab" Apr 24 19:28:37.657618 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:28:37.657599 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab\": container with ID starting with 8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab not found: ID does not exist" containerID="8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab" Apr 24 19:28:37.657687 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.657629 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab"} err="failed to get container status \"8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab\": rpc error: code = NotFound 
desc = could not find container \"8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab\": container with ID starting with 8525bddeebd09479c022f43f3517320828e206e48e38839fad17b7eda587afab not found: ID does not exist" Apr 24 19:28:37.657687 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.657647 2568 scope.go:117] "RemoveContainer" containerID="66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4" Apr 24 19:28:37.657877 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:28:37.657861 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4\": container with ID starting with 66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4 not found: ID does not exist" containerID="66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4" Apr 24 19:28:37.657929 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.657883 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4"} err="failed to get container status \"66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4\": rpc error: code = NotFound desc = could not find container \"66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4\": container with ID starting with 66fef5c6637cc6d1771103c99a31e12153884358930741f770c86e21cba648a4 not found: ID does not exist" Apr 24 19:28:37.657929 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.657897 2568 scope.go:117] "RemoveContainer" containerID="358f1d579b6d9c87f808147d9f97536de78d96a9c7cb6bc37e9aca29b9f75950" Apr 24 19:28:37.658093 ip-10-0-138-52 kubenswrapper[2568]: E0424 19:28:37.658078 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"358f1d579b6d9c87f808147d9f97536de78d96a9c7cb6bc37e9aca29b9f75950\": 
container with ID starting with 358f1d579b6d9c87f808147d9f97536de78d96a9c7cb6bc37e9aca29b9f75950 not found: ID does not exist" containerID="358f1d579b6d9c87f808147d9f97536de78d96a9c7cb6bc37e9aca29b9f75950" Apr 24 19:28:37.658184 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:37.658121 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"358f1d579b6d9c87f808147d9f97536de78d96a9c7cb6bc37e9aca29b9f75950"} err="failed to get container status \"358f1d579b6d9c87f808147d9f97536de78d96a9c7cb6bc37e9aca29b9f75950\": rpc error: code = NotFound desc = could not find container \"358f1d579b6d9c87f808147d9f97536de78d96a9c7cb6bc37e9aca29b9f75950\": container with ID starting with 358f1d579b6d9c87f808147d9f97536de78d96a9c7cb6bc37e9aca29b9f75950 not found: ID does not exist" Apr 24 19:28:39.459456 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:39.459423 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" path="/var/lib/kubelet/pods/4bb62ecb-cff8-41d5-aafb-38182389d863/volumes" Apr 24 19:28:58.266670 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.266588 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jzlrb/must-gather-4kpdn"] Apr 24 19:28:58.267147 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267026 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kube-rbac-proxy" Apr 24 19:28:58.267147 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267037 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kube-rbac-proxy" Apr 24 19:28:58.267147 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267047 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="storage-initializer" Apr 24 19:28:58.267147 ip-10-0-138-52 
kubenswrapper[2568]: I0424 19:28:58.267052 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="storage-initializer" Apr 24 19:28:58.267147 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267067 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="storage-initializer" Apr 24 19:28:58.267147 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267073 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="storage-initializer" Apr 24 19:28:58.267147 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267083 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kube-rbac-proxy" Apr 24 19:28:58.267147 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267088 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kube-rbac-proxy" Apr 24 19:28:58.267147 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267095 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" Apr 24 19:28:58.267147 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267116 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" Apr 24 19:28:58.267147 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267126 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" Apr 24 19:28:58.267147 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267134 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" Apr 24 19:28:58.267543 
ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267209 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kube-rbac-proxy" Apr 24 19:28:58.267543 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267219 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bb62ecb-cff8-41d5-aafb-38182389d863" containerName="kserve-container" Apr 24 19:28:58.267543 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267228 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kube-rbac-proxy" Apr 24 19:28:58.267543 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.267236 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="144f8023-7c54-4bff-bb03-2c1bf181a4c6" containerName="kserve-container" Apr 24 19:28:58.272069 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.272046 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jzlrb/must-gather-4kpdn" Apr 24 19:28:58.275134 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.275083 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jzlrb\"/\"kube-root-ca.crt\"" Apr 24 19:28:58.276171 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.276145 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jzlrb\"/\"default-dockercfg-df8sl\"" Apr 24 19:28:58.276642 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.276478 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jzlrb\"/\"openshift-service-ca.crt\"" Apr 24 19:28:58.276963 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.276939 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jzlrb/must-gather-4kpdn"] Apr 24 19:28:58.321487 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:28:58.321439 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/36a3378a-8f1d-4197-b028-2672e3484f9e-must-gather-output\") pod \"must-gather-4kpdn\" (UID: \"36a3378a-8f1d-4197-b028-2672e3484f9e\") " pod="openshift-must-gather-jzlrb/must-gather-4kpdn" Apr 24 19:28:58.321692 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.321592 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j86z6\" (UniqueName: \"kubernetes.io/projected/36a3378a-8f1d-4197-b028-2672e3484f9e-kube-api-access-j86z6\") pod \"must-gather-4kpdn\" (UID: \"36a3378a-8f1d-4197-b028-2672e3484f9e\") " pod="openshift-must-gather-jzlrb/must-gather-4kpdn" Apr 24 19:28:58.422429 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.422387 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j86z6\" (UniqueName: \"kubernetes.io/projected/36a3378a-8f1d-4197-b028-2672e3484f9e-kube-api-access-j86z6\") pod \"must-gather-4kpdn\" (UID: \"36a3378a-8f1d-4197-b028-2672e3484f9e\") " pod="openshift-must-gather-jzlrb/must-gather-4kpdn" Apr 24 19:28:58.422623 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.422447 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/36a3378a-8f1d-4197-b028-2672e3484f9e-must-gather-output\") pod \"must-gather-4kpdn\" (UID: \"36a3378a-8f1d-4197-b028-2672e3484f9e\") " pod="openshift-must-gather-jzlrb/must-gather-4kpdn" Apr 24 19:28:58.422741 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.422726 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/36a3378a-8f1d-4197-b028-2672e3484f9e-must-gather-output\") pod \"must-gather-4kpdn\" (UID: \"36a3378a-8f1d-4197-b028-2672e3484f9e\") " 
pod="openshift-must-gather-jzlrb/must-gather-4kpdn" Apr 24 19:28:58.431753 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.431729 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j86z6\" (UniqueName: \"kubernetes.io/projected/36a3378a-8f1d-4197-b028-2672e3484f9e-kube-api-access-j86z6\") pod \"must-gather-4kpdn\" (UID: \"36a3378a-8f1d-4197-b028-2672e3484f9e\") " pod="openshift-must-gather-jzlrb/must-gather-4kpdn" Apr 24 19:28:58.596470 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.596413 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jzlrb/must-gather-4kpdn" Apr 24 19:28:58.722601 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:58.722433 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jzlrb/must-gather-4kpdn"] Apr 24 19:28:58.725497 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:28:58.725469 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36a3378a_8f1d_4197_b028_2672e3484f9e.slice/crio-c6cc6094fd23b1daf2bcd99f5df4ee89e28186a63a6dd4be8d80b742442b8a92 WatchSource:0}: Error finding container c6cc6094fd23b1daf2bcd99f5df4ee89e28186a63a6dd4be8d80b742442b8a92: Status 404 returned error can't find the container with id c6cc6094fd23b1daf2bcd99f5df4ee89e28186a63a6dd4be8d80b742442b8a92 Apr 24 19:28:59.715135 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:59.715067 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jzlrb/must-gather-4kpdn" event={"ID":"36a3378a-8f1d-4197-b028-2672e3484f9e","Type":"ContainerStarted","Data":"5e2d8a879ae7721ff5902d1f80102073ab1f69d82794632a37f00008792b5377"} Apr 24 19:28:59.715135 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:28:59.715128 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jzlrb/must-gather-4kpdn" 
event={"ID":"36a3378a-8f1d-4197-b028-2672e3484f9e","Type":"ContainerStarted","Data":"c6cc6094fd23b1daf2bcd99f5df4ee89e28186a63a6dd4be8d80b742442b8a92"} Apr 24 19:29:00.720464 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:00.720426 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jzlrb/must-gather-4kpdn" event={"ID":"36a3378a-8f1d-4197-b028-2672e3484f9e","Type":"ContainerStarted","Data":"79b0a9d2418220c99a0837a67411bb1824b0115e6015dc663322468c62233407"} Apr 24 19:29:00.738749 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:00.738685 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jzlrb/must-gather-4kpdn" podStartSLOduration=1.921155618 podStartE2EDuration="2.738666641s" podCreationTimestamp="2026-04-24 19:28:58 +0000 UTC" firstStartedPulling="2026-04-24 19:28:58.727272829 +0000 UTC m=+1341.839109156" lastFinishedPulling="2026-04-24 19:28:59.544783851 +0000 UTC m=+1342.656620179" observedRunningTime="2026-04-24 19:29:00.736213286 +0000 UTC m=+1343.848049633" watchObservedRunningTime="2026-04-24 19:29:00.738666641 +0000 UTC m=+1343.850502989" Apr 24 19:29:00.921083 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:00.921049 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dns5q_e583cac8-fcbc-4fa3-a2f4-d8b1fad99146/global-pull-secret-syncer/0.log" Apr 24 19:29:01.023468 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:01.023392 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-67l2j_a96d8dd3-8216-48b7-a304-75026c92aa95/konnectivity-agent/0.log" Apr 24 19:29:01.131970 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:01.131936 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-52.ec2.internal_a2e4dc57eaf3ecec9842fb4e7d99fd2d/haproxy/0.log" Apr 24 19:29:04.720241 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:04.720086 2568 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd86bc9c-68eb-4f5a-a4a6-c34d485682b3/alertmanager/0.log" Apr 24 19:29:04.742448 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:04.742412 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd86bc9c-68eb-4f5a-a4a6-c34d485682b3/config-reloader/0.log" Apr 24 19:29:04.767847 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:04.767759 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd86bc9c-68eb-4f5a-a4a6-c34d485682b3/kube-rbac-proxy-web/0.log" Apr 24 19:29:04.797516 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:04.797447 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd86bc9c-68eb-4f5a-a4a6-c34d485682b3/kube-rbac-proxy/0.log" Apr 24 19:29:04.818689 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:04.818650 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd86bc9c-68eb-4f5a-a4a6-c34d485682b3/kube-rbac-proxy-metric/0.log" Apr 24 19:29:04.840630 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:04.840590 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd86bc9c-68eb-4f5a-a4a6-c34d485682b3/prom-label-proxy/0.log" Apr 24 19:29:04.863776 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:04.863744 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd86bc9c-68eb-4f5a-a4a6-c34d485682b3/init-config-reloader/0.log" Apr 24 19:29:04.905752 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:04.905713 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-9lv7p_7503386e-4776-4896-bd44-4ef455ac6b98/cluster-monitoring-operator/0.log" Apr 24 19:29:04.994265 ip-10-0-138-52 kubenswrapper[2568]: I0424 
19:29:04.994190 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-58584969c7-cvcpc_ee37efac-17ac-4808-b8db-0df62be52e08/metrics-server/0.log" Apr 24 19:29:05.015738 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.015707 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-4kktj_6dd835bc-4aa2-4709-9346-da69bea29d70/monitoring-plugin/0.log" Apr 24 19:29:05.110356 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.110308 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f8rpr_d5bc99d3-816b-40e4-958b-a40410565822/node-exporter/0.log" Apr 24 19:29:05.128866 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.128834 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f8rpr_d5bc99d3-816b-40e4-958b-a40410565822/kube-rbac-proxy/0.log" Apr 24 19:29:05.150082 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.150046 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f8rpr_d5bc99d3-816b-40e4-958b-a40410565822/init-textfile/0.log" Apr 24 19:29:05.237381 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.237299 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kq56v_ed502cab-a65d-4d95-91d4-aa59376937de/kube-rbac-proxy-main/0.log" Apr 24 19:29:05.256139 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.255950 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kq56v_ed502cab-a65d-4d95-91d4-aa59376937de/kube-rbac-proxy-self/0.log" Apr 24 19:29:05.276264 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.276232 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kq56v_ed502cab-a65d-4d95-91d4-aa59376937de/openshift-state-metrics/0.log" Apr 24 19:29:05.481262 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.481227 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-g6579_7b502bbd-c353-478c-a6e6-215d2dd5c38b/prometheus-operator-admission-webhook/0.log" Apr 24 19:29:05.511393 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.511307 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64769d95dd-hxrcw_79eea6fb-9659-4b97-86f9-704b91e40d4b/telemeter-client/0.log" Apr 24 19:29:05.531722 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.531688 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64769d95dd-hxrcw_79eea6fb-9659-4b97-86f9-704b91e40d4b/reload/0.log" Apr 24 19:29:05.555757 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.555727 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64769d95dd-hxrcw_79eea6fb-9659-4b97-86f9-704b91e40d4b/kube-rbac-proxy/0.log" Apr 24 19:29:05.586751 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.586721 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67f5cfbc9c-wfwqd_2c03d4b6-bd8b-48dd-8113-7e5008c145c1/thanos-query/0.log" Apr 24 19:29:05.606074 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.606042 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67f5cfbc9c-wfwqd_2c03d4b6-bd8b-48dd-8113-7e5008c145c1/kube-rbac-proxy-web/0.log" Apr 24 19:29:05.641663 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.641612 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67f5cfbc9c-wfwqd_2c03d4b6-bd8b-48dd-8113-7e5008c145c1/kube-rbac-proxy/0.log" 
Apr 24 19:29:05.662171 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.662134 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67f5cfbc9c-wfwqd_2c03d4b6-bd8b-48dd-8113-7e5008c145c1/prom-label-proxy/0.log"
Apr 24 19:29:05.679670 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.679624 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67f5cfbc9c-wfwqd_2c03d4b6-bd8b-48dd-8113-7e5008c145c1/kube-rbac-proxy-rules/0.log"
Apr 24 19:29:05.709193 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:05.709153 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67f5cfbc9c-wfwqd_2c03d4b6-bd8b-48dd-8113-7e5008c145c1/kube-rbac-proxy-metrics/0.log"
Apr 24 19:29:06.911874 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:06.911840 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-qxjnt_ef2156a7-e5a4-42ca-8af9-87d90a778914/networking-console-plugin/0.log"
Apr 24 19:29:07.314660 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:07.314591 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/1.log"
Apr 24 19:29:07.324068 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:07.324035 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zsl4c_f3d37867-8a80-4198-9320-281682c54121/console-operator/2.log"
Apr 24 19:29:07.679702 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:07.679671 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9c8765b65-dhrt8_49aa9000-7dea-4c2e-aeae-918f0ad7936b/console/0.log"
Apr 24 19:29:07.709056 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:07.709023 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-lp9kx_a2da8ca4-ca58-4a9f-b9fd-8ff5e9093163/download-server/0.log"
Apr 24 19:29:08.097291 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.097266 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-smbpl_10fdb32e-ef53-491d-922c-4d9e4f2531f0/volume-data-source-validator/0.log"
Apr 24 19:29:08.244990 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.244954 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"]
Apr 24 19:29:08.249542 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.249515 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.256990 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.256955 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"]
Apr 24 19:29:08.319242 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.319199 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zqc5\" (UniqueName: \"kubernetes.io/projected/724e1567-c845-4792-a0d3-efcfa305bdc9-kube-api-access-4zqc5\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.319514 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.319489 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/724e1567-c845-4792-a0d3-efcfa305bdc9-lib-modules\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.319647 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.319633 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/724e1567-c845-4792-a0d3-efcfa305bdc9-proc\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.319815 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.319800 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/724e1567-c845-4792-a0d3-efcfa305bdc9-podres\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.319979 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.319964 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/724e1567-c845-4792-a0d3-efcfa305bdc9-sys\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.421152 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.421036 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/724e1567-c845-4792-a0d3-efcfa305bdc9-podres\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.421313 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.421152 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/724e1567-c845-4792-a0d3-efcfa305bdc9-sys\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.421313 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.421213 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zqc5\" (UniqueName: \"kubernetes.io/projected/724e1567-c845-4792-a0d3-efcfa305bdc9-kube-api-access-4zqc5\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.421313 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.421241 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/724e1567-c845-4792-a0d3-efcfa305bdc9-lib-modules\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.421313 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.421262 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/724e1567-c845-4792-a0d3-efcfa305bdc9-proc\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.421313 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.421259 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/724e1567-c845-4792-a0d3-efcfa305bdc9-podres\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.421313 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.421279 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/724e1567-c845-4792-a0d3-efcfa305bdc9-sys\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.421553 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.421331 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/724e1567-c845-4792-a0d3-efcfa305bdc9-proc\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.421553 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.421476 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/724e1567-c845-4792-a0d3-efcfa305bdc9-lib-modules\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.430629 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.430602 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zqc5\" (UniqueName: \"kubernetes.io/projected/724e1567-c845-4792-a0d3-efcfa305bdc9-kube-api-access-4zqc5\") pod \"perf-node-gather-daemonset-wpsk2\" (UID: \"724e1567-c845-4792-a0d3-efcfa305bdc9\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.562478 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.562435 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:08.703796 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.703767 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"]
Apr 24 19:29:08.706223 ip-10-0-138-52 kubenswrapper[2568]: W0424 19:29:08.706190 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod724e1567_c845_4792_a0d3_efcfa305bdc9.slice/crio-b74aac41e51f0261fc069e37096074600d97b9724038292ece3a8a18003f4fd6 WatchSource:0}: Error finding container b74aac41e51f0261fc069e37096074600d97b9724038292ece3a8a18003f4fd6: Status 404 returned error can't find the container with id b74aac41e51f0261fc069e37096074600d97b9724038292ece3a8a18003f4fd6
Apr 24 19:29:08.762890 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.762846 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2" event={"ID":"724e1567-c845-4792-a0d3-efcfa305bdc9","Type":"ContainerStarted","Data":"b74aac41e51f0261fc069e37096074600d97b9724038292ece3a8a18003f4fd6"}
Apr 24 19:29:08.853558 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.853538 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qxfsm_85baa476-8a9a-44b6-83c0-0050c6c28921/dns/0.log"
Apr 24 19:29:08.883719 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:08.883691 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qxfsm_85baa476-8a9a-44b6-83c0-0050c6c28921/kube-rbac-proxy/0.log"
Apr 24 19:29:09.017548 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:09.017469 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r676m_c4b8f1da-e016-4496-a12c-19572f7ba9ad/dns-node-resolver/0.log"
Apr 24 19:29:09.473516 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:09.473456 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-29rc7_254ea4ca-f9d7-452a-9868-bdf3ef96512c/node-ca/0.log"
Apr 24 19:29:09.769006 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:09.768926 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2" event={"ID":"724e1567-c845-4792-a0d3-efcfa305bdc9","Type":"ContainerStarted","Data":"fe561a3fdaab8d60f25b54dd5d2cf9783c1769265730a40966d0f39bdbef1377"}
Apr 24 19:29:09.769006 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:09.768974 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:09.787948 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:09.787896 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2" podStartSLOduration=1.787881139 podStartE2EDuration="1.787881139s" podCreationTimestamp="2026-04-24 19:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:29:09.78687358 +0000 UTC m=+1352.898709925" watchObservedRunningTime="2026-04-24 19:29:09.787881139 +0000 UTC m=+1352.899717485"
Apr 24 19:29:10.261763 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:10.261718 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5dd48cf8b4-xldzq_b7f009e5-c09c-49ca-a4b9-f6dc4bf1ac3e/router/0.log"
Apr 24 19:29:10.629535 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:10.629504 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-sfgb9_94eee2e4-7d5d-49be-ab39-13cd92cf877f/serve-healthcheck-canary/0.log"
Apr 24 19:29:10.959961 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:10.959875 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-bzksz_5d21b7cf-8c3d-459b-a502-f049a7353d9f/insights-operator/0.log"
Apr 24 19:29:10.960163 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:10.959985 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-bzksz_5d21b7cf-8c3d-459b-a502-f049a7353d9f/insights-operator/1.log"
Apr 24 19:29:11.121431 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:11.121399 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tt4dv_762ba37c-b963-4e59-873c-3dbffed98ff1/kube-rbac-proxy/0.log"
Apr 24 19:29:11.139802 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:11.139772 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tt4dv_762ba37c-b963-4e59-873c-3dbffed98ff1/exporter/0.log"
Apr 24 19:29:11.157353 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:11.157321 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tt4dv_762ba37c-b963-4e59-873c-3dbffed98ff1/extractor/0.log"
Apr 24 19:29:13.110700 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:13.110651 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-8cdbbc8b5-rzvsk_cdfad2de-4a6e-4138-a96d-89aed3ea342f/manager/0.log"
Apr 24 19:29:13.147161 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:13.147130 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-khsnm_e9006ada-2f28-45d7-8189-31eb6ecc099e/server/0.log"
Apr 24 19:29:13.247282 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:13.247241 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-8pt7p_922f7ac3-263a-4a81-b9e9-3ef5c1024192/manager/0.log"
Apr 24 19:29:13.285568 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:13.285541 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-79f457656b-wvpkr_a652e541-bdba-43ae-b7ff-5f89059836aa/seaweedfs/0.log"
Apr 24 19:29:15.784117 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:15.784078 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-wpsk2"
Apr 24 19:29:17.381003 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:17.380961 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-2njp6_3a18605e-85a6-4562-acf8-4bef99990528/kube-storage-version-migrator-operator/1.log"
Apr 24 19:29:17.382433 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:17.382397 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-2njp6_3a18605e-85a6-4562-acf8-4bef99990528/kube-storage-version-migrator-operator/0.log"
Apr 24 19:29:18.480854 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:18.480823 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gbmct_bc00a2b3-d877-4828-bc79-de040ea70887/kube-multus-additional-cni-plugins/0.log"
Apr 24 19:29:18.501891 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:18.501862 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gbmct_bc00a2b3-d877-4828-bc79-de040ea70887/egress-router-binary-copy/0.log"
Apr 24 19:29:18.522286 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:18.522256 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gbmct_bc00a2b3-d877-4828-bc79-de040ea70887/cni-plugins/0.log"
Apr 24 19:29:18.540052 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:18.540024 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gbmct_bc00a2b3-d877-4828-bc79-de040ea70887/bond-cni-plugin/0.log"
Apr 24 19:29:18.557899 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:18.557877 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gbmct_bc00a2b3-d877-4828-bc79-de040ea70887/routeoverride-cni/0.log"
Apr 24 19:29:18.576148 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:18.576117 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gbmct_bc00a2b3-d877-4828-bc79-de040ea70887/whereabouts-cni-bincopy/0.log"
Apr 24 19:29:18.594842 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:18.594812 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gbmct_bc00a2b3-d877-4828-bc79-de040ea70887/whereabouts-cni/0.log"
Apr 24 19:29:18.798582 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:18.798499 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w4nmn_65e67b5d-323b-47d6-ac53-b3da03f832e6/kube-multus/0.log"
Apr 24 19:29:18.929254 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:18.929220 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nghhh_76172245-47dc-4f2f-90c9-d345a816e233/network-metrics-daemon/0.log"
Apr 24 19:29:18.957252 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:18.957199 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nghhh_76172245-47dc-4f2f-90c9-d345a816e233/kube-rbac-proxy/0.log"
Apr 24 19:29:19.758951 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:19.758920 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-controller/0.log"
Apr 24 19:29:19.777000 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:19.776908 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/0.log"
Apr 24 19:29:19.784319 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:19.784293 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovn-acl-logging/1.log"
Apr 24 19:29:19.800678 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:19.800644 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/kube-rbac-proxy-node/0.log"
Apr 24 19:29:19.817388 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:19.817359 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 19:29:19.843331 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:19.843309 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/northd/0.log"
Apr 24 19:29:19.861795 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:19.861765 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/nbdb/0.log"
Apr 24 19:29:19.885545 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:19.885512 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/sbdb/0.log"
Apr 24 19:29:19.998488 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:19.998449 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2thj7_4a6d24c7-d9ec-4b20-98cd-af5850b0074f/ovnkube-controller/0.log"
Apr 24 19:29:21.474234 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:21.474198 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-6gmp9_134e19f5-38b3-4160-8673-d35beeb0ed89/check-endpoints/0.log"
Apr 24 19:29:21.535207 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:21.535169 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-nlzd4_22f62d88-7d18-4fc4-a8b1-44efd0814325/network-check-target-container/0.log"
Apr 24 19:29:22.375865 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:22.375835 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-x2h2w_4c668390-7023-447b-92b4-e95d0c65f6cd/iptables-alerter/0.log"
Apr 24 19:29:23.003385 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:23.003353 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-9vthc_2cf109ce-0195-4971-a0e3-86ad92c8ed1f/tuned/0.log"
Apr 24 19:29:24.672701 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:24.672627 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-2rfkw_8e6a9b64-e53c-4f21-b0b9-61491d1bef6b/cluster-samples-operator/0.log"
Apr 24 19:29:24.687819 ip-10-0-138-52 kubenswrapper[2568]: I0424 19:29:24.687792 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-2rfkw_8e6a9b64-e53c-4f21-b0b9-61491d1bef6b/cluster-samples-operator-watch/0.log"