Apr 20 20:05:44.771555 ip-10-0-139-59 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 20:05:45.223687 ip-10-0-139-59 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:45.223687 ip-10-0-139-59 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 20:05:45.223687 ip-10-0-139-59 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:45.223687 ip-10-0-139-59 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 20:05:45.223687 ip-10-0-139-59 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
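The deprecation warnings above all point at the same remedy: move the flag values into the KubeletConfiguration file passed via --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump later in this log). A minimal sketch of the equivalent config-file fields follows; the concrete values are illustrative assumptions, not taken from this node:

```yaml
# Sketch only: KubeletConfiguration fields that replace the deprecated flags
# reported above. Field names are from the kubelet.config.k8s.io/v1beta1 API;
# the values shown here are placeholder examples, not this node's settings.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir (example path)
systemReserved:                # replaces --system-reserved (example values)
  cpu: 500m
  memory: 1Gi
evictionHard:                  # eviction thresholds replace --minimum-container-ttl-duration
  memory.available: 100Mi
```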
Apr 20 20:05:45.226520 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.226433 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 20:05:45.229596 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229582 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:45.229596 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229597 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229601 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229604 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229607 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229610 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229613 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229616 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229618 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229621 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229624 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229626 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229629 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229631 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229634 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229637 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229639 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229642 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229645 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229651 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:45.229656 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229654 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229656 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229659 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229662 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229665 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229667 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229670 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229672 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229675 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229677 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229680 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229682 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229685 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229687 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229690 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229692 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229694 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229697 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229699 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229702 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229705 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:45.230138 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229707 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229709 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229712 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229714 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229717 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229720 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229722 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229724 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229727 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229730 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229732 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229735 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229737 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229740 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229743 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229745 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229748 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229750 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229753 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:45.230673 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229755 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229759 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229761 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229764 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229767 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229769 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229774 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229778 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229781 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229784 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229787 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229789 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229792 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229794 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229797 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229799 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229802 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229804 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229807 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:45.231129 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229812 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229815 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229818 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229821 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229825 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229828 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.229830 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230218 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230223 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230227 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230230 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230233 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230236 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230238 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230242 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230246 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230249 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230251 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230254 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:45.231597 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230256 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230275 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230277 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230280 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230282 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230285 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230287 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230290 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230293 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230295 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230298 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230300 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230303 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230305 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230308 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230310 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230313 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230315 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230318 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230320 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:45.232054 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230324 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230326 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230329 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230332 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230334 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230336 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230339 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230341 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230346 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230349 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230353 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230355 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230358 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230361 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230363 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230365 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230368 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230370 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230372 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:45.232549 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230375 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230377 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230380 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230382 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230385 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230387 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230389 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230392 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230394 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230397 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230399 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230401 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230404 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230407 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230409 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230412 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230414 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230417 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230419 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230422 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:45.233001 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230424 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230426 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230429 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230431 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230434 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230436 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230439 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230441 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230444 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230446 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230448 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230451 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230454 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230457 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.230459 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230542 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230548 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230554 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230559 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230563 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230566 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 20:05:45.233500 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230570 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230575 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230579 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230582 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230586 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230589 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230592 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230595 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230598 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230601 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230604 2574 flags.go:64] FLAG: --cloud-config=""
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230606 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230609 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230613 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230616 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230619 2574 flags.go:64] FLAG: --config-dir=""
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230621 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230625 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230629 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230632 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230635 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230638 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230641 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230644 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 20:05:45.234018 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230647 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230650 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230652 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230657 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230660 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230663 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230666 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230669 2574 flags.go:64] FLAG: --enable-server="true"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230672 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230676 2574 flags.go:64] FLAG: --event-burst="100"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230680 2574 flags.go:64] FLAG: --event-qps="50"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230683 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230686 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230689 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230694 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230696 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230699 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230702 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230705 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230707 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230710 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230713 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230716 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230719 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230721 2574 flags.go:64] FLAG: --feature-gates=""
Apr 20 20:05:45.234608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230725 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230728 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230731 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230734 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230737 2574 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230740 2574 flags.go:64] FLAG: --help="false"
Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230743 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-139-59.ec2.internal"
Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230746 2574 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230749 2574 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230752 2574 flags.go:64] FLAG:
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230755 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230758 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230761 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230764 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230767 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230769 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230772 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230775 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230779 2574 flags.go:64] FLAG: --kube-reserved="" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230781 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230784 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230787 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230792 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 20:05:45.235236 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:05:45.230795 2574 flags.go:64] FLAG: --lock-file="" Apr 20 20:05:45.235236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230798 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230800 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230803 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230808 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230811 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230814 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230817 2574 flags.go:64] FLAG: --logging-format="text" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230820 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230823 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230826 2574 flags.go:64] FLAG: --manifest-url="" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230828 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230833 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230836 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230840 2574 flags.go:64] FLAG: --max-pods="110" Apr 20 20:05:45.235824 
ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230843 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230846 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230848 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230851 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230854 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230857 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230860 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230866 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230869 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230872 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 20:05:45.235824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230875 2574 flags.go:64] FLAG: --pod-cidr="" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230878 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230883 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230887 2574 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230890 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230893 2574 flags.go:64] FLAG: --port="10250" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230897 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230900 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0352b5e2086e867a5" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230903 2574 flags.go:64] FLAG: --qos-reserved="" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230906 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230909 2574 flags.go:64] FLAG: --register-node="true" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230911 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230914 2574 flags.go:64] FLAG: --register-with-taints="" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230917 2574 flags.go:64] FLAG: --registry-burst="10" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230920 2574 flags.go:64] FLAG: --registry-qps="5" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230923 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230926 2574 flags.go:64] FLAG: --reserved-memory="" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230929 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230932 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 
20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230935 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230937 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230941 2574 flags.go:64] FLAG: --runonce="false" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230944 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230947 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230950 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 20 20:05:45.236419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230953 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230956 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230959 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230962 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230965 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230968 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230971 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230973 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:05:45.230976 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230979 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230983 2574 flags.go:64] FLAG: --system-cgroups="" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230986 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230991 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230995 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.230998 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.231003 2574 flags.go:64] FLAG: --tls-min-version="" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.231005 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.231008 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.231011 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.231014 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.231017 2574 flags.go:64] FLAG: --v="2" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.231021 2574 flags.go:64] FLAG: --version="false" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.231025 2574 flags.go:64] FLAG: --vmodule="" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:05:45.231034 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.231037 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 20:05:45.237046 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231130 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231134 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231136 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231139 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231143 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231145 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231148 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231150 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231153 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231156 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231158 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: 
W0420 20:05:45.231160 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231163 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231165 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231168 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231170 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231173 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231175 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231178 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231180 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:05:45.237678 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231184 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231186 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231189 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231192 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 
20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231194 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231196 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231199 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231201 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231204 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231206 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231208 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231211 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231213 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231216 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231218 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231220 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231224 2574 
feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231227 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231229 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:05:45.238206 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231232 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231234 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231237 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231239 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231241 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231244 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231246 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231248 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231251 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231253 2574 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231271 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231276 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231279 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231283 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231286 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231288 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231291 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231294 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231297 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:05:45.238693 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231299 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231302 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231305 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 
20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231308 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231311 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231313 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231316 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231319 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231321 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231324 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231326 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231329 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231332 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231334 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231337 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231339 2574 feature_gate.go:328] unrecognized feature 
gate: AWSClusterHostedDNS Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231342 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231345 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231348 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231350 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:05:45.239169 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231352 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231355 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231357 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231361 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231364 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231366 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231370 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.231373 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.232050 2574 
feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.239421 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.239437 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239484 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239489 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239494 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239498 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:45.239680 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239502 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239504 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239507 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239510 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239512 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239515 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239517 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239519 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239522 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239525 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239527 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239529 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239532 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239534 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239542 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239544 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239547 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239550 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239553 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239556 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:45.240102 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239558 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239561 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239563 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239566 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239568 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239571 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239573 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239577 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239581 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239583 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239586 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239588 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239591 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239593 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239596 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239598 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239601 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239603 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239606 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:45.240607 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239608 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239610 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239613 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239615 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239618 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239620 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239623 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239625 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239628 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239631 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239633 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239636 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239638 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239641 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239643 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239646 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239648 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239651 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239653 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239655 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:45.241149 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239659 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239662 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239664 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239666 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239669 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239671 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239674 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239676 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239678 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239681 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239684 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239686 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239689 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239692 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239694 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239696 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239699 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239701 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239703 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:45.241738 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239707 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239710 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239713 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239715 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.239720 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239814 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239819 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239823 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239826 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239829 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239832 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239835 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239837 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239840 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239843 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:45.242211 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239846 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239849 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239851 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239854 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239856 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239858 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239861 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239863 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239866 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239869 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239871 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239874 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239876 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239879 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239881 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239884 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239886 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239888 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239891 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239893 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:45.242599 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239896 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239898 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239901 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239903 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239906 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239908 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239910 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239913 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239915 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239917 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239920 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239923 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239926 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239930 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239933 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239935 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239937 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239940 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239943 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239945 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:45.243090 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239948 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239950 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239952 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239955 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239957 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239960 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239962 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239964 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239967 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239969 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239972 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239974 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239976 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239979 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239981 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239984 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239986 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239988 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239991 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239993 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:45.243592 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239995 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.239998 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240000 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240003 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240005 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240008 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240011 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240013 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240016 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240018 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240021 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240032 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240034 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240037 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240040 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:45.244141 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:45.240042 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:45.244521 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.240047 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:45.244521 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.240743 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 20:05:45.244521 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.242740 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 20:05:45.244521 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.243623 2574 server.go:1019] "Starting client certificate rotation"
Apr 20 20:05:45.244521 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.243728 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:05:45.244663 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.244618 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:05:45.270222 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.270202 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:05:45.275433 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.275252 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:05:45.294685 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.294662 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 20 20:05:45.300640 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.300621 2574 log.go:25] "Validated CRI v1 image API"
Apr 20 20:05:45.303004 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.302986 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 20:05:45.305821 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.305799 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:05:45.306328 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.306305 2574 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 eb2b88f7-a348-44d8-b262-024b84387742:/dev/nvme0n1p4 f682b90f-68be-48cb-b1a2-d6b39f4f77a1:/dev/nvme0n1p3]
Apr 20 20:05:45.306369 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.306330 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 20:05:45.312145 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.312037 2574 manager.go:217] Machine: {Timestamp:2026-04-20 20:05:45.310747714 +0000 UTC m=+0.419803413 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098593 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec269157e71f4e28e4dd21861a5ee459 SystemUUID:ec269157-e71f-4e28-e4dd-21861a5ee459 BootID:dcfdf7e3-2d7b-4618-9fa4-8b45e0fbd158 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:94:9a:5d:1a:03 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:94:9a:5d:1a:03 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:eb:ef:c7:c7:58 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 20:05:45.312145 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.312141 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 20:05:45.312248 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.312219 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 20:05:45.313936 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.313912 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 20:05:45.314091 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.313939 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-59.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 20:05:45.314135 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.314103 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 20:05:45.314135 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.314112 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 20:05:45.314135 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.314125 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 20:05:45.314135 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.314135 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 20:05:45.314991 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.314981 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 20:05:45.315097 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.315089 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 20:05:45.317561 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.317552 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 20:05:45.317601 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.317566 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 20:05:45.317601 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.317581 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 20:05:45.317601 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.317592 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 20 20:05:45.317601 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.317600 2574 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 20 20:05:45.318745 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.318733 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:05:45.318799 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.318753 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:05:45.321800 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.321773 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 20:05:45.323238 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.323221 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 20:05:45.324992 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.324980 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 20:05:45.325038 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.324997 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 20:05:45.325038 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.325003 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 20:05:45.325038 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.325008 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 20:05:45.325038 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.325013 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 20:05:45.325038 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.325019 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 20:05:45.325038 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.325025 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 
20:05:45.325038 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.325030 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 20:05:45.325038 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.325036 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 20:05:45.325240 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.325042 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 20:05:45.325240 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.325066 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 20:05:45.325240 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.325075 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 20:05:45.325986 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.325974 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 20:05:45.325986 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.325988 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 20:05:45.330131 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.330109 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 20:05:45.330218 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.330201 2574 server.go:1295] "Started kubelet" Apr 20 20:05:45.330339 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.330312 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 20:05:45.330399 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.330318 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 20:05:45.330399 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.330390 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 20:05:45.331091 ip-10-0-139-59 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 20:05:45.331755 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.331577 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 20:05:45.331958 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.331897 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-59.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 20:05:45.332050 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.332009 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 20:05:45.332238 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.332199 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-59.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 20:05:45.332968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.332956 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 20 20:05:45.336192 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.336174 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rzv8r" Apr 20 20:05:45.336717 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.336697 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 20:05:45.337326 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.337304 2574 fs_resource_analyzer.go:67] 
"Starting FS ResourceAnalyzer" Apr 20 20:05:45.337469 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.336389 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-59.ec2.internal.18a82960025118f7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-59.ec2.internal,UID:ip-10-0-139-59.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-59.ec2.internal,},FirstTimestamp:2026-04-20 20:05:45.330129143 +0000 UTC m=+0.439184841,LastTimestamp:2026-04-20 20:05:45.330129143 +0000 UTC m=+0.439184841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-59.ec2.internal,}" Apr 20 20:05:45.338227 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.338208 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 20:05:45.338227 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.338228 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 20:05:45.338344 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.338334 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 20:05:45.338392 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.338383 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 20 20:05:45.338425 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.338393 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 20 20:05:45.338505 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.338490 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-59.ec2.internal\" not found" Apr 20 20:05:45.339074 
ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.339056 2574 factory.go:55] Registering systemd factory Apr 20 20:05:45.339158 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.339081 2574 factory.go:223] Registration of the systemd container factory successfully Apr 20 20:05:45.339326 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.339310 2574 factory.go:153] Registering CRI-O factory Apr 20 20:05:45.339326 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.339323 2574 factory.go:223] Registration of the crio container factory successfully Apr 20 20:05:45.339467 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.339388 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 20:05:45.339467 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.339410 2574 factory.go:103] Registering Raw factory Apr 20 20:05:45.339467 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.339423 2574 manager.go:1196] Started watching for new ooms in manager Apr 20 20:05:45.339976 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.339957 2574 manager.go:319] Starting recovery of all containers Apr 20 20:05:45.340989 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.340970 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 20:05:45.344048 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.344014 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rzv8r" Apr 20 20:05:45.345809 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.345782 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 20:05:45.346103 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.345908 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-139-59.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 20:05:45.351807 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.351789 2574 manager.go:324] Recovery completed Apr 20 20:05:45.356581 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.356568 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:45.359708 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.359693 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:45.359764 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.359720 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:45.359764 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.359731 2574 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-139-59.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:45.360207 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.360194 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 20:05:45.360207 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.360205 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 20:05:45.360311 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.360219 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:05:45.362478 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.362467 2574 policy_none.go:49] "None policy: Start" Apr 20 20:05:45.362522 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.362482 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 20:05:45.362522 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.362492 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 20 20:05:45.402821 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.402802 2574 manager.go:341] "Starting Device Plugin manager" Apr 20 20:05:45.416173 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.402830 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 20:05:45.416173 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.402839 2574 server.go:85] "Starting device plugin registration server" Apr 20 20:05:45.416173 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.403023 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 20:05:45.416173 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.403047 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 20:05:45.416173 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.403142 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 20:05:45.416173 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.403230 
2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 20:05:45.416173 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.403240 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 20:05:45.416173 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.404004 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 20:05:45.416173 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.404049 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-59.ec2.internal\" not found" Apr 20 20:05:45.476718 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.476669 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 20:05:45.477907 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.477887 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 20:05:45.477996 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.477911 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 20:05:45.477996 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.477927 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 20:05:45.477996 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.477934 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 20:05:45.477996 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.477963 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 20:05:45.480496 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.480480 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:45.503581 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.503560 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:45.504412 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.504399 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:45.504468 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.504428 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:45.504468 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.504442 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:45.504468 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.504463 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.510431 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.510418 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.510479 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.510437 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-59.ec2.internal\": node \"ip-10-0-139-59.ec2.internal\" not found" Apr 20 20:05:45.529897 
ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.529879 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-59.ec2.internal\" not found" Apr 20 20:05:45.579647 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.579626 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-59.ec2.internal"] Apr 20 20:05:45.579696 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.579689 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:45.580870 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.580855 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:45.580928 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.580884 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:45.580928 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.580894 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:45.582103 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.582091 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:45.582248 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.582232 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.582301 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.582289 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:45.582781 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.582759 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:45.582781 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.582774 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:45.582918 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.582791 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:45.582918 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.582798 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:45.582918 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.582804 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:45.582918 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.582811 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:45.584049 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.584032 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.584127 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.584062 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:45.584736 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.584721 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:45.584796 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.584758 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:45.584796 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.584772 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:45.605882 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.605863 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-59.ec2.internal\" not found" node="ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.610210 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.610193 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-59.ec2.internal\" not found" node="ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.630242 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.630222 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-59.ec2.internal\" not found" Apr 20 20:05:45.640354 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.640335 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ba01067d8d4b51ceee17bbe157f40932-config\") pod 
\"kube-apiserver-proxy-ip-10-0-139-59.ec2.internal\" (UID: \"ba01067d8d4b51ceee17bbe157f40932\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.640444 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.640358 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d9c0efa17a7cb126cc03dc4516e395e4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal\" (UID: \"d9c0efa17a7cb126cc03dc4516e395e4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.640444 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.640376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9c0efa17a7cb126cc03dc4516e395e4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal\" (UID: \"d9c0efa17a7cb126cc03dc4516e395e4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.730743 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.730696 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-59.ec2.internal\" not found" Apr 20 20:05:45.741217 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.741196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d9c0efa17a7cb126cc03dc4516e395e4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal\" (UID: \"d9c0efa17a7cb126cc03dc4516e395e4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.741293 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.741224 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d9c0efa17a7cb126cc03dc4516e395e4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal\" (UID: \"d9c0efa17a7cb126cc03dc4516e395e4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.741293 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.741247 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ba01067d8d4b51ceee17bbe157f40932-config\") pod \"kube-apiserver-proxy-ip-10-0-139-59.ec2.internal\" (UID: \"ba01067d8d4b51ceee17bbe157f40932\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.741371 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.741292 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9c0efa17a7cb126cc03dc4516e395e4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal\" (UID: \"d9c0efa17a7cb126cc03dc4516e395e4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.741371 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.741298 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d9c0efa17a7cb126cc03dc4516e395e4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal\" (UID: \"d9c0efa17a7cb126cc03dc4516e395e4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.741371 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.741335 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ba01067d8d4b51ceee17bbe157f40932-config\") pod \"kube-apiserver-proxy-ip-10-0-139-59.ec2.internal\" (UID: \"ba01067d8d4b51ceee17bbe157f40932\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.831525 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.831503 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-59.ec2.internal\" not found" Apr 20 20:05:45.907988 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.907964 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.912504 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:45.912488 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-59.ec2.internal" Apr 20 20:05:45.932036 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:45.932016 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-59.ec2.internal\" not found" Apr 20 20:05:46.032610 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:46.032558 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-59.ec2.internal\" not found" Apr 20 20:05:46.133054 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:46.133032 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-59.ec2.internal\" not found" Apr 20 20:05:46.233603 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:46.233580 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-59.ec2.internal\" not found" Apr 20 20:05:46.243908 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.243890 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 20:05:46.244029 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.244014 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 20:05:46.334086 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:46.334030 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-59.ec2.internal\" not found" Apr 20 20:05:46.337184 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.337170 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 20:05:46.346493 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.346457 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 20:00:45 +0000 UTC" deadline="2028-01-02 19:47:32.352705038 +0000 UTC" Apr 20 20:05:46.346493 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.346487 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14927h41m46.006222523s" Apr 20 20:05:46.351965 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.351950 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 20:05:46.423971 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.423948 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:46.437927 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.437902 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" Apr 20 20:05:46.448772 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.448753 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS 
label is recommended: [must not contain dots]" Apr 20 20:05:46.450360 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.450348 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-59.ec2.internal" Apr 20 20:05:46.458012 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.457995 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:05:46.459116 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.459100 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-746z6" Apr 20 20:05:46.462919 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.462903 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:46.468290 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.468254 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-746z6" Apr 20 20:05:46.475219 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:46.475189 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c0efa17a7cb126cc03dc4516e395e4.slice/crio-2a5694b988ce14cf6b66ee028c8c4ae31fb90ac9f78121e7370d775e6f27c256 WatchSource:0}: Error finding container 2a5694b988ce14cf6b66ee028c8c4ae31fb90ac9f78121e7370d775e6f27c256: Status 404 returned error can't find the container with id 2a5694b988ce14cf6b66ee028c8c4ae31fb90ac9f78121e7370d775e6f27c256 Apr 20 20:05:46.475579 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:46.475559 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba01067d8d4b51ceee17bbe157f40932.slice/crio-639897ad395d0da5a6324b266ca0c2d66dd2ca78b37b9dea7de1950a8b2f8793 
WatchSource:0}: Error finding container 639897ad395d0da5a6324b266ca0c2d66dd2ca78b37b9dea7de1950a8b2f8793: Status 404 returned error can't find the container with id 639897ad395d0da5a6324b266ca0c2d66dd2ca78b37b9dea7de1950a8b2f8793 Apr 20 20:05:46.480077 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.480057 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:05:46.480860 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.480794 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-59.ec2.internal" event={"ID":"ba01067d8d4b51ceee17bbe157f40932","Type":"ContainerStarted","Data":"639897ad395d0da5a6324b266ca0c2d66dd2ca78b37b9dea7de1950a8b2f8793"} Apr 20 20:05:46.482061 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.482040 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" event={"ID":"d9c0efa17a7cb126cc03dc4516e395e4","Type":"ContainerStarted","Data":"2a5694b988ce14cf6b66ee028c8c4ae31fb90ac9f78121e7370d775e6f27c256"} Apr 20 20:05:46.649735 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:46.649669 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:47.318533 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.318490 2574 apiserver.go:52] "Watching apiserver" Apr 20 20:05:47.325702 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.325682 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 20:05:47.326052 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.326030 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/kube-apiserver-proxy-ip-10-0-139-59.ec2.internal","openshift-cluster-node-tuning-operator/tuned-9dtpd","openshift-dns/node-resolver-mt4p7","openshift-image-registry/node-ca-lwzcq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal","openshift-multus/multus-additional-cni-plugins-vqc62","openshift-multus/multus-kgzdk","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl","openshift-multus/network-metrics-daemon-kzfxf","openshift-network-diagnostics/network-check-target-j74qz","openshift-network-operator/iptables-alerter-t2pv6","openshift-ovn-kubernetes/ovnkube-node-8fzv6","kube-system/konnectivity-agent-2chhh"] Apr 20 20:05:47.327842 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.327818 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.330080 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.330054 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.330177 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.330132 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-mt4p7" Apr 20 20:05:47.330698 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.330679 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 20:05:47.330774 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.330679 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 20:05:47.330822 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.330687 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 20:05:47.330822 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.330818 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dgxxt\"" Apr 20 20:05:47.330968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.330953 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 20:05:47.331699 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.331676 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lwzcq" Apr 20 20:05:47.332369 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.332349 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:05:47.332640 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.332622 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 20:05:47.332640 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.332638 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wrl75\"" Apr 20 20:05:47.332806 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.332792 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 20:05:47.332870 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.332854 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-52c62\"" Apr 20 20:05:47.332920 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.332910 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 20:05:47.334204 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.334035 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 20:05:47.334204 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.334056 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 20:05:47.334204 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.334064 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-n2vgw\"" Apr 20 20:05:47.334204 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.334059 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 20:05:47.334472 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.334219 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.335512 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.335494 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.335605 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.335579 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:47.335662 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:47.335639 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081" Apr 20 20:05:47.336245 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.336224 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 20:05:47.336338 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.336293 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 20:05:47.336338 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.336304 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-prfnm\"" Apr 20 20:05:47.337448 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.337423 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 20:05:47.337551 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.337428 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 20:05:47.337746 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.337730 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 20:05:47.337922 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.337910 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-nr8hs\"" Apr 20 20:05:47.339137 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.339118 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:05:47.339232 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:47.339182 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448" Apr 20 20:05:47.339232 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.339226 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t2pv6" Apr 20 20:05:47.340915 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.340898 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.341477 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.341460 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:05:47.341589 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.341570 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-f8h82\"" Apr 20 20:05:47.341651 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.341608 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 20:05:47.341651 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.341617 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 20:05:47.342073 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.342059 2574 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2chhh" Apr 20 20:05:47.344950 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.344929 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 20:05:47.345046 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.344948 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 20:05:47.349017 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.345653 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 20:05:47.349017 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.347006 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vl2jc\"" Apr 20 20:05:47.349017 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.347207 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 20:05:47.349017 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.347291 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 20:05:47.349017 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.347474 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 20:05:47.349017 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.347697 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 20:05:47.349017 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.347841 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 20:05:47.349017 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.348056 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jzfff\"" Apr 20 20:05:47.349469 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.349448 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a364d155-96d2-457b-8944-52e7438d87e8-host-slash\") pod \"iptables-alerter-t2pv6\" (UID: \"a364d155-96d2-457b-8944-52e7438d87e8\") " pod="openshift-network-operator/iptables-alerter-t2pv6" Apr 20 20:05:47.349577 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.349488 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-cnibin\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.349577 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.349552 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-socket-dir\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.349682 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.349614 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-registration-dir\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.349777 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.349730 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-cnibin\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.349834 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.349798 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-run-k8s-cni-cncf-io\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.349967 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.349947 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-sys\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.350057 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.349987 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0d4eff69-9408-40ee-8a0b-0bbf888c3b7d-hosts-file\") pod \"node-resolver-mt4p7\" (UID: \"0d4eff69-9408-40ee-8a0b-0bbf888c3b7d\") " pod="openshift-dns/node-resolver-mt4p7" Apr 20 20:05:47.350057 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350052 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-multus-cni-dir\") pod \"multus-kgzdk\" (UID: 
\"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.350162 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350078 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-var-lib-kubelet\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.350162 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-lib-modules\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.350162 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350137 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-os-release\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.350533 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350511 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ktfh\" (UniqueName: \"kubernetes.io/projected/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-kube-api-access-9ktfh\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.350595 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350550 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-etc-selinux\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.350595 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350575 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvqm7\" (UniqueName: \"kubernetes.io/projected/89492d29-88c3-44e3-adc2-eda0304a1081-kube-api-access-wvqm7\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:47.350676 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-system-cni-dir\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.350676 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350632 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-host\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.350676 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf736813-f1cd-49b9-8595-27549c6fa9cb-tmp\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.350797 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350678 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-os-release\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.350797 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-var-lib-cni-multus\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.350797 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350758 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p55b\" (UniqueName: \"kubernetes.io/projected/801646f7-e8a7-4096-8154-5d74d8e4778d-kube-api-access-5p55b\") pod \"node-ca-lwzcq\" (UID: \"801646f7-e8a7-4096-8154-5d74d8e4778d\") " pod="openshift-image-registry/node-ca-lwzcq" Apr 20 20:05:47.350797 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350784 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.350935 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-kubernetes\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.350935 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350836 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a364d155-96d2-457b-8944-52e7438d87e8-iptables-alerter-script\") pod \"iptables-alerter-t2pv6\" (UID: \"a364d155-96d2-457b-8944-52e7438d87e8\") " pod="openshift-network-operator/iptables-alerter-t2pv6" Apr 20 20:05:47.350935 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350867 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-multus-conf-dir\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.350935 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350890 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-etc-kubernetes\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.350935 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tf7m\" (UniqueName: \"kubernetes.io/projected/0d4eff69-9408-40ee-8a0b-0bbf888c3b7d-kube-api-access-7tf7m\") pod \"node-resolver-mt4p7\" (UID: \"0d4eff69-9408-40ee-8a0b-0bbf888c3b7d\") " pod="openshift-dns/node-resolver-mt4p7" Apr 20 20:05:47.351098 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350939 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.351098 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.350976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-sys-fs\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.351098 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351012 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dkzn\" (UniqueName: \"kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn\") pod \"network-check-target-j74qz\" (UID: \"cfcf938f-bded-4945-9620-473df1d1b448\") " pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:05:47.351098 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351036 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-device-dir\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.351098 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351061 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-multus-socket-dir-parent\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 
20:05:47.351098 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351086 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-modprobe-d\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351106 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-run-multus-certs\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351121 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/801646f7-e8a7-4096-8154-5d74d8e4778d-serviceca\") pod \"node-ca-lwzcq\" (UID: \"801646f7-e8a7-4096-8154-5d74d8e4778d\") " pod="openshift-image-registry/node-ca-lwzcq" Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-var-lib-cni-bin\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351161 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-sysctl-d\") pod \"tuned-9dtpd\" (UID: 
\"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351184 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvfzn\" (UniqueName: \"kubernetes.io/projected/a364d155-96d2-457b-8944-52e7438d87e8-kube-api-access-kvfzn\") pod \"iptables-alerter-t2pv6\" (UID: \"a364d155-96d2-457b-8944-52e7438d87e8\") " pod="openshift-network-operator/iptables-alerter-t2pv6" Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351207 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351229 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxb58\" (UniqueName: \"kubernetes.io/projected/7e32072c-26b6-4466-b63a-3602df3f45f5-kube-api-access-cxb58\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351252 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-run\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351293 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-system-cni-dir\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62"
Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-hostroot\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351336 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e32072c-26b6-4466-b63a-3602df3f45f5-multus-daemon-config\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351359 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-systemd\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.351370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62"
Apr 20 20:05:47.351963 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-sysctl-conf\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.351963 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-var-lib-kubelet\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.351963 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-tuned\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.351963 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l956t\" (UniqueName: \"kubernetes.io/projected/cf736813-f1cd-49b9-8595-27549c6fa9cb-kube-api-access-l956t\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.351963 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62"
Apr 20 20:05:47.351963 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351604 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sw5p\" (UniqueName: \"kubernetes.io/projected/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-kube-api-access-8sw5p\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl"
Apr 20 20:05:47.351963 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351632 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-sysconfig\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.351963 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351661 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62"
Apr 20 20:05:47.351963 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-run-netns\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.351963 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351731 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0d4eff69-9408-40ee-8a0b-0bbf888c3b7d-tmp-dir\") pod \"node-resolver-mt4p7\" (UID: \"0d4eff69-9408-40ee-8a0b-0bbf888c3b7d\") " pod="openshift-dns/node-resolver-mt4p7"
Apr 20 20:05:47.351963 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351755 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/801646f7-e8a7-4096-8154-5d74d8e4778d-host\") pod \"node-ca-lwzcq\" (UID: \"801646f7-e8a7-4096-8154-5d74d8e4778d\") " pod="openshift-image-registry/node-ca-lwzcq"
Apr 20 20:05:47.351963 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.351778 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e32072c-26b6-4466-b63a-3602df3f45f5-cni-binary-copy\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.413996 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.413975 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:05:47.439962 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.439941 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 20:05:47.452697 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452670 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-run-systemd\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.452791 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3e016fd4-9b61-4ff9-bcfa-b119516354d0-konnectivity-ca\") pod \"konnectivity-agent-2chhh\" (UID: \"3e016fd4-9b61-4ff9-bcfa-b119516354d0\") " pod="kube-system/konnectivity-agent-2chhh"
Apr 20 20:05:47.452791 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452743 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p55b\" (UniqueName: \"kubernetes.io/projected/801646f7-e8a7-4096-8154-5d74d8e4778d-kube-api-access-5p55b\") pod \"node-ca-lwzcq\" (UID: \"801646f7-e8a7-4096-8154-5d74d8e4778d\") " pod="openshift-image-registry/node-ca-lwzcq"
Apr 20 20:05:47.452791 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl"
Apr 20 20:05:47.452927 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-kubernetes\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.452927 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452832 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-systemd-units\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.452927 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452855 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-cni-bin\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.452927 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl"
Apr 20 20:05:47.452927 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452883 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a364d155-96d2-457b-8944-52e7438d87e8-iptables-alerter-script\") pod \"iptables-alerter-t2pv6\" (UID: \"a364d155-96d2-457b-8944-52e7438d87e8\") " pod="openshift-network-operator/iptables-alerter-t2pv6"
Apr 20 20:05:47.452927 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-multus-conf-dir\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.453139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452932 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-etc-kubernetes\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.453139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452944 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-kubernetes\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.453139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tf7m\" (UniqueName: \"kubernetes.io/projected/0d4eff69-9408-40ee-8a0b-0bbf888c3b7d-kube-api-access-7tf7m\") pod \"node-resolver-mt4p7\" (UID: \"0d4eff69-9408-40ee-8a0b-0bbf888c3b7d\") " pod="openshift-dns/node-resolver-mt4p7"
Apr 20 20:05:47.453139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.452984 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-run-ovn-kubernetes\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.453139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453009 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-env-overrides\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.453139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453023 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-etc-kubernetes\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.453139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453036 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62"
Apr 20 20:05:47.453139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-sys-fs\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl"
Apr 20 20:05:47.453139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453079 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-multus-conf-dir\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.453139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453084 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dkzn\" (UniqueName: \"kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn\") pod \"network-check-target-j74qz\" (UID: \"cfcf938f-bded-4945-9620-473df1d1b448\") " pod="openshift-network-diagnostics/network-check-target-j74qz"
Apr 20 20:05:47.453139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453109 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-device-dir\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl"
Apr 20 20:05:47.453139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-multus-socket-dir-parent\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-modprobe-d\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453191 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453219 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3e016fd4-9b61-4ff9-bcfa-b119516354d0-agent-certs\") pod \"konnectivity-agent-2chhh\" (UID: \"3e016fd4-9b61-4ff9-bcfa-b119516354d0\") " pod="kube-system/konnectivity-agent-2chhh"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453245 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-run-multus-certs\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453309 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-ovnkube-script-lib\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453339 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/801646f7-e8a7-4096-8154-5d74d8e4778d-serviceca\") pod \"node-ca-lwzcq\" (UID: \"801646f7-e8a7-4096-8154-5d74d8e4778d\") " pod="openshift-image-registry/node-ca-lwzcq"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453345 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-device-dir\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453371 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-var-lib-cni-bin\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453399 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-sysctl-d\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453424 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-etc-openvswitch\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453449 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-node-log\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453475 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a364d155-96d2-457b-8944-52e7438d87e8-iptables-alerter-script\") pod \"iptables-alerter-t2pv6\" (UID: \"a364d155-96d2-457b-8944-52e7438d87e8\") " pod="openshift-network-operator/iptables-alerter-t2pv6"
Apr 20 20:05:47.453548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453559 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-sys-fs\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453475 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvfzn\" (UniqueName: \"kubernetes.io/projected/a364d155-96d2-457b-8944-52e7438d87e8-kube-api-access-kvfzn\") pod \"iptables-alerter-t2pv6\" (UID: \"a364d155-96d2-457b-8944-52e7438d87e8\") " pod="openshift-network-operator/iptables-alerter-t2pv6"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453619 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxb58\" (UniqueName: \"kubernetes.io/projected/7e32072c-26b6-4466-b63a-3602df3f45f5-kube-api-access-cxb58\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-run\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453683 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-multus-socket-dir-parent\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453703 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-var-lib-openvswitch\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453730 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-cni-netd\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453756 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-system-cni-dir\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-hostroot\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453800 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-modprobe-d\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453822 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e32072c-26b6-4466-b63a-3602df3f45f5-multus-daemon-config\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453846 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-systemd\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453868 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-run-multus-certs\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-run-netns\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.453933 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.453932 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-hostroot\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.454428 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454087 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/801646f7-e8a7-4096-8154-5d74d8e4778d-serviceca\") pod \"node-ca-lwzcq\" (UID: \"801646f7-e8a7-4096-8154-5d74d8e4778d\") " pod="openshift-image-registry/node-ca-lwzcq"
Apr 20 20:05:47.454428 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-run\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.454428 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454144 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-var-lib-cni-bin\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.454428 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454188 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-systemd\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.454428 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:47.454229 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:05:47.454428 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-run-ovn\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.454428 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:47.454320 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs podName:89492d29-88c3-44e3-adc2-eda0304a1081 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:47.954286079 +0000 UTC m=+3.063341784 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs") pod "network-metrics-daemon-kzfxf" (UID: "89492d29-88c3-44e3-adc2-eda0304a1081") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:05:47.454428 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-system-cni-dir\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62"
Apr 20 20:05:47.454716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e32072c-26b6-4466-b63a-3602df3f45f5-multus-daemon-config\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.454716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-sysctl-d\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.454716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454494 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62"
Apr 20 20:05:47.454716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454516 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-sysctl-conf\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.454716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454537 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-var-lib-kubelet\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.454716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-tuned\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.454716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454580 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l956t\" (UniqueName: \"kubernetes.io/projected/cf736813-f1cd-49b9-8595-27549c6fa9cb-kube-api-access-l956t\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.454716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62"
Apr 20 20:05:47.454716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454659 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8sw5p\" (UniqueName: \"kubernetes.io/projected/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-kube-api-access-8sw5p\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl"
Apr 20 20:05:47.454716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-sysconfig\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.454716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454686 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-sysctl-conf\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454734 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-sysconfig\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454744 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62"
Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454773 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-run-netns\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454776 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-var-lib-kubelet\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd"
Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0d4eff69-9408-40ee-8a0b-0bbf888c3b7d-tmp-dir\") pod \"node-resolver-mt4p7\" (UID: \"0d4eff69-9408-40ee-8a0b-0bbf888c3b7d\") " pod="openshift-dns/node-resolver-mt4p7"
Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-ovnkube-config\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454849 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-ovn-node-metrics-cert\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454873 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/801646f7-e8a7-4096-8154-5d74d8e4778d-host\") pod \"node-ca-lwzcq\" (UID: \"801646f7-e8a7-4096-8154-5d74d8e4778d\") " pod="openshift-image-registry/node-ca-lwzcq"
Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454899 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e32072c-26b6-4466-b63a-3602df3f45f5-cni-binary-copy\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk"
Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454924 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-kubelet\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454949 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75llb\" (UniqueName: \"kubernetes.io/projected/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-kube-api-access-75llb\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454974 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a364d155-96d2-457b-8944-52e7438d87e8-host-slash\") pod \"iptables-alerter-t2pv6\" (UID: \"a364d155-96d2-457b-8944-52e7438d87e8\") " pod="openshift-network-operator/iptables-alerter-t2pv6" Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.454980 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455003 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-cnibin\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:05:47.455031 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-socket-dir\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-run-netns\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.455191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455075 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/801646f7-e8a7-4096-8154-5d74d8e4778d-host\") pod \"node-ca-lwzcq\" (UID: \"801646f7-e8a7-4096-8154-5d74d8e4778d\") " pod="openshift-image-registry/node-ca-lwzcq" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-registration-dir\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455127 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-registration-dir\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.455959 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:05:47.455131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-cnibin\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455168 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-run-k8s-cni-cncf-io\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455195 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-sys\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455230 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0d4eff69-9408-40ee-8a0b-0bbf888c3b7d-hosts-file\") pod \"node-resolver-mt4p7\" (UID: \"0d4eff69-9408-40ee-8a0b-0bbf888c3b7d\") " pod="openshift-dns/node-resolver-mt4p7" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455256 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-multus-cni-dir\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455305 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-var-lib-kubelet\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-lib-modules\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455310 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0d4eff69-9408-40ee-8a0b-0bbf888c3b7d-tmp-dir\") pod \"node-resolver-mt4p7\" (UID: \"0d4eff69-9408-40ee-8a0b-0bbf888c3b7d\") " pod="openshift-dns/node-resolver-mt4p7" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455359 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-cnibin\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455393 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-slash\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455403 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-run-k8s-cni-cncf-io\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455429 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-sys\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-run-openvswitch\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455472 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-cnibin\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455483 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-log-socket\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.455959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455490 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0d4eff69-9408-40ee-8a0b-0bbf888c3b7d-hosts-file\") pod \"node-resolver-mt4p7\" (UID: \"0d4eff69-9408-40ee-8a0b-0bbf888c3b7d\") " pod="openshift-dns/node-resolver-mt4p7" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455044 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-os-release\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455574 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-lib-modules\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ktfh\" (UniqueName: \"kubernetes.io/projected/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-kube-api-access-9ktfh\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-etc-selinux\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455610 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvqm7\" (UniqueName: \"kubernetes.io/projected/89492d29-88c3-44e3-adc2-eda0304a1081-kube-api-access-wvqm7\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455625 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-system-cni-dir\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455632 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e32072c-26b6-4466-b63a-3602df3f45f5-cni-binary-copy\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455641 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-host\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455670 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/cf736813-f1cd-49b9-8595-27549c6fa9cb-host\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf736813-f1cd-49b9-8595-27549c6fa9cb-tmp\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455715 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-os-release\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455731 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-var-lib-cni-multus\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-var-lib-cni-multus\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-etc-selinux\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455797 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-socket-dir\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455510 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-host-var-lib-kubelet\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.456629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455906 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a364d155-96d2-457b-8944-52e7438d87e8-host-slash\") pod \"iptables-alerter-t2pv6\" (UID: \"a364d155-96d2-457b-8944-52e7438d87e8\") " pod="openshift-network-operator/iptables-alerter-t2pv6" Apr 20 20:05:47.457203 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.455958 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-multus-cni-dir\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.457203 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.456012 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-system-cni-dir\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.457203 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.456059 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e32072c-26b6-4466-b63a-3602df3f45f5-os-release\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.457203 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.456110 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-os-release\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.457203 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.456781 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.457203 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.457002 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vqc62\" (UID: \"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.458704 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.458680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf736813-f1cd-49b9-8595-27549c6fa9cb-tmp\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.459736 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.459715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cf736813-f1cd-49b9-8595-27549c6fa9cb-etc-tuned\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.463760 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:47.463741 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:47.463843 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:47.463764 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:47.463843 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:47.463776 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5dkzn for pod openshift-network-diagnostics/network-check-target-j74qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:47.463843 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:47.463837 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn podName:cfcf938f-bded-4945-9620-473df1d1b448 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:47.963814366 +0000 UTC m=+3.072870051 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5dkzn" (UniqueName: "kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn") pod "network-check-target-j74qz" (UID: "cfcf938f-bded-4945-9620-473df1d1b448") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:47.465964 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.465941 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvfzn\" (UniqueName: \"kubernetes.io/projected/a364d155-96d2-457b-8944-52e7438d87e8-kube-api-access-kvfzn\") pod \"iptables-alerter-t2pv6\" (UID: \"a364d155-96d2-457b-8944-52e7438d87e8\") " pod="openshift-network-operator/iptables-alerter-t2pv6" Apr 20 20:05:47.466336 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.466277 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p55b\" (UniqueName: \"kubernetes.io/projected/801646f7-e8a7-4096-8154-5d74d8e4778d-kube-api-access-5p55b\") pod \"node-ca-lwzcq\" (UID: \"801646f7-e8a7-4096-8154-5d74d8e4778d\") " pod="openshift-image-registry/node-ca-lwzcq" Apr 20 20:05:47.466795 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.466764 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sw5p\" (UniqueName: \"kubernetes.io/projected/cb68a606-a80c-4153-89a1-5d9b5bcc8c7e-kube-api-access-8sw5p\") pod \"aws-ebs-csi-driver-node-krcbl\" (UID: \"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.467077 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.467051 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ktfh\" (UniqueName: \"kubernetes.io/projected/0389bcb6-aeb1-4436-a30f-8c26b9b175a1-kube-api-access-9ktfh\") pod \"multus-additional-cni-plugins-vqc62\" (UID: 
\"0389bcb6-aeb1-4436-a30f-8c26b9b175a1\") " pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.467547 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.467520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvqm7\" (UniqueName: \"kubernetes.io/projected/89492d29-88c3-44e3-adc2-eda0304a1081-kube-api-access-wvqm7\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:47.467958 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.467925 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l956t\" (UniqueName: \"kubernetes.io/projected/cf736813-f1cd-49b9-8595-27549c6fa9cb-kube-api-access-l956t\") pod \"tuned-9dtpd\" (UID: \"cf736813-f1cd-49b9-8595-27549c6fa9cb\") " pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.468580 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.468517 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tf7m\" (UniqueName: \"kubernetes.io/projected/0d4eff69-9408-40ee-8a0b-0bbf888c3b7d-kube-api-access-7tf7m\") pod \"node-resolver-mt4p7\" (UID: \"0d4eff69-9408-40ee-8a0b-0bbf888c3b7d\") " pod="openshift-dns/node-resolver-mt4p7" Apr 20 20:05:47.468859 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.468835 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxb58\" (UniqueName: \"kubernetes.io/projected/7e32072c-26b6-4466-b63a-3602df3f45f5-kube-api-access-cxb58\") pod \"multus-kgzdk\" (UID: \"7e32072c-26b6-4466-b63a-3602df3f45f5\") " pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.468953 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.468914 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:00:46 +0000 UTC" deadline="2027-12-12 21:48:01.567016171 
+0000 UTC" Apr 20 20:05:47.469001 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.468953 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14425h42m14.098072593s" Apr 20 20:05:47.556761 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556731 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.556761 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556766 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3e016fd4-9b61-4ff9-bcfa-b119516354d0-agent-certs\") pod \"konnectivity-agent-2chhh\" (UID: \"3e016fd4-9b61-4ff9-bcfa-b119516354d0\") " pod="kube-system/konnectivity-agent-2chhh" Apr 20 20:05:47.556968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556783 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-ovnkube-script-lib\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.556968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556799 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-etc-openvswitch\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.556968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556821 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-node-log\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.556968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556855 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-var-lib-openvswitch\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.556968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556856 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.556968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556878 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-cni-netd\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.556968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556893 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-etc-openvswitch\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.556968 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:05:47.556919 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-cni-netd\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.556968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556928 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-run-netns\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.556968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556950 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-var-lib-openvswitch\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.556968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556962 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-run-netns\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.556968 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556962 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-run-ovn\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:05:47.556987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-node-log\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556999 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-run-ovn\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.556996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-ovnkube-config\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557046 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-ovn-node-metrics-cert\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557075 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-kubelet\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: 
I0420 20:05:47.557101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75llb\" (UniqueName: \"kubernetes.io/projected/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-kube-api-access-75llb\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-slash\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-kubelet\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-run-openvswitch\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557195 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-slash\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:05:47.557216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-log-socket\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-run-systemd\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557304 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3e016fd4-9b61-4ff9-bcfa-b119516354d0-konnectivity-ca\") pod \"konnectivity-agent-2chhh\" (UID: \"3e016fd4-9b61-4ff9-bcfa-b119516354d0\") " pod="kube-system/konnectivity-agent-2chhh" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-systemd-units\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557356 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-cni-bin\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:05:47.557371 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-log-socket\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557388 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-run-ovn-kubernetes\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.557757 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557428 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-env-overrides\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.558627 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557434 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-run-ovn-kubernetes\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.558627 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557435 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-ovnkube-script-lib\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.558627 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:05:47.557455 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-ovnkube-config\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.558627 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557498 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-host-cni-bin\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.558627 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557504 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-run-openvswitch\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.558627 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-run-systemd\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.558627 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557544 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-systemd-units\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.558627 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:05:47.557762 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-env-overrides\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.558627 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.557948 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3e016fd4-9b61-4ff9-bcfa-b119516354d0-konnectivity-ca\") pod \"konnectivity-agent-2chhh\" (UID: \"3e016fd4-9b61-4ff9-bcfa-b119516354d0\") " pod="kube-system/konnectivity-agent-2chhh" Apr 20 20:05:47.559454 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.559428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3e016fd4-9b61-4ff9-bcfa-b119516354d0-agent-certs\") pod \"konnectivity-agent-2chhh\" (UID: \"3e016fd4-9b61-4ff9-bcfa-b119516354d0\") " pod="kube-system/konnectivity-agent-2chhh" Apr 20 20:05:47.559937 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.559916 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-ovn-node-metrics-cert\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.565183 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.565165 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75llb\" (UniqueName: \"kubernetes.io/projected/5beb43cd-4e2d-436b-baba-c0a6aac03ff2-kube-api-access-75llb\") pod \"ovnkube-node-8fzv6\" (UID: \"5beb43cd-4e2d-436b-baba-c0a6aac03ff2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.640241 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.640166 
2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kgzdk" Apr 20 20:05:47.648501 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.648037 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lwzcq" Apr 20 20:05:47.655949 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.655928 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" Apr 20 20:05:47.662656 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.662638 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mt4p7" Apr 20 20:05:47.677228 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.677203 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vqc62" Apr 20 20:05:47.683842 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.683824 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" Apr 20 20:05:47.692375 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.692355 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t2pv6" Apr 20 20:05:47.698921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.698896 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:05:47.702517 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.702496 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-2chhh" Apr 20 20:05:47.961048 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:47.960967 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:47.961183 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:47.961107 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:47.961183 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:47.961165 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs podName:89492d29-88c3-44e3-adc2-eda0304a1081 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:48.961149843 +0000 UTC m=+4.070205528 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs") pod "network-metrics-daemon-kzfxf" (UID: "89492d29-88c3-44e3-adc2-eda0304a1081") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:48.061741 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.061712 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dkzn\" (UniqueName: \"kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn\") pod \"network-check-target-j74qz\" (UID: \"cfcf938f-bded-4945-9620-473df1d1b448\") " pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:05:48.061868 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:48.061822 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:48.061868 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:48.061838 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:48.061868 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:48.061847 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5dkzn for pod openshift-network-diagnostics/network-check-target-j74qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:48.061985 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:48.061892 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn podName:cfcf938f-bded-4945-9620-473df1d1b448 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:49.06187824 +0000 UTC m=+4.170933925 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5dkzn" (UniqueName: "kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn") pod "network-check-target-j74qz" (UID: "cfcf938f-bded-4945-9620-473df1d1b448") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:48.099168 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:48.099125 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d4eff69_9408_40ee_8a0b_0bbf888c3b7d.slice/crio-4c76dfb3b07417ad3854ec34c264a6c7dcbe6d42498344b0b661338d73b8c97a WatchSource:0}: Error finding container 4c76dfb3b07417ad3854ec34c264a6c7dcbe6d42498344b0b661338d73b8c97a: Status 404 returned error can't find the container with id 4c76dfb3b07417ad3854ec34c264a6c7dcbe6d42498344b0b661338d73b8c97a Apr 20 20:05:48.101855 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:48.101832 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5beb43cd_4e2d_436b_baba_c0a6aac03ff2.slice/crio-d24b2c6ec24a69fb4eb161691c4a41981781edaa0298fdfa240120f64a1266f0 WatchSource:0}: Error finding container d24b2c6ec24a69fb4eb161691c4a41981781edaa0298fdfa240120f64a1266f0: Status 404 returned error can't find the container with id d24b2c6ec24a69fb4eb161691c4a41981781edaa0298fdfa240120f64a1266f0 Apr 20 20:05:48.103368 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:48.103253 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e016fd4_9b61_4ff9_bcfa_b119516354d0.slice/crio-1db970adf6b59c9665379a8250d859d2cc286f10f2fe9533b4d58fa5a0c9a8ae WatchSource:0}: Error finding container 
1db970adf6b59c9665379a8250d859d2cc286f10f2fe9533b4d58fa5a0c9a8ae: Status 404 returned error can't find the container with id 1db970adf6b59c9665379a8250d859d2cc286f10f2fe9533b4d58fa5a0c9a8ae Apr 20 20:05:48.104774 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:48.104320 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb68a606_a80c_4153_89a1_5d9b5bcc8c7e.slice/crio-7ae2838fe1b1c570c156349da5d55bae04739fe1aee9f956a52b4304bbf39f7e WatchSource:0}: Error finding container 7ae2838fe1b1c570c156349da5d55bae04739fe1aee9f956a52b4304bbf39f7e: Status 404 returned error can't find the container with id 7ae2838fe1b1c570c156349da5d55bae04739fe1aee9f956a52b4304bbf39f7e Apr 20 20:05:48.105628 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:48.105610 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf736813_f1cd_49b9_8595_27549c6fa9cb.slice/crio-c54b9bd4f29ccd9bf639866d4a4bc7237c0d5a5bd079c9d4e8a98ffb10a79320 WatchSource:0}: Error finding container c54b9bd4f29ccd9bf639866d4a4bc7237c0d5a5bd079c9d4e8a98ffb10a79320: Status 404 returned error can't find the container with id c54b9bd4f29ccd9bf639866d4a4bc7237c0d5a5bd079c9d4e8a98ffb10a79320 Apr 20 20:05:48.105893 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:48.105870 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda364d155_96d2_457b_8944_52e7438d87e8.slice/crio-c76df5f8a8a9cc9230ec729c9d764d5f73bdbcbeec0f8b38a89ce3635f965586 WatchSource:0}: Error finding container c76df5f8a8a9cc9230ec729c9d764d5f73bdbcbeec0f8b38a89ce3635f965586: Status 404 returned error can't find the container with id c76df5f8a8a9cc9230ec729c9d764d5f73bdbcbeec0f8b38a89ce3635f965586 Apr 20 20:05:48.107726 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:48.107382 2574 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod801646f7_e8a7_4096_8154_5d74d8e4778d.slice/crio-a1a3f31b0d97e667cfef9c4f82e65eec58c818529cd7222d3e162966ce632267 WatchSource:0}: Error finding container a1a3f31b0d97e667cfef9c4f82e65eec58c818529cd7222d3e162966ce632267: Status 404 returned error can't find the container with id a1a3f31b0d97e667cfef9c4f82e65eec58c818529cd7222d3e162966ce632267 Apr 20 20:05:48.107913 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:48.107890 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e32072c_26b6_4466_b63a_3602df3f45f5.slice/crio-9ead6542a306137830b45baa6b60f13699c78f93ba469f0ea899bcc2e0d8c67e WatchSource:0}: Error finding container 9ead6542a306137830b45baa6b60f13699c78f93ba469f0ea899bcc2e0d8c67e: Status 404 returned error can't find the container with id 9ead6542a306137830b45baa6b60f13699c78f93ba469f0ea899bcc2e0d8c67e Apr 20 20:05:48.109880 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:05:48.109859 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0389bcb6_aeb1_4436_a30f_8c26b9b175a1.slice/crio-c0486314e2dde48a40f4baa428f66811ff12b0a4cd187fd9208ee2510c915e4f WatchSource:0}: Error finding container c0486314e2dde48a40f4baa428f66811ff12b0a4cd187fd9208ee2510c915e4f: Status 404 returned error can't find the container with id c0486314e2dde48a40f4baa428f66811ff12b0a4cd187fd9208ee2510c915e4f Apr 20 20:05:48.469488 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.469306 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:00:46 +0000 UTC" deadline="2027-11-10 07:10:10.750244271 +0000 UTC" Apr 20 20:05:48.469488 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.469484 2574 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13643h4m22.280762521s" Apr 20 20:05:48.485851 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.485820 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mt4p7" event={"ID":"0d4eff69-9408-40ee-8a0b-0bbf888c3b7d","Type":"ContainerStarted","Data":"4c76dfb3b07417ad3854ec34c264a6c7dcbe6d42498344b0b661338d73b8c97a"} Apr 20 20:05:48.486984 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.486957 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqc62" event={"ID":"0389bcb6-aeb1-4436-a30f-8c26b9b175a1","Type":"ContainerStarted","Data":"c0486314e2dde48a40f4baa428f66811ff12b0a4cd187fd9208ee2510c915e4f"} Apr 20 20:05:48.488000 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.487974 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t2pv6" event={"ID":"a364d155-96d2-457b-8944-52e7438d87e8","Type":"ContainerStarted","Data":"c76df5f8a8a9cc9230ec729c9d764d5f73bdbcbeec0f8b38a89ce3635f965586"} Apr 20 20:05:48.489384 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.489353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" event={"ID":"cf736813-f1cd-49b9-8595-27549c6fa9cb","Type":"ContainerStarted","Data":"c54b9bd4f29ccd9bf639866d4a4bc7237c0d5a5bd079c9d4e8a98ffb10a79320"} Apr 20 20:05:48.490718 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.490695 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" event={"ID":"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e","Type":"ContainerStarted","Data":"7ae2838fe1b1c570c156349da5d55bae04739fe1aee9f956a52b4304bbf39f7e"} Apr 20 20:05:48.492119 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.492088 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" 
event={"ID":"5beb43cd-4e2d-436b-baba-c0a6aac03ff2","Type":"ContainerStarted","Data":"d24b2c6ec24a69fb4eb161691c4a41981781edaa0298fdfa240120f64a1266f0"} Apr 20 20:05:48.494446 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.494420 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-59.ec2.internal" event={"ID":"ba01067d8d4b51ceee17bbe157f40932","Type":"ContainerStarted","Data":"6690f398a75075c5de8da32084e704e9c56febf9b4dd343ccf053155a71191ca"} Apr 20 20:05:48.495857 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.495831 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kgzdk" event={"ID":"7e32072c-26b6-4466-b63a-3602df3f45f5","Type":"ContainerStarted","Data":"9ead6542a306137830b45baa6b60f13699c78f93ba469f0ea899bcc2e0d8c67e"} Apr 20 20:05:48.497205 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.497181 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lwzcq" event={"ID":"801646f7-e8a7-4096-8154-5d74d8e4778d","Type":"ContainerStarted","Data":"a1a3f31b0d97e667cfef9c4f82e65eec58c818529cd7222d3e162966ce632267"} Apr 20 20:05:48.503278 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.500394 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2chhh" event={"ID":"3e016fd4-9b61-4ff9-bcfa-b119516354d0","Type":"ContainerStarted","Data":"1db970adf6b59c9665379a8250d859d2cc286f10f2fe9533b4d58fa5a0c9a8ae"} Apr 20 20:05:48.971209 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:48.971177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:48.971544 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:48.971335 2574 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:48.971544 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:48.971396 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs podName:89492d29-88c3-44e3-adc2-eda0304a1081 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:50.971377454 +0000 UTC m=+6.080433145 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs") pod "network-metrics-daemon-kzfxf" (UID: "89492d29-88c3-44e3-adc2-eda0304a1081") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:49.072080 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:49.071991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dkzn\" (UniqueName: \"kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn\") pod \"network-check-target-j74qz\" (UID: \"cfcf938f-bded-4945-9620-473df1d1b448\") " pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:05:49.072219 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:49.072144 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:49.072219 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:49.072163 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:49.072219 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:49.072175 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5dkzn for pod 
openshift-network-diagnostics/network-check-target-j74qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:49.072405 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:49.072232 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn podName:cfcf938f-bded-4945-9620-473df1d1b448 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:51.072212809 +0000 UTC m=+6.181268510 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5dkzn" (UniqueName: "kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn") pod "network-check-target-j74qz" (UID: "cfcf938f-bded-4945-9620-473df1d1b448") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:49.481873 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:49.481110 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:05:49.481873 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:49.481232 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448" Apr 20 20:05:49.481873 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:49.481659 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:49.481873 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:49.481763 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081" Apr 20 20:05:49.514389 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:49.514357 2574 generic.go:358] "Generic (PLEG): container finished" podID="d9c0efa17a7cb126cc03dc4516e395e4" containerID="52b1684210cc0d61f5fcd4e70c17f4dab21863d84ea423e963184389f1234e3c" exitCode=0 Apr 20 20:05:49.514837 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:49.514819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" event={"ID":"d9c0efa17a7cb126cc03dc4516e395e4","Type":"ContainerDied","Data":"52b1684210cc0d61f5fcd4e70c17f4dab21863d84ea423e963184389f1234e3c"} Apr 20 20:05:49.534200 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:49.534150 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-59.ec2.internal" podStartSLOduration=3.534133496 podStartE2EDuration="3.534133496s" podCreationTimestamp="2026-04-20 20:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:05:48.506605867 +0000 UTC m=+3.615661575" watchObservedRunningTime="2026-04-20 20:05:49.534133496 +0000 UTC m=+4.643189204" Apr 20 20:05:50.521240 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:50.521200 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" event={"ID":"d9c0efa17a7cb126cc03dc4516e395e4","Type":"ContainerStarted","Data":"27d437b9df74ef768d0b9aad9bdf977803793f3f092aacfa223495357ebea90f"} Apr 20 20:05:50.537778 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:50.537635 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-59.ec2.internal" podStartSLOduration=4.537617116 podStartE2EDuration="4.537617116s" podCreationTimestamp="2026-04-20 20:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:05:50.536232112 +0000 UTC m=+5.645287821" watchObservedRunningTime="2026-04-20 20:05:50.537617116 +0000 UTC m=+5.646672823" Apr 20 20:05:50.997889 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:50.997304 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:50.997889 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:50.997447 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:50.997889 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:50.997510 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs podName:89492d29-88c3-44e3-adc2-eda0304a1081 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:54.997490399 +0000 UTC m=+10.106546090 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs") pod "network-metrics-daemon-kzfxf" (UID: "89492d29-88c3-44e3-adc2-eda0304a1081") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:51.098051 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:51.098018 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dkzn\" (UniqueName: \"kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn\") pod \"network-check-target-j74qz\" (UID: \"cfcf938f-bded-4945-9620-473df1d1b448\") " pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:05:51.098220 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:51.098171 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:51.098220 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:51.098190 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:51.098220 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:51.098202 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5dkzn for pod openshift-network-diagnostics/network-check-target-j74qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:51.098408 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:51.098256 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn podName:cfcf938f-bded-4945-9620-473df1d1b448 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:55.098237445 +0000 UTC m=+10.207293151 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5dkzn" (UniqueName: "kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn") pod "network-check-target-j74qz" (UID: "cfcf938f-bded-4945-9620-473df1d1b448") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:51.480406 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:51.480324 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:05:51.480574 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:51.480455 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448" Apr 20 20:05:51.480574 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:51.480547 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:51.480687 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:51.480668 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081" Apr 20 20:05:53.478525 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:53.478489 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:05:53.479033 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:53.478616 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448" Apr 20 20:05:53.479033 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:53.479026 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:53.479164 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:53.479133 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081" Apr 20 20:05:55.031460 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:55.031416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:55.031873 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:55.031571 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:55.031873 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:55.031619 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs podName:89492d29-88c3-44e3-adc2-eda0304a1081 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:03.03160588 +0000 UTC m=+18.140661565 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs") pod "network-metrics-daemon-kzfxf" (UID: "89492d29-88c3-44e3-adc2-eda0304a1081") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:55.132169 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:55.132134 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dkzn\" (UniqueName: \"kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn\") pod \"network-check-target-j74qz\" (UID: \"cfcf938f-bded-4945-9620-473df1d1b448\") " pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:05:55.132347 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:55.132332 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:55.132423 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:55.132351 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:55.132423 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:55.132364 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5dkzn for pod openshift-network-diagnostics/network-check-target-j74qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:55.132517 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:55.132424 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn podName:cfcf938f-bded-4945-9620-473df1d1b448 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:03.132404899 +0000 UTC m=+18.241460604 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5dkzn" (UniqueName: "kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn") pod "network-check-target-j74qz" (UID: "cfcf938f-bded-4945-9620-473df1d1b448") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:55.485006 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:55.484977 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:05:55.485170 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:55.484950 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:55.485379 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:55.485350 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448" Apr 20 20:05:55.485458 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:55.485366 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081" Apr 20 20:05:57.479152 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:57.479099 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:05:57.479600 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:57.479238 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448" Apr 20 20:05:57.479600 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:57.479318 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:57.479600 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:57.479431 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081" Apr 20 20:05:59.478630 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:59.478592 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:05:59.479077 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:05:59.478602 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:05:59.479077 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:59.478744 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081" Apr 20 20:05:59.479077 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:05:59.478798 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448" Apr 20 20:06:01.478501 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:01.478470 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:06:01.478951 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:01.478597 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081" Apr 20 20:06:01.478951 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:01.478479 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:06:01.479065 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:01.478969 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448" Apr 20 20:06:03.091649 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:03.091620 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:06:03.092063 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:03.091732 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:03.092063 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:03.091789 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs podName:89492d29-88c3-44e3-adc2-eda0304a1081 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:19.091773641 +0000 UTC m=+34.200829330 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs") pod "network-metrics-daemon-kzfxf" (UID: "89492d29-88c3-44e3-adc2-eda0304a1081") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:03.192175 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:03.192137 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dkzn\" (UniqueName: \"kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn\") pod \"network-check-target-j74qz\" (UID: \"cfcf938f-bded-4945-9620-473df1d1b448\") " pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:06:03.192336 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:03.192324 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:06:03.192390 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:03.192350 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:06:03.192390 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:03.192364 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5dkzn for pod openshift-network-diagnostics/network-check-target-j74qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:03.192491 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:03.192425 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn podName:cfcf938f-bded-4945-9620-473df1d1b448 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:19.192405044 +0000 UTC m=+34.301460741 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5dkzn" (UniqueName: "kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn") pod "network-check-target-j74qz" (UID: "cfcf938f-bded-4945-9620-473df1d1b448") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:03.478831 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:03.478791 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:06:03.478989 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:03.478853 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:06:03.478989 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:03.478916 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448" Apr 20 20:06:03.478989 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:03.478975 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081" Apr 20 20:06:05.480064 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:05.479634 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:06:05.480064 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:05.479757 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081" Apr 20 20:06:05.480064 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:05.479808 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:06:05.480064 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:05.479915 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448" Apr 20 20:06:06.552471 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.552181 2574 generic.go:358] "Generic (PLEG): container finished" podID="0389bcb6-aeb1-4436-a30f-8c26b9b175a1" containerID="033a72a01acd23713e3abd774c33f2383c4cc64a1d9a4046fca06c8414d69481" exitCode=0 Apr 20 20:06:06.552471 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.552285 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqc62" event={"ID":"0389bcb6-aeb1-4436-a30f-8c26b9b175a1","Type":"ContainerDied","Data":"033a72a01acd23713e3abd774c33f2383c4cc64a1d9a4046fca06c8414d69481"} Apr 20 20:06:06.554162 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.554137 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" event={"ID":"cf736813-f1cd-49b9-8595-27549c6fa9cb","Type":"ContainerStarted","Data":"61c99d92290adccf0a0f6b58161698b66208fff5ca7af6fefba77f92fd7f4691"} Apr 20 20:06:06.555726 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.555699 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" event={"ID":"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e","Type":"ContainerStarted","Data":"669d4bc2a0e4521467c0d77e630dff51379bc083b5f497c7bc5d7dfc284c96f1"} Apr 20 20:06:06.558475 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.558452 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:06:06.558822 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.558799 2574 generic.go:358] "Generic (PLEG): container finished" podID="5beb43cd-4e2d-436b-baba-c0a6aac03ff2" containerID="1adb66e333b14c8a29ad8650436df8560664bc50f15710347395c7327a8f0ad6" exitCode=1 Apr 20 20:06:06.558922 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:06:06.558879 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" event={"ID":"5beb43cd-4e2d-436b-baba-c0a6aac03ff2","Type":"ContainerStarted","Data":"2cf8e9a27e9b0fb3d3310a6636c8626d837c78fefd1f53a7ae3420fd08ecfe31"} Apr 20 20:06:06.558922 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.558905 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" event={"ID":"5beb43cd-4e2d-436b-baba-c0a6aac03ff2","Type":"ContainerStarted","Data":"80d80a6a816c0d71a64025ebda8b251029704b92a64bef29beb4a8879c6fb418"} Apr 20 20:06:06.558922 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.558918 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" event={"ID":"5beb43cd-4e2d-436b-baba-c0a6aac03ff2","Type":"ContainerStarted","Data":"b49e2e4d5ea6be03b908dbe0818e176d603e946a94b1d5849e6a6215e208a824"} Apr 20 20:06:06.559078 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.558931 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" event={"ID":"5beb43cd-4e2d-436b-baba-c0a6aac03ff2","Type":"ContainerStarted","Data":"e6e968307e7ba68a7a4fa96f00b6b30f5c3db2c8bd68134bc3d37d2d12cc2105"} Apr 20 20:06:06.559078 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.558947 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" event={"ID":"5beb43cd-4e2d-436b-baba-c0a6aac03ff2","Type":"ContainerDied","Data":"1adb66e333b14c8a29ad8650436df8560664bc50f15710347395c7327a8f0ad6"} Apr 20 20:06:06.559078 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.558962 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" event={"ID":"5beb43cd-4e2d-436b-baba-c0a6aac03ff2","Type":"ContainerStarted","Data":"e606882e3a538afc4ca7dcc5f9fd6fd87fd547679bd1a9025adc2f783ac9a441"} Apr 20 
20:06:06.560430 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.560413 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kgzdk" event={"ID":"7e32072c-26b6-4466-b63a-3602df3f45f5","Type":"ContainerStarted","Data":"8cb56cd6c4ec15be228fd60764a2d9f8a9fc3c7a2538be7d09d7a92d05a63b2a"}
Apr 20 20:06:06.561809 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.561787 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lwzcq" event={"ID":"801646f7-e8a7-4096-8154-5d74d8e4778d","Type":"ContainerStarted","Data":"fe388e11c718382e19c6c4d7a69899c9fecff562e990ac068b7d146d2452f69b"}
Apr 20 20:06:06.563176 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.563149 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2chhh" event={"ID":"3e016fd4-9b61-4ff9-bcfa-b119516354d0","Type":"ContainerStarted","Data":"77f9e07a132c0294ae890ac9997ab729bba5603350fc9a2e382173abfab72e88"}
Apr 20 20:06:06.564664 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.564639 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mt4p7" event={"ID":"0d4eff69-9408-40ee-8a0b-0bbf888c3b7d","Type":"ContainerStarted","Data":"aff95330bf3fce6d5d90f0c4a154949b5161ffa4980367d89b8bd0c8e5891004"}
Apr 20 20:06:06.591837 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.591803 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kgzdk" podStartSLOduration=4.237058375 podStartE2EDuration="21.59179296s" podCreationTimestamp="2026-04-20 20:05:45 +0000 UTC" firstStartedPulling="2026-04-20 20:05:48.111204595 +0000 UTC m=+3.220260295" lastFinishedPulling="2026-04-20 20:06:05.465939191 +0000 UTC m=+20.574994880" observedRunningTime="2026-04-20 20:06:06.59154539 +0000 UTC m=+21.700601097" watchObservedRunningTime="2026-04-20 20:06:06.59179296 +0000 UTC m=+21.700848667"
Apr 20 20:06:06.609363 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.609323 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9dtpd" podStartSLOduration=4.292634807 podStartE2EDuration="21.609312994s" podCreationTimestamp="2026-04-20 20:05:45 +0000 UTC" firstStartedPulling="2026-04-20 20:05:48.107610894 +0000 UTC m=+3.216666593" lastFinishedPulling="2026-04-20 20:06:05.424289082 +0000 UTC m=+20.533344780" observedRunningTime="2026-04-20 20:06:06.609058766 +0000 UTC m=+21.718114476" watchObservedRunningTime="2026-04-20 20:06:06.609312994 +0000 UTC m=+21.718368701"
Apr 20 20:06:06.622568 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.622538 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lwzcq" podStartSLOduration=4.309246326 podStartE2EDuration="21.622527682s" podCreationTimestamp="2026-04-20 20:05:45 +0000 UTC" firstStartedPulling="2026-04-20 20:05:48.110891485 +0000 UTC m=+3.219947187" lastFinishedPulling="2026-04-20 20:06:05.424172846 +0000 UTC m=+20.533228543" observedRunningTime="2026-04-20 20:06:06.622442618 +0000 UTC m=+21.731498324" watchObservedRunningTime="2026-04-20 20:06:06.622527682 +0000 UTC m=+21.731583379"
Apr 20 20:06:06.638208 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.638177 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mt4p7" podStartSLOduration=4.315312308 podStartE2EDuration="21.638169123s" podCreationTimestamp="2026-04-20 20:05:45 +0000 UTC" firstStartedPulling="2026-04-20 20:05:48.101058879 +0000 UTC m=+3.210114583" lastFinishedPulling="2026-04-20 20:06:05.423915697 +0000 UTC m=+20.532971398" observedRunningTime="2026-04-20 20:06:06.63795491 +0000 UTC m=+21.747010616" watchObservedRunningTime="2026-04-20 20:06:06.638169123 +0000 UTC m=+21.747224830"
Apr 20 20:06:06.657473 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.657447 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2chhh" podStartSLOduration=12.477730454 podStartE2EDuration="21.657437373s" podCreationTimestamp="2026-04-20 20:05:45 +0000 UTC" firstStartedPulling="2026-04-20 20:05:48.104969227 +0000 UTC m=+3.214024927" lastFinishedPulling="2026-04-20 20:05:57.284676159 +0000 UTC m=+12.393731846" observedRunningTime="2026-04-20 20:06:06.65705573 +0000 UTC m=+21.766111436" watchObservedRunningTime="2026-04-20 20:06:06.657437373 +0000 UTC m=+21.766493080"
Apr 20 20:06:06.791355 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:06.791256 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 20:06:07.413520 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:07.413419 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T20:06:06.791352208Z","UUID":"705affd3-be46-4501-878d-13152486532c","Handler":null,"Name":"","Endpoint":""}
Apr 20 20:06:07.416788 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:07.416753 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 20:06:07.416788 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:07.416784 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 20:06:07.478709 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:07.478650 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz"
Apr 20 20:06:07.478709 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:07.478713 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf"
Apr 20 20:06:07.478957 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:07.478806 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448"
Apr 20 20:06:07.478957 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:07.478925 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081"
Apr 20 20:06:07.568149 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:07.568112 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t2pv6" event={"ID":"a364d155-96d2-457b-8944-52e7438d87e8","Type":"ContainerStarted","Data":"b05c6ed5a6c9dcb776b179d91ba3c3127c2b0543944885821bcdb4db12431f61"}
Apr 20 20:06:07.570765 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:07.570598 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" event={"ID":"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e","Type":"ContainerStarted","Data":"f351bd12d4c85ed0d8e0097f82631b0fb14a57094b20ce77695feb6c6e798bd7"}
Apr 20 20:06:07.577959 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:07.577936 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2chhh"
Apr 20 20:06:07.578519 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:07.578500 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2chhh"
Apr 20 20:06:07.585502 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:07.585458 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-t2pv6" podStartSLOduration=5.270986027 podStartE2EDuration="22.585442295s" podCreationTimestamp="2026-04-20 20:05:45 +0000 UTC" firstStartedPulling="2026-04-20 20:05:48.109418858 +0000 UTC m=+3.218474543" lastFinishedPulling="2026-04-20 20:06:05.423875109 +0000 UTC m=+20.532930811" observedRunningTime="2026-04-20 20:06:07.584776607 +0000 UTC m=+22.693832313" watchObservedRunningTime="2026-04-20 20:06:07.585442295 +0000 UTC m=+22.694498004"
Apr 20 20:06:08.575470 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:08.574828 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" event={"ID":"cb68a606-a80c-4153-89a1-5d9b5bcc8c7e","Type":"ContainerStarted","Data":"03bcdb32c08f87a5a477e1f6c88cd39c2568c7b2172213eb4fe5717a11b534ff"}
Apr 20 20:06:08.577908 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:08.577888 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log"
Apr 20 20:06:08.578248 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:08.578221 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" event={"ID":"5beb43cd-4e2d-436b-baba-c0a6aac03ff2","Type":"ContainerStarted","Data":"b7848e406e0e8f50dbf5779a49c88a5a3bba331993f4f823ac335a9739f1bc8a"}
Apr 20 20:06:08.578997 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:08.578982 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2chhh"
Apr 20 20:06:08.579436 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:08.579420 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2chhh"
Apr 20 20:06:08.603905 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:08.603864 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krcbl" podStartSLOduration=3.719963484 podStartE2EDuration="23.603849449s" podCreationTimestamp="2026-04-20 20:05:45 +0000 UTC" firstStartedPulling="2026-04-20 20:05:48.10646429 +0000 UTC m=+3.215519978" lastFinishedPulling="2026-04-20 20:06:07.990350242 +0000 UTC m=+23.099405943" observedRunningTime="2026-04-20 20:06:08.603597366 +0000 UTC m=+23.712653072" watchObservedRunningTime="2026-04-20 20:06:08.603849449 +0000 UTC m=+23.712905155"
Apr 20 20:06:09.479125 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:09.479095 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz"
Apr 20 20:06:09.479356 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:09.479137 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf"
Apr 20 20:06:09.479356 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:09.479222 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448"
Apr 20 20:06:09.479439 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:09.479384 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081"
Apr 20 20:06:11.478792 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:11.478540 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf"
Apr 20 20:06:11.479408 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:11.478546 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz"
Apr 20 20:06:11.479408 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:11.478913 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081"
Apr 20 20:06:11.479408 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:11.478935 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448"
Apr 20 20:06:11.587097 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:11.587071 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log"
Apr 20 20:06:11.587399 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:11.587369 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" event={"ID":"5beb43cd-4e2d-436b-baba-c0a6aac03ff2","Type":"ContainerStarted","Data":"242eb74229d60ee6de6f42d0a07acf6c98262cd0bc5455c5130b8a451a29e603"}
Apr 20 20:06:11.587716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:11.587693 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:06:11.587829 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:11.587724 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:06:11.587891 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:11.587863 2574 scope.go:117] "RemoveContainer" containerID="1adb66e333b14c8a29ad8650436df8560664bc50f15710347395c7327a8f0ad6"
Apr 20 20:06:11.589070 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:11.589049 2574 generic.go:358] "Generic (PLEG): container finished" podID="0389bcb6-aeb1-4436-a30f-8c26b9b175a1" containerID="aac1882b62ac39c1d221de98d33a485cfa74af5921a3db8fa8a6a9e83fc38120" exitCode=0
Apr 20 20:06:11.589151 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:11.589086 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqc62" event={"ID":"0389bcb6-aeb1-4436-a30f-8c26b9b175a1","Type":"ContainerDied","Data":"aac1882b62ac39c1d221de98d33a485cfa74af5921a3db8fa8a6a9e83fc38120"}
Apr 20 20:06:11.604468 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:11.604413 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:06:12.592232 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:12.592203 2574 generic.go:358] "Generic (PLEG): container finished" podID="0389bcb6-aeb1-4436-a30f-8c26b9b175a1" containerID="5a7fd904a33aca25208a174980d17aeffedf89e5e6d7558bc24f17645abc0fc7" exitCode=0
Apr 20 20:06:12.592585 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:12.592293 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqc62" event={"ID":"0389bcb6-aeb1-4436-a30f-8c26b9b175a1","Type":"ContainerDied","Data":"5a7fd904a33aca25208a174980d17aeffedf89e5e6d7558bc24f17645abc0fc7"}
Apr 20 20:06:12.599974 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:12.597135 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log"
Apr 20 20:06:12.599974 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:12.597653 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" event={"ID":"5beb43cd-4e2d-436b-baba-c0a6aac03ff2","Type":"ContainerStarted","Data":"89d08052f90a4cb8c1ae67bb9e482753210ee821db753fcfb1c39df268e2113e"}
Apr 20 20:06:12.599974 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:12.598110 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:06:12.609211 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:12.609191 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-j74qz"]
Apr 20 20:06:12.609330 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:12.609313 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz"
Apr 20 20:06:12.609394 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:12.609381 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448"
Apr 20 20:06:12.611743 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:12.611722 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kzfxf"]
Apr 20 20:06:12.611876 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:12.611864 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf"
Apr 20 20:06:12.612004 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:12.611981 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081"
Apr 20 20:06:12.615625 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:12.615608 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6"
Apr 20 20:06:12.641562 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:12.641491 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" podStartSLOduration=10.240877977 podStartE2EDuration="27.641481033s" podCreationTimestamp="2026-04-20 20:05:45 +0000 UTC" firstStartedPulling="2026-04-20 20:05:48.103691876 +0000 UTC m=+3.212747564" lastFinishedPulling="2026-04-20 20:06:05.50429492 +0000 UTC m=+20.613350620" observedRunningTime="2026-04-20 20:06:12.640234091 +0000 UTC m=+27.749289800" watchObservedRunningTime="2026-04-20 20:06:12.641481033 +0000 UTC m=+27.750536740"
Apr 20 20:06:13.601736 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:13.601649 2574 generic.go:358] "Generic (PLEG): container finished" podID="0389bcb6-aeb1-4436-a30f-8c26b9b175a1" containerID="daec3d130ed960630f991f98f2e1d8954c84152de7eb1185fb9561be22dc304e" exitCode=0
Apr 20 20:06:13.602158 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:13.601742 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqc62" event={"ID":"0389bcb6-aeb1-4436-a30f-8c26b9b175a1","Type":"ContainerDied","Data":"daec3d130ed960630f991f98f2e1d8954c84152de7eb1185fb9561be22dc304e"}
Apr 20 20:06:14.478373 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:14.478338 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf"
Apr 20 20:06:14.478373 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:14.478381 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz"
Apr 20 20:06:14.478599 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:14.478488 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081"
Apr 20 20:06:14.478657 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:14.478613 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448"
Apr 20 20:06:16.479135 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:16.479100 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf"
Apr 20 20:06:16.479806 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:16.479100 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz"
Apr 20 20:06:16.479806 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:16.479243 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081"
Apr 20 20:06:16.479806 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:16.479325 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448"
Apr 20 20:06:18.478896 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.478868 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf"
Apr 20 20:06:18.479346 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.478866 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz"
Apr 20 20:06:18.479346 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:18.478989 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081"
Apr 20 20:06:18.479346 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:18.479057 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j74qz" podUID="cfcf938f-bded-4945-9620-473df1d1b448"
Apr 20 20:06:18.717874 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.717837 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-59.ec2.internal" event="NodeReady"
Apr 20 20:06:18.718044 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.717974 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 20:06:18.769613 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.769584 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jxlts"]
Apr 20 20:06:18.786452 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.786422 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-q92l2"]
Apr 20 20:06:18.786609 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.786587 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:18.788906 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.788878 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 20:06:18.788906 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.788897 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m22kk\""
Apr 20 20:06:18.789087 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.788939 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 20:06:18.808151 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.808128 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q92l2"]
Apr 20 20:06:18.808151 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.808154 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q92l2"
Apr 20 20:06:18.808343 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.808159 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jxlts"]
Apr 20 20:06:18.810457 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.810436 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 20:06:18.810457 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.810453 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 20:06:18.810634 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.810436 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 20:06:18.810634 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.810438 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wkgqm\""
Apr 20 20:06:18.914709 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.914673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cxm7\" (UniqueName: \"kubernetes.io/projected/a53459d8-2c1c-4399-801a-d69f56977702-kube-api-access-4cxm7\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:18.914882 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.914719 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2"
Apr 20 20:06:18.914882 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.914771 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdbrh\" (UniqueName: \"kubernetes.io/projected/d2fc277b-80e6-4be4-a366-c742f661aa43-kube-api-access-tdbrh\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2"
Apr 20 20:06:18.914882 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.914794 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:18.914882 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.914864 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a53459d8-2c1c-4399-801a-d69f56977702-config-volume\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:18.915056 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:18.914932 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a53459d8-2c1c-4399-801a-d69f56977702-tmp-dir\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:19.016081 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.016013 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdbrh\" (UniqueName: \"kubernetes.io/projected/d2fc277b-80e6-4be4-a366-c742f661aa43-kube-api-access-tdbrh\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2"
Apr 20 20:06:19.016081 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.016052 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:19.016281 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.016159 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:19.016281 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.016198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a53459d8-2c1c-4399-801a-d69f56977702-config-volume\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:19.016281 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.016208 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls podName:a53459d8-2c1c-4399-801a-d69f56977702 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:19.516192708 +0000 UTC m=+34.625248393 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls") pod "dns-default-jxlts" (UID: "a53459d8-2c1c-4399-801a-d69f56977702") : secret "dns-default-metrics-tls" not found
Apr 20 20:06:19.016281 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.016251 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a53459d8-2c1c-4399-801a-d69f56977702-tmp-dir\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:19.016468 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.016309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cxm7\" (UniqueName: \"kubernetes.io/projected/a53459d8-2c1c-4399-801a-d69f56977702-kube-api-access-4cxm7\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:19.016468 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.016327 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2"
Apr 20 20:06:19.016468 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.016405 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:19.016468 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.016465 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert podName:d2fc277b-80e6-4be4-a366-c742f661aa43 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:19.516448903 +0000 UTC m=+34.625504590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert") pod "ingress-canary-q92l2" (UID: "d2fc277b-80e6-4be4-a366-c742f661aa43") : secret "canary-serving-cert" not found
Apr 20 20:06:19.016641 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.016600 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a53459d8-2c1c-4399-801a-d69f56977702-tmp-dir\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:19.016802 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.016784 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a53459d8-2c1c-4399-801a-d69f56977702-config-volume\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:19.026802 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.026782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cxm7\" (UniqueName: \"kubernetes.io/projected/a53459d8-2c1c-4399-801a-d69f56977702-kube-api-access-4cxm7\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:19.026905 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.026844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdbrh\" (UniqueName: \"kubernetes.io/projected/d2fc277b-80e6-4be4-a366-c742f661aa43-kube-api-access-tdbrh\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2"
Apr 20 20:06:19.116934 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.116908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf"
Apr 20 20:06:19.117045 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.117029 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:19.117086 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.117082 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs podName:89492d29-88c3-44e3-adc2-eda0304a1081 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:51.117069398 +0000 UTC m=+66.226125082 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs") pod "network-metrics-daemon-kzfxf" (UID: "89492d29-88c3-44e3-adc2-eda0304a1081") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:19.217531 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.217506 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dkzn\" (UniqueName: \"kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn\") pod \"network-check-target-j74qz\" (UID: \"cfcf938f-bded-4945-9620-473df1d1b448\") " pod="openshift-network-diagnostics/network-check-target-j74qz"
Apr 20 20:06:19.217633 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.217624 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:19.217676 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.217636 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:19.217676 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.217645 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5dkzn for pod openshift-network-diagnostics/network-check-target-j74qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:19.217743 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.217694 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn podName:cfcf938f-bded-4945-9620-473df1d1b448 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:51.217682946 +0000 UTC m=+66.326738630 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5dkzn" (UniqueName: "kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn") pod "network-check-target-j74qz" (UID: "cfcf938f-bded-4945-9620-473df1d1b448") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:19.519699 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.519672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2"
Apr 20 20:06:19.520152 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.519745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:06:19.520152 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.519799 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:19.520152 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.519861 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert podName:d2fc277b-80e6-4be4-a366-c742f661aa43 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:20.519846594 +0000 UTC m=+35.628902283 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert") pod "ingress-canary-q92l2" (UID: "d2fc277b-80e6-4be4-a366-c742f661aa43") : secret "canary-serving-cert" not found
Apr 20 20:06:19.520152 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.519862 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:19.520152 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:19.519911 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls podName:a53459d8-2c1c-4399-801a-d69f56977702 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:20.519896159 +0000 UTC m=+35.628951863 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls") pod "dns-default-jxlts" (UID: "a53459d8-2c1c-4399-801a-d69f56977702") : secret "dns-default-metrics-tls" not found Apr 20 20:06:19.618398 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:19.618370 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqc62" event={"ID":"0389bcb6-aeb1-4436-a30f-8c26b9b175a1","Type":"ContainerStarted","Data":"3f5b3402d5d3995c452f3eae9f4689c4ac884828aa3d43fd034574a0811c44dc"} Apr 20 20:06:20.478405 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:20.478377 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:06:20.478559 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:20.478430 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:06:20.482072 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:20.482041 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gwc22\"" Apr 20 20:06:20.482072 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:20.482042 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:06:20.482231 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:20.482082 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:06:20.482231 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:20.482042 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:06:20.482231 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:20.482044 2574 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5w7vj\"" Apr 20 20:06:20.525674 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:20.525653 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts" Apr 20 20:06:20.525950 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:20.525701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2" Apr 20 20:06:20.525950 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:20.525781 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:20.525950 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:20.525789 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:20.525950 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:20.525828 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert podName:d2fc277b-80e6-4be4-a366-c742f661aa43 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:22.525814148 +0000 UTC m=+37.634869832 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert") pod "ingress-canary-q92l2" (UID: "d2fc277b-80e6-4be4-a366-c742f661aa43") : secret "canary-serving-cert" not found Apr 20 20:06:20.525950 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:20.525840 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls podName:a53459d8-2c1c-4399-801a-d69f56977702 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:22.525834698 +0000 UTC m=+37.634890382 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls") pod "dns-default-jxlts" (UID: "a53459d8-2c1c-4399-801a-d69f56977702") : secret "dns-default-metrics-tls" not found Apr 20 20:06:20.622618 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:20.622596 2574 generic.go:358] "Generic (PLEG): container finished" podID="0389bcb6-aeb1-4436-a30f-8c26b9b175a1" containerID="3f5b3402d5d3995c452f3eae9f4689c4ac884828aa3d43fd034574a0811c44dc" exitCode=0 Apr 20 20:06:20.622742 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:20.622641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqc62" event={"ID":"0389bcb6-aeb1-4436-a30f-8c26b9b175a1","Type":"ContainerDied","Data":"3f5b3402d5d3995c452f3eae9f4689c4ac884828aa3d43fd034574a0811c44dc"} Apr 20 20:06:21.626966 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:21.626726 2574 generic.go:358] "Generic (PLEG): container finished" podID="0389bcb6-aeb1-4436-a30f-8c26b9b175a1" containerID="18d8ed66875cef2ddc52f6897ce7b1924d6746a3f91a24da2d9280610a898772" exitCode=0 Apr 20 20:06:21.627349 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:21.626806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqc62" 
event={"ID":"0389bcb6-aeb1-4436-a30f-8c26b9b175a1","Type":"ContainerDied","Data":"18d8ed66875cef2ddc52f6897ce7b1924d6746a3f91a24da2d9280610a898772"} Apr 20 20:06:22.537195 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:22.537129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2" Apr 20 20:06:22.537195 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:22.537172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts" Apr 20 20:06:22.537382 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:22.537280 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:22.537382 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:22.537282 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:22.537382 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:22.537329 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls podName:a53459d8-2c1c-4399-801a-d69f56977702 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:26.537316247 +0000 UTC m=+41.646371931 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls") pod "dns-default-jxlts" (UID: "a53459d8-2c1c-4399-801a-d69f56977702") : secret "dns-default-metrics-tls" not found Apr 20 20:06:22.537382 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:22.537343 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert podName:d2fc277b-80e6-4be4-a366-c742f661aa43 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:26.537337461 +0000 UTC m=+41.646393145 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert") pod "ingress-canary-q92l2" (UID: "d2fc277b-80e6-4be4-a366-c742f661aa43") : secret "canary-serving-cert" not found Apr 20 20:06:22.632928 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:22.632894 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqc62" event={"ID":"0389bcb6-aeb1-4436-a30f-8c26b9b175a1","Type":"ContainerStarted","Data":"e57e8cba26bd4aafdc4f20d58bbdadd8258d074450f47907067397e1a484aa37"} Apr 20 20:06:22.657252 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:22.657209 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vqc62" podStartSLOduration=6.318405509 podStartE2EDuration="37.657197576s" podCreationTimestamp="2026-04-20 20:05:45 +0000 UTC" firstStartedPulling="2026-04-20 20:05:48.111948497 +0000 UTC m=+3.221004186" lastFinishedPulling="2026-04-20 20:06:19.450740568 +0000 UTC m=+34.559796253" observedRunningTime="2026-04-20 20:06:22.65593694 +0000 UTC m=+37.764992656" watchObservedRunningTime="2026-04-20 20:06:22.657197576 +0000 UTC m=+37.766253281" Apr 20 20:06:26.563700 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:26.563667 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2" Apr 20 20:06:26.564150 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:26.563721 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts" Apr 20 20:06:26.564150 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:26.563788 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:26.564150 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:26.563811 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:26.564150 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:26.563847 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert podName:d2fc277b-80e6-4be4-a366-c742f661aa43 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:34.563832389 +0000 UTC m=+49.672888078 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert") pod "ingress-canary-q92l2" (UID: "d2fc277b-80e6-4be4-a366-c742f661aa43") : secret "canary-serving-cert" not found Apr 20 20:06:26.564150 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:26.563862 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls podName:a53459d8-2c1c-4399-801a-d69f56977702 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:34.563855275 +0000 UTC m=+49.672910963 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls") pod "dns-default-jxlts" (UID: "a53459d8-2c1c-4399-801a-d69f56977702") : secret "dns-default-metrics-tls" not found Apr 20 20:06:34.617863 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:34.617827 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts" Apr 20 20:06:34.618328 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:34.617884 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2" Apr 20 20:06:34.618328 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:34.617959 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:34.618328 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:34.617961 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:34.618328 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:34.618011 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert podName:d2fc277b-80e6-4be4-a366-c742f661aa43 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:50.61799622 +0000 UTC m=+65.727051904 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert") pod "ingress-canary-q92l2" (UID: "d2fc277b-80e6-4be4-a366-c742f661aa43") : secret "canary-serving-cert" not found Apr 20 20:06:34.618328 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:34.618025 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls podName:a53459d8-2c1c-4399-801a-d69f56977702 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:50.618018189 +0000 UTC m=+65.727073873 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls") pod "dns-default-jxlts" (UID: "a53459d8-2c1c-4399-801a-d69f56977702") : secret "dns-default-metrics-tls" not found Apr 20 20:06:44.614863 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:44.614833 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fzv6" Apr 20 20:06:50.631607 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:50.631569 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2" Apr 20 20:06:50.631983 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:50.631625 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts" Apr 20 20:06:50.631983 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:50.631724 2574 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:50.631983 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:50.631785 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls podName:a53459d8-2c1c-4399-801a-d69f56977702 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:22.631768771 +0000 UTC m=+97.740824456 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls") pod "dns-default-jxlts" (UID: "a53459d8-2c1c-4399-801a-d69f56977702") : secret "dns-default-metrics-tls" not found Apr 20 20:06:50.631983 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:50.631722 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:50.631983 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:50.631853 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert podName:d2fc277b-80e6-4be4-a366-c742f661aa43 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:22.631842481 +0000 UTC m=+97.740898169 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert") pod "ingress-canary-q92l2" (UID: "d2fc277b-80e6-4be4-a366-c742f661aa43") : secret "canary-serving-cert" not found Apr 20 20:06:51.135225 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:51.135154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:06:51.138030 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:51.138002 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:06:51.145489 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:51.145462 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:06:51.145603 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:06:51.145539 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs podName:89492d29-88c3-44e3-adc2-eda0304a1081 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:55.145516671 +0000 UTC m=+130.254572356 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs") pod "network-metrics-daemon-kzfxf" (UID: "89492d29-88c3-44e3-adc2-eda0304a1081") : secret "metrics-daemon-secret" not found Apr 20 20:06:51.235861 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:51.235823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dkzn\" (UniqueName: \"kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn\") pod \"network-check-target-j74qz\" (UID: \"cfcf938f-bded-4945-9620-473df1d1b448\") " pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:06:51.238683 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:51.238663 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:06:51.248320 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:51.248294 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:06:51.260212 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:51.260189 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dkzn\" (UniqueName: \"kubernetes.io/projected/cfcf938f-bded-4945-9620-473df1d1b448-kube-api-access-5dkzn\") pod \"network-check-target-j74qz\" (UID: \"cfcf938f-bded-4945-9620-473df1d1b448\") " pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:06:51.395414 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:51.395336 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5w7vj\"" Apr 20 20:06:51.402328 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:51.402308 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:06:51.527926 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:51.527899 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-j74qz"] Apr 20 20:06:51.531251 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:06:51.531220 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfcf938f_bded_4945_9620_473df1d1b448.slice/crio-4fc957aa6fddac82ac472eb445b03c2377a02d1baaa86ee162f694112c2fc0cf WatchSource:0}: Error finding container 4fc957aa6fddac82ac472eb445b03c2377a02d1baaa86ee162f694112c2fc0cf: Status 404 returned error can't find the container with id 4fc957aa6fddac82ac472eb445b03c2377a02d1baaa86ee162f694112c2fc0cf Apr 20 20:06:51.686185 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:51.686142 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-j74qz" event={"ID":"cfcf938f-bded-4945-9620-473df1d1b448","Type":"ContainerStarted","Data":"4fc957aa6fddac82ac472eb445b03c2377a02d1baaa86ee162f694112c2fc0cf"} Apr 20 20:06:54.692693 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:54.692664 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-j74qz" event={"ID":"cfcf938f-bded-4945-9620-473df1d1b448","Type":"ContainerStarted","Data":"0568cdf6069b03beaf7f66c96d9465393b67cf87ac3e7fbb1d8a63a8db904aa9"} Apr 20 20:06:54.692991 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:54.692781 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:06:54.707590 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:06:54.707511 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-j74qz" 
podStartSLOduration=66.67147834 podStartE2EDuration="1m9.707492751s" podCreationTimestamp="2026-04-20 20:05:45 +0000 UTC" firstStartedPulling="2026-04-20 20:06:51.533009341 +0000 UTC m=+66.642065030" lastFinishedPulling="2026-04-20 20:06:54.569023753 +0000 UTC m=+69.678079441" observedRunningTime="2026-04-20 20:06:54.706853723 +0000 UTC m=+69.815909430" watchObservedRunningTime="2026-04-20 20:06:54.707492751 +0000 UTC m=+69.816548458" Apr 20 20:07:22.646336 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:22.646228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2" Apr 20 20:07:22.646336 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:22.646320 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts" Apr 20 20:07:22.646829 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:22.646386 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:07:22.646829 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:22.646450 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:07:22.646829 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:22.646454 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert podName:d2fc277b-80e6-4be4-a366-c742f661aa43 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:26.646438071 +0000 UTC m=+161.755493760 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert") pod "ingress-canary-q92l2" (UID: "d2fc277b-80e6-4be4-a366-c742f661aa43") : secret "canary-serving-cert" not found Apr 20 20:07:22.646829 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:22.646523 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls podName:a53459d8-2c1c-4399-801a-d69f56977702 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:26.646507576 +0000 UTC m=+161.755563262 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls") pod "dns-default-jxlts" (UID: "a53459d8-2c1c-4399-801a-d69f56977702") : secret "dns-default-metrics-tls" not found Apr 20 20:07:25.697541 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:25.697510 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-j74qz" Apr 20 20:07:40.507065 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.507027 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-7pjcj"] Apr 20 20:07:40.509446 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.509425 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.511924 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.511891 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 20:07:40.511924 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.511920 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 20:07:40.512103 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.511895 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-hfvmp\"" Apr 20 20:07:40.512103 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.511903 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 20:07:40.512798 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.512767 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 20:07:40.516581 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.516562 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 20:07:40.517465 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.517437 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-7pjcj"] Apr 20 20:07:40.563115 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.563090 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48d8dc00-2133-4e50-9e06-45cb14a568c8-service-ca-bundle\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " 
pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.563200 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.563130 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48d8dc00-2133-4e50-9e06-45cb14a568c8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.563200 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.563157 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48d8dc00-2133-4e50-9e06-45cb14a568c8-serving-cert\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.563286 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.563211 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/48d8dc00-2133-4e50-9e06-45cb14a568c8-snapshots\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.563286 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.563244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48d8dc00-2133-4e50-9e06-45cb14a568c8-tmp\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.563354 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.563283 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sjdb4\" (UniqueName: \"kubernetes.io/projected/48d8dc00-2133-4e50-9e06-45cb14a568c8-kube-api-access-sjdb4\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.609678 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.609655 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gv5n7"] Apr 20 20:07:40.611398 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.611309 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gv5n7" Apr 20 20:07:40.611984 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.611952 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd"] Apr 20 20:07:40.613721 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.613707 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-57f8d89fbb-xxxkj"] Apr 20 20:07:40.613852 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.613836 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:07:40.614398 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.614381 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-52qgs\"" Apr 20 20:07:40.614492 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.614380 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 20:07:40.614492 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.614414 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:07:40.615295 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.615278 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.616021 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.616005 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 20:07:40.616339 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.616318 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 20:07:40.616498 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.616465 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-9tpbw\"" Apr 20 20:07:40.617344 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.617313 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 20:07:40.617344 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.617341 2574 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 20:07:40.617487 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.617394 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 20:07:40.617487 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.617475 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 20:07:40.617591 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.617486 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 20:07:40.617591 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.617541 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 20:07:40.617591 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.617481 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-94f5q\"" Apr 20 20:07:40.617858 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.617844 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 20:07:40.618541 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.618524 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 20:07:40.622302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.622282 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gv5n7"] Apr 20 20:07:40.630580 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.630442 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd"] Apr 20 20:07:40.642054 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.642035 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-57f8d89fbb-xxxkj"] Apr 20 20:07:40.663631 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.663609 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjdb4\" (UniqueName: \"kubernetes.io/projected/48d8dc00-2133-4e50-9e06-45cb14a568c8-kube-api-access-sjdb4\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.663750 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.663640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prbrp\" (UniqueName: \"kubernetes.io/projected/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-kube-api-access-prbrp\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.663750 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.663673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48d8dc00-2133-4e50-9e06-45cb14a568c8-service-ca-bundle\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.663750 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.663730 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mdrr\" (UniqueName: \"kubernetes.io/projected/5e691ef5-adc1-46b2-bcdb-be0d299cad21-kube-api-access-8mdrr\") pod \"volume-data-source-validator-7c6cbb6c87-gv5n7\" (UID: 
\"5e691ef5-adc1-46b2-bcdb-be0d299cad21\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gv5n7" Apr 20 20:07:40.664116 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.663915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48d8dc00-2133-4e50-9e06-45cb14a568c8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.664116 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.663975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-default-certificate\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.664116 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.664029 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48d8dc00-2133-4e50-9e06-45cb14a568c8-serving-cert\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.664304 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.664139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.664304 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.664173 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/48d8dc00-2133-4e50-9e06-45cb14a568c8-snapshots\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.664304 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.664202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/05d00a46-2c87-449e-b9c2-6274c763b555-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:07:40.664304 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.664228 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bhrt\" (UniqueName: \"kubernetes.io/projected/05d00a46-2c87-449e-b9c2-6274c763b555-kube-api-access-4bhrt\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:07:40.664499 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.664302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-stats-auth\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.664499 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.664337 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:07:40.664499 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.664385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48d8dc00-2133-4e50-9e06-45cb14a568c8-tmp\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.664499 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.664391 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48d8dc00-2133-4e50-9e06-45cb14a568c8-service-ca-bundle\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.664499 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.664411 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.664787 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.664768 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48d8dc00-2133-4e50-9e06-45cb14a568c8-tmp\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.664852 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:07:40.664828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/48d8dc00-2133-4e50-9e06-45cb14a568c8-snapshots\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.664887 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.664861 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48d8dc00-2133-4e50-9e06-45cb14a568c8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.666483 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.666462 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48d8dc00-2133-4e50-9e06-45cb14a568c8-serving-cert\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.671069 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.671044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjdb4\" (UniqueName: \"kubernetes.io/projected/48d8dc00-2133-4e50-9e06-45cb14a568c8-kube-api-access-sjdb4\") pod \"insights-operator-585dfdc468-7pjcj\" (UID: \"48d8dc00-2133-4e50-9e06-45cb14a568c8\") " pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.764976 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.764907 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mdrr\" (UniqueName: \"kubernetes.io/projected/5e691ef5-adc1-46b2-bcdb-be0d299cad21-kube-api-access-8mdrr\") pod \"volume-data-source-validator-7c6cbb6c87-gv5n7\" (UID: 
\"5e691ef5-adc1-46b2-bcdb-be0d299cad21\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gv5n7" Apr 20 20:07:40.764976 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.764954 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-default-certificate\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.765158 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.764988 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.765158 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.765015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/05d00a46-2c87-449e-b9c2-6274c763b555-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:07:40.765158 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.765140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bhrt\" (UniqueName: \"kubernetes.io/projected/05d00a46-2c87-449e-b9c2-6274c763b555-kube-api-access-4bhrt\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:07:40.765346 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:40.765155 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle podName:e7689c3e-44d1-4487-8fd4-e0ab1160ca1d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:41.26513196 +0000 UTC m=+116.374187664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle") pod "router-default-57f8d89fbb-xxxkj" (UID: "e7689c3e-44d1-4487-8fd4-e0ab1160ca1d") : configmap references non-existent config key: service-ca.crt Apr 20 20:07:40.765346 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.765199 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-stats-auth\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.765346 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.765232 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:07:40.765346 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.765303 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.765346 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:07:40.765337 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prbrp\" (UniqueName: \"kubernetes.io/projected/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-kube-api-access-prbrp\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.765584 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:40.765397 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:40.765584 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:40.765405 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:07:40.765584 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:40.765470 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls podName:05d00a46-2c87-449e-b9c2-6274c763b555 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:41.265453663 +0000 UTC m=+116.374509349 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-w5vnd" (UID: "05d00a46-2c87-449e-b9c2-6274c763b555") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:40.765584 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:40.765491 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs podName:e7689c3e-44d1-4487-8fd4-e0ab1160ca1d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:41.265479478 +0000 UTC m=+116.374535163 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs") pod "router-default-57f8d89fbb-xxxkj" (UID: "e7689c3e-44d1-4487-8fd4-e0ab1160ca1d") : secret "router-metrics-certs-default" not found Apr 20 20:07:40.766316 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.766290 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/05d00a46-2c87-449e-b9c2-6274c763b555-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:07:40.767790 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.767764 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-default-certificate\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.767864 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.767767 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-stats-auth\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.774450 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.774409 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mdrr\" (UniqueName: \"kubernetes.io/projected/5e691ef5-adc1-46b2-bcdb-be0d299cad21-kube-api-access-8mdrr\") pod \"volume-data-source-validator-7c6cbb6c87-gv5n7\" (UID: \"5e691ef5-adc1-46b2-bcdb-be0d299cad21\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gv5n7" Apr 20 20:07:40.774626 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.774604 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bhrt\" (UniqueName: \"kubernetes.io/projected/05d00a46-2c87-449e-b9c2-6274c763b555-kube-api-access-4bhrt\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:07:40.774895 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.774873 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prbrp\" (UniqueName: \"kubernetes.io/projected/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-kube-api-access-prbrp\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:40.818772 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.818742 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-7pjcj" Apr 20 20:07:40.921699 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.921672 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gv5n7" Apr 20 20:07:40.932607 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:40.932584 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-7pjcj"] Apr 20 20:07:40.935772 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:07:40.935747 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48d8dc00_2133_4e50_9e06_45cb14a568c8.slice/crio-c5de59de218861c0aedfe0da6fba924927f6d0a486e3295f124c5051473ae443 WatchSource:0}: Error finding container c5de59de218861c0aedfe0da6fba924927f6d0a486e3295f124c5051473ae443: Status 404 returned error can't find the container with id c5de59de218861c0aedfe0da6fba924927f6d0a486e3295f124c5051473ae443 Apr 20 20:07:41.032542 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:41.032479 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gv5n7"] Apr 20 20:07:41.037352 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:07:41.037324 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e691ef5_adc1_46b2_bcdb_be0d299cad21.slice/crio-78840414ad3d60aec30d87dd62e0be9fb80d4d37c3cc901cc588755eb34361da WatchSource:0}: Error finding container 78840414ad3d60aec30d87dd62e0be9fb80d4d37c3cc901cc588755eb34361da: Status 404 returned error can't find the container with id 78840414ad3d60aec30d87dd62e0be9fb80d4d37c3cc901cc588755eb34361da Apr 20 20:07:41.271076 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:41.271045 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: 
\"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:41.271220 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:41.271085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:07:41.271220 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:41.271144 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:41.271328 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:41.271250 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:07:41.271328 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:41.271286 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle podName:e7689c3e-44d1-4487-8fd4-e0ab1160ca1d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:42.271252846 +0000 UTC m=+117.380308557 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle") pod "router-default-57f8d89fbb-xxxkj" (UID: "e7689c3e-44d1-4487-8fd4-e0ab1160ca1d") : configmap references non-existent config key: service-ca.crt Apr 20 20:07:41.271422 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:41.271334 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:41.271422 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:41.271382 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs podName:e7689c3e-44d1-4487-8fd4-e0ab1160ca1d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:42.27136609 +0000 UTC m=+117.380421776 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs") pod "router-default-57f8d89fbb-xxxkj" (UID: "e7689c3e-44d1-4487-8fd4-e0ab1160ca1d") : secret "router-metrics-certs-default" not found Apr 20 20:07:41.271422 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:41.271401 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls podName:05d00a46-2c87-449e-b9c2-6274c763b555 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:42.271394851 +0000 UTC m=+117.380450536 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-w5vnd" (UID: "05d00a46-2c87-449e-b9c2-6274c763b555") : secret "cluster-monitoring-operator-tls" not found
Apr 20 20:07:41.782407 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:41.782353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7pjcj" event={"ID":"48d8dc00-2133-4e50-9e06-45cb14a568c8","Type":"ContainerStarted","Data":"c5de59de218861c0aedfe0da6fba924927f6d0a486e3295f124c5051473ae443"}
Apr 20 20:07:41.783511 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:41.783369 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gv5n7" event={"ID":"5e691ef5-adc1-46b2-bcdb-be0d299cad21","Type":"ContainerStarted","Data":"78840414ad3d60aec30d87dd62e0be9fb80d4d37c3cc901cc588755eb34361da"}
Apr 20 20:07:42.281488 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:42.281444 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj"
Apr 20 20:07:42.281680 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:42.281508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd"
Apr 20 20:07:42.281680 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:42.281555 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj"
Apr 20 20:07:42.281680 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:42.281625 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle podName:e7689c3e-44d1-4487-8fd4-e0ab1160ca1d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:44.281601867 +0000 UTC m=+119.390657569 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle") pod "router-default-57f8d89fbb-xxxkj" (UID: "e7689c3e-44d1-4487-8fd4-e0ab1160ca1d") : configmap references non-existent config key: service-ca.crt
Apr 20 20:07:42.281680 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:42.281680 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 20:07:42.281832 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:42.281682 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 20:07:42.281832 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:42.281739 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs podName:e7689c3e-44d1-4487-8fd4-e0ab1160ca1d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:44.281723714 +0000 UTC m=+119.390779406 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs") pod "router-default-57f8d89fbb-xxxkj" (UID: "e7689c3e-44d1-4487-8fd4-e0ab1160ca1d") : secret "router-metrics-certs-default" not found
Apr 20 20:07:42.281832 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:42.281753 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls podName:05d00a46-2c87-449e-b9c2-6274c763b555 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:44.281746727 +0000 UTC m=+119.390802413 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-w5vnd" (UID: "05d00a46-2c87-449e-b9c2-6274c763b555") : secret "cluster-monitoring-operator-tls" not found
Apr 20 20:07:43.789394 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:43.789354 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7pjcj" event={"ID":"48d8dc00-2133-4e50-9e06-45cb14a568c8","Type":"ContainerStarted","Data":"397f882571ad01c6ed3fdd5d22c4d554e43494a7b5a07092e76c98b62cfc0030"}
Apr 20 20:07:43.790618 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:43.790594 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gv5n7" event={"ID":"5e691ef5-adc1-46b2-bcdb-be0d299cad21","Type":"ContainerStarted","Data":"c6cdf48d8afafc87f8f9cb7528f29d996a4f99d379b496055a0e5823cd35e7a2"}
Apr 20 20:07:43.805384 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:43.805339 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-7pjcj" podStartSLOduration=1.893629358 podStartE2EDuration="3.805324054s" podCreationTimestamp="2026-04-20 20:07:40 +0000 UTC" firstStartedPulling="2026-04-20 20:07:40.937467737 +0000 UTC m=+116.046523423" lastFinishedPulling="2026-04-20 20:07:42.849162435 +0000 UTC m=+117.958218119" observedRunningTime="2026-04-20 20:07:43.804587395 +0000 UTC m=+118.913643101" watchObservedRunningTime="2026-04-20 20:07:43.805324054 +0000 UTC m=+118.914379766"
Apr 20 20:07:43.817988 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:43.817948 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gv5n7" podStartSLOduration=2.011320002 podStartE2EDuration="3.81793386s" podCreationTimestamp="2026-04-20 20:07:40 +0000 UTC" firstStartedPulling="2026-04-20 20:07:41.039119961 +0000 UTC m=+116.148175647" lastFinishedPulling="2026-04-20 20:07:42.84573382 +0000 UTC m=+117.954789505" observedRunningTime="2026-04-20 20:07:43.817726878 +0000 UTC m=+118.926782596" watchObservedRunningTime="2026-04-20 20:07:43.81793386 +0000 UTC m=+118.926989568"
Apr 20 20:07:44.296331 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:44.296295 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd"
Apr 20 20:07:44.296517 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:44.296339 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj"
Apr 20 20:07:44.296517 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:44.296399 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj"
Apr 20 20:07:44.296517 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:44.296434 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 20:07:44.296517 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:44.296494 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls podName:05d00a46-2c87-449e-b9c2-6274c763b555 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:48.29647899 +0000 UTC m=+123.405534675 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-w5vnd" (UID: "05d00a46-2c87-449e-b9c2-6274c763b555") : secret "cluster-monitoring-operator-tls" not found
Apr 20 20:07:44.296517 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:44.296508 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle podName:e7689c3e-44d1-4487-8fd4-e0ab1160ca1d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:48.296502264 +0000 UTC m=+123.405557950 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle") pod "router-default-57f8d89fbb-xxxkj" (UID: "e7689c3e-44d1-4487-8fd4-e0ab1160ca1d") : configmap references non-existent config key: service-ca.crt
Apr 20 20:07:44.296517 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:44.296508 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 20:07:44.296845 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:44.296545 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs podName:e7689c3e-44d1-4487-8fd4-e0ab1160ca1d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:48.296533204 +0000 UTC m=+123.405588889 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs") pod "router-default-57f8d89fbb-xxxkj" (UID: "e7689c3e-44d1-4487-8fd4-e0ab1160ca1d") : secret "router-metrics-certs-default" not found
Apr 20 20:07:45.454678 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:45.454641 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2"]
Apr 20 20:07:45.456487 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:45.456472 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2"
Apr 20 20:07:45.458872 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:45.458851 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 20 20:07:45.458988 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:45.458970 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 20 20:07:45.459864 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:45.459847 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:07:45.459922 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:45.459880 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-l257q\""
Apr 20 20:07:45.472098 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:45.472078 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2"]
Apr 20 20:07:45.606421 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:45.606394 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qrqt2\" (UID: \"d221fc13-b016-4303-a194-cc93d82ca26f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2"
Apr 20 20:07:45.606549 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:45.606502 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c74ml\" (UniqueName: \"kubernetes.io/projected/d221fc13-b016-4303-a194-cc93d82ca26f-kube-api-access-c74ml\") pod \"cluster-samples-operator-6dc5bdb6b4-qrqt2\" (UID: \"d221fc13-b016-4303-a194-cc93d82ca26f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2"
Apr 20 20:07:45.707739 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:45.707677 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qrqt2\" (UID: \"d221fc13-b016-4303-a194-cc93d82ca26f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2"
Apr 20 20:07:45.707739 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:45.707729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c74ml\" (UniqueName: \"kubernetes.io/projected/d221fc13-b016-4303-a194-cc93d82ca26f-kube-api-access-c74ml\") pod \"cluster-samples-operator-6dc5bdb6b4-qrqt2\" (UID: \"d221fc13-b016-4303-a194-cc93d82ca26f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2"
Apr 20 20:07:45.707862 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:45.707813 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 20:07:45.707902 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:45.707878 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls podName:d221fc13-b016-4303-a194-cc93d82ca26f nodeName:}" failed. No retries permitted until 2026-04-20 20:07:46.207864005 +0000 UTC m=+121.316919690 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qrqt2" (UID: "d221fc13-b016-4303-a194-cc93d82ca26f") : secret "samples-operator-tls" not found
Apr 20 20:07:45.718471 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:45.718444 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c74ml\" (UniqueName: \"kubernetes.io/projected/d221fc13-b016-4303-a194-cc93d82ca26f-kube-api-access-c74ml\") pod \"cluster-samples-operator-6dc5bdb6b4-qrqt2\" (UID: \"d221fc13-b016-4303-a194-cc93d82ca26f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2"
Apr 20 20:07:46.211301 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.211252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qrqt2\" (UID: \"d221fc13-b016-4303-a194-cc93d82ca26f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2"
Apr 20 20:07:46.211532 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:46.211401 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 20:07:46.211532 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:46.211463 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls podName:d221fc13-b016-4303-a194-cc93d82ca26f nodeName:}" failed. No retries permitted until 2026-04-20 20:07:47.211447487 +0000 UTC m=+122.320503173 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qrqt2" (UID: "d221fc13-b016-4303-a194-cc93d82ca26f") : secret "samples-operator-tls" not found
Apr 20 20:07:46.373680 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.373652 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mt4p7_0d4eff69-9408-40ee-8a0b-0bbf888c3b7d/dns-node-resolver/0.log"
Apr 20 20:07:46.440430 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.440396 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-blpbt"]
Apr 20 20:07:46.442185 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.442170 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.444527 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.444506 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 20 20:07:46.444527 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.444519 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:07:46.444666 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.444565 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-4bmz5\""
Apr 20 20:07:46.445497 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.445477 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 20 20:07:46.445590 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.445500 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 20 20:07:46.449496 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.449479 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 20 20:07:46.452934 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.452916 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-blpbt"]
Apr 20 20:07:46.513055 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.512993 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bf2bb25-bfba-4f17-b4fc-7607da4bb789-serving-cert\") pod \"console-operator-9d4b6777b-blpbt\" (UID: \"5bf2bb25-bfba-4f17-b4fc-7607da4bb789\") " pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.513350 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.513066 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf2bb25-bfba-4f17-b4fc-7607da4bb789-config\") pod \"console-operator-9d4b6777b-blpbt\" (UID: \"5bf2bb25-bfba-4f17-b4fc-7607da4bb789\") " pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.513350 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.513094 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psrbk\" (UniqueName: \"kubernetes.io/projected/5bf2bb25-bfba-4f17-b4fc-7607da4bb789-kube-api-access-psrbk\") pod \"console-operator-9d4b6777b-blpbt\" (UID: \"5bf2bb25-bfba-4f17-b4fc-7607da4bb789\") " pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.513350 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.513129 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5bf2bb25-bfba-4f17-b4fc-7607da4bb789-trusted-ca\") pod \"console-operator-9d4b6777b-blpbt\" (UID: \"5bf2bb25-bfba-4f17-b4fc-7607da4bb789\") " pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.614045 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.614020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bf2bb25-bfba-4f17-b4fc-7607da4bb789-serving-cert\") pod \"console-operator-9d4b6777b-blpbt\" (UID: \"5bf2bb25-bfba-4f17-b4fc-7607da4bb789\") " pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.614128 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.614109 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf2bb25-bfba-4f17-b4fc-7607da4bb789-config\") pod \"console-operator-9d4b6777b-blpbt\" (UID: \"5bf2bb25-bfba-4f17-b4fc-7607da4bb789\") " pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.614164 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.614129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psrbk\" (UniqueName: \"kubernetes.io/projected/5bf2bb25-bfba-4f17-b4fc-7607da4bb789-kube-api-access-psrbk\") pod \"console-operator-9d4b6777b-blpbt\" (UID: \"5bf2bb25-bfba-4f17-b4fc-7607da4bb789\") " pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.614198 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.614160 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5bf2bb25-bfba-4f17-b4fc-7607da4bb789-trusted-ca\") pod \"console-operator-9d4b6777b-blpbt\" (UID: \"5bf2bb25-bfba-4f17-b4fc-7607da4bb789\") " pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.614834 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.614811 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf2bb25-bfba-4f17-b4fc-7607da4bb789-config\") pod \"console-operator-9d4b6777b-blpbt\" (UID: \"5bf2bb25-bfba-4f17-b4fc-7607da4bb789\") " pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.614952 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.614934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5bf2bb25-bfba-4f17-b4fc-7607da4bb789-trusted-ca\") pod \"console-operator-9d4b6777b-blpbt\" (UID: \"5bf2bb25-bfba-4f17-b4fc-7607da4bb789\") " pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.616446 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.616427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bf2bb25-bfba-4f17-b4fc-7607da4bb789-serving-cert\") pod \"console-operator-9d4b6777b-blpbt\" (UID: \"5bf2bb25-bfba-4f17-b4fc-7607da4bb789\") " pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.622539 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.622520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psrbk\" (UniqueName: \"kubernetes.io/projected/5bf2bb25-bfba-4f17-b4fc-7607da4bb789-kube-api-access-psrbk\") pod \"console-operator-9d4b6777b-blpbt\" (UID: \"5bf2bb25-bfba-4f17-b4fc-7607da4bb789\") " pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.751395 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.751355 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt"
Apr 20 20:07:46.865938 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:46.865901 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-blpbt"]
Apr 20 20:07:46.869317 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:07:46.869277 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bf2bb25_bfba_4f17_b4fc_7607da4bb789.slice/crio-5b4a8e9b3314433992e11b2ccf8da13ab70df9bcdb4545e5d05ec4e3d97085d0 WatchSource:0}: Error finding container 5b4a8e9b3314433992e11b2ccf8da13ab70df9bcdb4545e5d05ec4e3d97085d0: Status 404 returned error can't find the container with id 5b4a8e9b3314433992e11b2ccf8da13ab70df9bcdb4545e5d05ec4e3d97085d0
Apr 20 20:07:47.174194 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:47.174172 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lwzcq_801646f7-e8a7-4096-8154-5d74d8e4778d/node-ca/0.log"
Apr 20 20:07:47.219067 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:47.219042 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qrqt2\" (UID: \"d221fc13-b016-4303-a194-cc93d82ca26f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2"
Apr 20 20:07:47.219169 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:47.219161 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 20:07:47.219218 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:47.219211 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls podName:d221fc13-b016-4303-a194-cc93d82ca26f nodeName:}" failed. No retries permitted until 2026-04-20 20:07:49.219198154 +0000 UTC m=+124.328253839 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qrqt2" (UID: "d221fc13-b016-4303-a194-cc93d82ca26f") : secret "samples-operator-tls" not found
Apr 20 20:07:47.802614 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:47.802569 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt" event={"ID":"5bf2bb25-bfba-4f17-b4fc-7607da4bb789","Type":"ContainerStarted","Data":"5b4a8e9b3314433992e11b2ccf8da13ab70df9bcdb4545e5d05ec4e3d97085d0"}
Apr 20 20:07:48.327376 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:48.327344 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj"
Apr 20 20:07:48.327562 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:48.327426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj"
Apr 20 20:07:48.327562 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:48.327468 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd"
Apr 20 20:07:48.327562 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:48.327497 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 20:07:48.327562 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:48.327556 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle podName:e7689c3e-44d1-4487-8fd4-e0ab1160ca1d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:56.32753775 +0000 UTC m=+131.436593454 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle") pod "router-default-57f8d89fbb-xxxkj" (UID: "e7689c3e-44d1-4487-8fd4-e0ab1160ca1d") : configmap references non-existent config key: service-ca.crt
Apr 20 20:07:48.327744 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:48.327570 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 20:07:48.327744 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:48.327598 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs podName:e7689c3e-44d1-4487-8fd4-e0ab1160ca1d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:56.327579647 +0000 UTC m=+131.436635347 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs") pod "router-default-57f8d89fbb-xxxkj" (UID: "e7689c3e-44d1-4487-8fd4-e0ab1160ca1d") : secret "router-metrics-certs-default" not found
Apr 20 20:07:48.327744 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:48.327618 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls podName:05d00a46-2c87-449e-b9c2-6274c763b555 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:56.327608651 +0000 UTC m=+131.436664340 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-w5vnd" (UID: "05d00a46-2c87-449e-b9c2-6274c763b555") : secret "cluster-monitoring-operator-tls" not found
Apr 20 20:07:48.806027 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:48.806000 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/0.log"
Apr 20 20:07:48.806393 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:48.806040 2574 generic.go:358] "Generic (PLEG): container finished" podID="5bf2bb25-bfba-4f17-b4fc-7607da4bb789" containerID="f1c05e89d730608b420f9f38d8c3a20dfd9cda33eda6e661d93bc6b1d4ee1b42" exitCode=255
Apr 20 20:07:48.806393 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:48.806097 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt" event={"ID":"5bf2bb25-bfba-4f17-b4fc-7607da4bb789","Type":"ContainerDied","Data":"f1c05e89d730608b420f9f38d8c3a20dfd9cda33eda6e661d93bc6b1d4ee1b42"}
Apr 20 20:07:48.806393 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:48.806313 2574 scope.go:117] "RemoveContainer" containerID="f1c05e89d730608b420f9f38d8c3a20dfd9cda33eda6e661d93bc6b1d4ee1b42"
Apr 20 20:07:49.233992 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:49.233951 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qrqt2\" (UID: \"d221fc13-b016-4303-a194-cc93d82ca26f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2"
Apr 20 20:07:49.234195 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:49.234136 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 20:07:49.234288 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:49.234221 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls podName:d221fc13-b016-4303-a194-cc93d82ca26f nodeName:}" failed. No retries permitted until 2026-04-20 20:07:53.234199316 +0000 UTC m=+128.343255020 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qrqt2" (UID: "d221fc13-b016-4303-a194-cc93d82ca26f") : secret "samples-operator-tls" not found
Apr 20 20:07:49.809988 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:49.809958 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log"
Apr 20 20:07:49.810420 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:49.810362 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/0.log"
Apr 20 20:07:49.810420 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:49.810397 2574 generic.go:358] "Generic (PLEG): container finished" podID="5bf2bb25-bfba-4f17-b4fc-7607da4bb789" containerID="69ded6c55d22cf5fa9604e59d82d11b537fb74acd850580a5987775a35f97f95" exitCode=255
Apr 20 20:07:49.810523 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:49.810459 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt" event={"ID":"5bf2bb25-bfba-4f17-b4fc-7607da4bb789","Type":"ContainerDied","Data":"69ded6c55d22cf5fa9604e59d82d11b537fb74acd850580a5987775a35f97f95"}
Apr 20 20:07:49.810523 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:49.810512 2574 scope.go:117] "RemoveContainer" containerID="f1c05e89d730608b420f9f38d8c3a20dfd9cda33eda6e661d93bc6b1d4ee1b42"
Apr 20 20:07:49.810743 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:49.810726 2574 scope.go:117] "RemoveContainer" containerID="69ded6c55d22cf5fa9604e59d82d11b537fb74acd850580a5987775a35f97f95"
Apr 20 20:07:49.810935 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:49.810913 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-blpbt_openshift-console-operator(5bf2bb25-bfba-4f17-b4fc-7607da4bb789)\"" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt" podUID="5bf2bb25-bfba-4f17-b4fc-7607da4bb789"
Apr 20 20:07:50.563104 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.563069 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq"]
Apr 20 20:07:50.565141 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.565122 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq"
Apr 20 20:07:50.568126 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.568094 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:07:50.568328 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.568304 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 20 20:07:50.568445 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.568429 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 20 20:07:50.568590 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.568564 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-jgbsq\""
Apr 20 20:07:50.571792 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.568719 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 20 20:07:50.573719 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.573696 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq"]
Apr 20 20:07:50.646155 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.646133 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6cd082-894a-48c8-a261-a5354bd183fe-config\") pod \"service-ca-operator-d6fc45fc5-vrmgq\" (UID: \"fe6cd082-894a-48c8-a261-a5354bd183fe\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq"
Apr 20 20:07:50.646305 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.646176 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe6cd082-894a-48c8-a261-a5354bd183fe-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vrmgq\" (UID: \"fe6cd082-894a-48c8-a261-a5354bd183fe\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq"
Apr 20 20:07:50.646391 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.646318 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spv6r\" (UniqueName: \"kubernetes.io/projected/fe6cd082-894a-48c8-a261-a5354bd183fe-kube-api-access-spv6r\") pod \"service-ca-operator-d6fc45fc5-vrmgq\" (UID: \"fe6cd082-894a-48c8-a261-a5354bd183fe\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq"
Apr 20 20:07:50.747447 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.747414 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6cd082-894a-48c8-a261-a5354bd183fe-config\") pod \"service-ca-operator-d6fc45fc5-vrmgq\" (UID: \"fe6cd082-894a-48c8-a261-a5354bd183fe\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq"
Apr 20 20:07:50.747564 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.747464
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe6cd082-894a-48c8-a261-a5354bd183fe-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vrmgq\" (UID: \"fe6cd082-894a-48c8-a261-a5354bd183fe\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq" Apr 20 20:07:50.747564 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.747507 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spv6r\" (UniqueName: \"kubernetes.io/projected/fe6cd082-894a-48c8-a261-a5354bd183fe-kube-api-access-spv6r\") pod \"service-ca-operator-d6fc45fc5-vrmgq\" (UID: \"fe6cd082-894a-48c8-a261-a5354bd183fe\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq" Apr 20 20:07:50.748544 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.748523 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6cd082-894a-48c8-a261-a5354bd183fe-config\") pod \"service-ca-operator-d6fc45fc5-vrmgq\" (UID: \"fe6cd082-894a-48c8-a261-a5354bd183fe\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq" Apr 20 20:07:50.749836 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.749817 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe6cd082-894a-48c8-a261-a5354bd183fe-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vrmgq\" (UID: \"fe6cd082-894a-48c8-a261-a5354bd183fe\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq" Apr 20 20:07:50.760224 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.760197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spv6r\" (UniqueName: \"kubernetes.io/projected/fe6cd082-894a-48c8-a261-a5354bd183fe-kube-api-access-spv6r\") pod \"service-ca-operator-d6fc45fc5-vrmgq\" (UID: 
\"fe6cd082-894a-48c8-a261-a5354bd183fe\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq" Apr 20 20:07:50.814227 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.814175 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:07:50.814531 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.814494 2574 scope.go:117] "RemoveContainer" containerID="69ded6c55d22cf5fa9604e59d82d11b537fb74acd850580a5987775a35f97f95" Apr 20 20:07:50.814662 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:50.814647 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-blpbt_openshift-console-operator(5bf2bb25-bfba-4f17-b4fc-7607da4bb789)\"" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt" podUID="5bf2bb25-bfba-4f17-b4fc-7607da4bb789" Apr 20 20:07:50.877637 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.877614 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq" Apr 20 20:07:50.995298 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:50.995257 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq"] Apr 20 20:07:50.998669 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:07:50.998640 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe6cd082_894a_48c8_a261_a5354bd183fe.slice/crio-ca55bc7dde6769aa1f8704bdc9e27cca405461fabce8ffd38cbbbfa22cbddb6e WatchSource:0}: Error finding container ca55bc7dde6769aa1f8704bdc9e27cca405461fabce8ffd38cbbbfa22cbddb6e: Status 404 returned error can't find the container with id ca55bc7dde6769aa1f8704bdc9e27cca405461fabce8ffd38cbbbfa22cbddb6e Apr 20 20:07:51.817223 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:51.817184 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq" event={"ID":"fe6cd082-894a-48c8-a261-a5354bd183fe","Type":"ContainerStarted","Data":"ca55bc7dde6769aa1f8704bdc9e27cca405461fabce8ffd38cbbbfa22cbddb6e"} Apr 20 20:07:52.309353 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.309317 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6b5c68b54b-76d47"] Apr 20 20:07:52.312072 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.312035 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.314641 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.314620 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 20:07:52.314784 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.314642 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 20:07:52.314784 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.314711 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dhv99\"" Apr 20 20:07:52.314784 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.314716 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 20:07:52.319401 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.319377 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 20:07:52.324350 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.324329 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b5c68b54b-76d47"] Apr 20 20:07:52.463137 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.463099 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.463355 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.463158 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-bound-sa-token\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.463435 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.463354 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-installation-pull-secrets\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.463435 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.463409 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-ca-trust-extracted\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.463539 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.463449 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-certificates\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.463539 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.463516 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-image-registry-private-configuration\") pod 
\"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.463643 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.463550 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lwkz\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-kube-api-access-2lwkz\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.463643 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.463576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-trusted-ca\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.564763 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.564714 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-installation-pull-secrets\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.564763 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.564754 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-ca-trust-extracted\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.564915 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:07:52.564772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-certificates\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.564915 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.564796 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-image-registry-private-configuration\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.564915 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.564897 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lwkz\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-kube-api-access-2lwkz\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.565053 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.564935 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-trusted-ca\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.565053 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.564966 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls\") pod 
\"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.565146 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.565046 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-bound-sa-token\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.565146 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:52.565068 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:07:52.565146 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:52.565086 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b5c68b54b-76d47: secret "image-registry-tls" not found Apr 20 20:07:52.565146 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.565102 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-ca-trust-extracted\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.565357 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:52.565172 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls podName:6f31ebc7-2994-45d9-9434-7da5efc7d6bc nodeName:}" failed. No retries permitted until 2026-04-20 20:07:53.065150301 +0000 UTC m=+128.174205987 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls") pod "image-registry-6b5c68b54b-76d47" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc") : secret "image-registry-tls" not found Apr 20 20:07:52.565417 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.565366 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-certificates\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.565999 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.565977 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-trusted-ca\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.567152 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.567131 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-image-registry-private-configuration\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.567318 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.567241 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-installation-pull-secrets\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " 
pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.574211 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.574192 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-bound-sa-token\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.574621 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.574603 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lwkz\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-kube-api-access-2lwkz\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:52.821360 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.821277 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq" event={"ID":"fe6cd082-894a-48c8-a261-a5354bd183fe","Type":"ContainerStarted","Data":"172ab259bb810cef25062413c45c1896083ec2e75dd8b4457c4e9b7effa20678"} Apr 20 20:07:52.839434 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:52.839391 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq" podStartSLOduration=1.297512626 podStartE2EDuration="2.839375564s" podCreationTimestamp="2026-04-20 20:07:50 +0000 UTC" firstStartedPulling="2026-04-20 20:07:51.000576287 +0000 UTC m=+126.109631982" lastFinishedPulling="2026-04-20 20:07:52.542439233 +0000 UTC m=+127.651494920" observedRunningTime="2026-04-20 20:07:52.83876192 +0000 UTC m=+127.947817627" watchObservedRunningTime="2026-04-20 20:07:52.839375564 +0000 UTC m=+127.948431269" Apr 20 20:07:53.069935 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:07:53.069889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:53.070128 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:53.070036 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:07:53.070128 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:53.070058 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b5c68b54b-76d47: secret "image-registry-tls" not found Apr 20 20:07:53.070251 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:53.070134 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls podName:6f31ebc7-2994-45d9-9434-7da5efc7d6bc nodeName:}" failed. No retries permitted until 2026-04-20 20:07:54.07010252 +0000 UTC m=+129.179158204 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls") pod "image-registry-6b5c68b54b-76d47" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc") : secret "image-registry-tls" not found Apr 20 20:07:53.271997 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:53.271955 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qrqt2\" (UID: \"d221fc13-b016-4303-a194-cc93d82ca26f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2" Apr 20 20:07:53.272178 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:53.272103 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:07:53.272178 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:53.272168 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls podName:d221fc13-b016-4303-a194-cc93d82ca26f nodeName:}" failed. No retries permitted until 2026-04-20 20:08:01.272150915 +0000 UTC m=+136.381206601 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qrqt2" (UID: "d221fc13-b016-4303-a194-cc93d82ca26f") : secret "samples-operator-tls" not found Apr 20 20:07:54.079251 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:54.079215 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:54.079663 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:54.079346 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:07:54.079663 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:54.079366 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b5c68b54b-76d47: secret "image-registry-tls" not found Apr 20 20:07:54.079663 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:54.079416 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls podName:6f31ebc7-2994-45d9-9434-7da5efc7d6bc nodeName:}" failed. No retries permitted until 2026-04-20 20:07:56.079401331 +0000 UTC m=+131.188457016 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls") pod "image-registry-6b5c68b54b-76d47" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc") : secret "image-registry-tls" not found Apr 20 20:07:55.188444 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:55.188405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:07:55.188815 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:55.188554 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:07:55.188815 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:55.188618 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs podName:89492d29-88c3-44e3-adc2-eda0304a1081 nodeName:}" failed. No retries permitted until 2026-04-20 20:09:57.188602173 +0000 UTC m=+252.297657858 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs") pod "network-metrics-daemon-kzfxf" (UID: "89492d29-88c3-44e3-adc2-eda0304a1081") : secret "metrics-daemon-secret" not found Apr 20 20:07:56.093569 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:56.093534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:07:56.093731 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:56.093676 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:07:56.093731 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:56.093695 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b5c68b54b-76d47: secret "image-registry-tls" not found Apr 20 20:07:56.093801 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:56.093748 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls podName:6f31ebc7-2994-45d9-9434-7da5efc7d6bc nodeName:}" failed. No retries permitted until 2026-04-20 20:08:00.093732498 +0000 UTC m=+135.202788188 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls") pod "image-registry-6b5c68b54b-76d47" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc") : secret "image-registry-tls" not found Apr 20 20:07:56.396103 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:56.396024 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:07:56.396448 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:56.396102 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:56.396448 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:56.396140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:07:56.396448 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:56.396191 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:56.396448 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:56.396279 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls podName:05d00a46-2c87-449e-b9c2-6274c763b555 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:12.396239273 +0000 UTC m=+147.505294958 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-w5vnd" (UID: "05d00a46-2c87-449e-b9c2-6274c763b555") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:56.396448 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:56.396281 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:07:56.396448 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:56.396297 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle podName:e7689c3e-44d1-4487-8fd4-e0ab1160ca1d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:12.39628895 +0000 UTC m=+147.505344635 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle") pod "router-default-57f8d89fbb-xxxkj" (UID: "e7689c3e-44d1-4487-8fd4-e0ab1160ca1d") : configmap references non-existent config key: service-ca.crt Apr 20 20:07:56.396448 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:56.396320 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs podName:e7689c3e-44d1-4487-8fd4-e0ab1160ca1d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:12.39630849 +0000 UTC m=+147.505364176 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs") pod "router-default-57f8d89fbb-xxxkj" (UID: "e7689c3e-44d1-4487-8fd4-e0ab1160ca1d") : secret "router-metrics-certs-default" not found Apr 20 20:07:56.751674 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:56.751639 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt" Apr 20 20:07:56.751674 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:56.751674 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt" Apr 20 20:07:56.752124 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:07:56.752101 2574 scope.go:117] "RemoveContainer" containerID="69ded6c55d22cf5fa9604e59d82d11b537fb74acd850580a5987775a35f97f95" Apr 20 20:07:56.752317 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:07:56.752299 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-blpbt_openshift-console-operator(5bf2bb25-bfba-4f17-b4fc-7607da4bb789)\"" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt" podUID="5bf2bb25-bfba-4f17-b4fc-7607da4bb789" Apr 20 20:08:00.122613 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:00.122578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:08:00.122950 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:00.122710 2574 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:08:00.122950 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:00.122725 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b5c68b54b-76d47: secret "image-registry-tls" not found Apr 20 20:08:00.122950 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:00.122788 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls podName:6f31ebc7-2994-45d9-9434-7da5efc7d6bc nodeName:}" failed. No retries permitted until 2026-04-20 20:08:08.122772606 +0000 UTC m=+143.231828294 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls") pod "image-registry-6b5c68b54b-76d47" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc") : secret "image-registry-tls" not found Apr 20 20:08:01.333030 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:01.332994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qrqt2\" (UID: \"d221fc13-b016-4303-a194-cc93d82ca26f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2" Apr 20 20:08:01.335477 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:01.335445 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d221fc13-b016-4303-a194-cc93d82ca26f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qrqt2\" (UID: \"d221fc13-b016-4303-a194-cc93d82ca26f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2" Apr 20 20:08:01.364102 ip-10-0-139-59 kubenswrapper[2574]: 
I0420 20:08:01.364079 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2" Apr 20 20:08:01.476244 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:01.476214 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2"] Apr 20 20:08:01.846095 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:01.846058 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2" event={"ID":"d221fc13-b016-4303-a194-cc93d82ca26f","Type":"ContainerStarted","Data":"710ec4b5f1891b139a2f60d05754c7189bc9e104ab3688f78d934103fb27db9b"} Apr 20 20:08:03.852220 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:03.852188 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2" event={"ID":"d221fc13-b016-4303-a194-cc93d82ca26f","Type":"ContainerStarted","Data":"96f8085c4302edf1c5e72ba69390459cd5a90b57855e610f5d1e964c90c7f5af"} Apr 20 20:08:03.852220 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:03.852220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2" event={"ID":"d221fc13-b016-4303-a194-cc93d82ca26f","Type":"ContainerStarted","Data":"29d0bdd030898e783869f0693608698e84f3479cdd988bbc886c2af7f1d18f6a"} Apr 20 20:08:03.868836 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:03.868790 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qrqt2" podStartSLOduration=17.418224073 podStartE2EDuration="18.868776983s" podCreationTimestamp="2026-04-20 20:07:45 +0000 UTC" firstStartedPulling="2026-04-20 20:08:01.516761745 +0000 UTC m=+136.625817432" lastFinishedPulling="2026-04-20 
20:08:02.967314656 +0000 UTC m=+138.076370342" observedRunningTime="2026-04-20 20:08:03.867607126 +0000 UTC m=+138.976662834" watchObservedRunningTime="2026-04-20 20:08:03.868776983 +0000 UTC m=+138.977832691" Apr 20 20:08:08.190889 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.190853 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:08:08.193272 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.193230 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls\") pod \"image-registry-6b5c68b54b-76d47\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:08:08.224159 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.224119 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:08:08.341108 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.341082 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b5c68b54b-76d47"] Apr 20 20:08:08.344460 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:08.344433 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f31ebc7_2994_45d9_9434_7da5efc7d6bc.slice/crio-5a3d3dc6d28f7de879642a3e06056e4cf1aff1919040aabb6cf2f085b7aebfe1 WatchSource:0}: Error finding container 5a3d3dc6d28f7de879642a3e06056e4cf1aff1919040aabb6cf2f085b7aebfe1: Status 404 returned error can't find the container with id 5a3d3dc6d28f7de879642a3e06056e4cf1aff1919040aabb6cf2f085b7aebfe1 Apr 20 20:08:08.478893 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.478824 2574 scope.go:117] "RemoveContainer" containerID="69ded6c55d22cf5fa9604e59d82d11b537fb74acd850580a5987775a35f97f95" Apr 20 20:08:08.866437 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.866365 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:08:08.866578 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.866467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt" event={"ID":"5bf2bb25-bfba-4f17-b4fc-7607da4bb789","Type":"ContainerStarted","Data":"3b367ca1e635af2bb74bce2dba99c85bb61dce27112365ed461bd83223ef8a25"} Apr 20 20:08:08.866827 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.866808 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt" Apr 20 20:08:08.868112 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.868084 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" event={"ID":"6f31ebc7-2994-45d9-9434-7da5efc7d6bc","Type":"ContainerStarted","Data":"a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68"} Apr 20 20:08:08.868220 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.868119 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" event={"ID":"6f31ebc7-2994-45d9-9434-7da5efc7d6bc","Type":"ContainerStarted","Data":"5a3d3dc6d28f7de879642a3e06056e4cf1aff1919040aabb6cf2f085b7aebfe1"} Apr 20 20:08:08.868333 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.868315 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:08:08.887120 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.887081 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt" podStartSLOduration=21.260936887 podStartE2EDuration="22.8870695s" podCreationTimestamp="2026-04-20 20:07:46 +0000 UTC" firstStartedPulling="2026-04-20 20:07:46.871219065 +0000 UTC m=+121.980274754" lastFinishedPulling="2026-04-20 20:07:48.497351679 +0000 UTC m=+123.606407367" observedRunningTime="2026-04-20 20:08:08.886373871 +0000 UTC m=+143.995429579" watchObservedRunningTime="2026-04-20 20:08:08.8870695 +0000 UTC m=+143.996125204" Apr 20 20:08:08.905460 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:08.905426 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" podStartSLOduration=16.905416687 podStartE2EDuration="16.905416687s" podCreationTimestamp="2026-04-20 20:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:08.905383259 +0000 UTC m=+144.014438978" 
watchObservedRunningTime="2026-04-20 20:08:08.905416687 +0000 UTC m=+144.014472393" Apr 20 20:08:09.399518 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:09.399479 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-blpbt" Apr 20 20:08:12.422978 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.422944 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:08:12.423439 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.423002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:08:12.423439 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.423157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:08:12.423564 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.423548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-service-ca-bundle\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " 
pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:08:12.425585 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.425563 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7689c3e-44d1-4487-8fd4-e0ab1160ca1d-metrics-certs\") pod \"router-default-57f8d89fbb-xxxkj\" (UID: \"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d\") " pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:08:12.426129 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.426108 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d00a46-2c87-449e-b9c2-6274c763b555-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w5vnd\" (UID: \"05d00a46-2c87-449e-b9c2-6274c763b555\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:08:12.432109 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.432091 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-9tpbw\"" Apr 20 20:08:12.438292 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.438276 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-94f5q\"" Apr 20 20:08:12.439990 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.439977 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" Apr 20 20:08:12.446643 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.446615 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:08:12.565247 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.565221 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd"] Apr 20 20:08:12.568097 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:12.568058 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05d00a46_2c87_449e_b9c2_6274c763b555.slice/crio-5b3302f964cb83688ea9f80e65aa5a696accedde757c41b24cdcbbab06218d33 WatchSource:0}: Error finding container 5b3302f964cb83688ea9f80e65aa5a696accedde757c41b24cdcbbab06218d33: Status 404 returned error can't find the container with id 5b3302f964cb83688ea9f80e65aa5a696accedde757c41b24cdcbbab06218d33 Apr 20 20:08:12.588432 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.588412 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-57f8d89fbb-xxxkj"] Apr 20 20:08:12.591304 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:12.591237 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7689c3e_44d1_4487_8fd4_e0ab1160ca1d.slice/crio-7bef3c7a91eab3c219bf7bfc9698ef5678ae8cbd1c2e6cfb29f2cb899d711c29 WatchSource:0}: Error finding container 7bef3c7a91eab3c219bf7bfc9698ef5678ae8cbd1c2e6cfb29f2cb899d711c29: Status 404 returned error can't find the container with id 7bef3c7a91eab3c219bf7bfc9698ef5678ae8cbd1c2e6cfb29f2cb899d711c29 Apr 20 20:08:12.881219 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.881135 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" event={"ID":"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d","Type":"ContainerStarted","Data":"6edab4d0e3b12e34df25d8c5ddb3dbb84b74aee897b6b32a7d924c6187a09000"} Apr 20 20:08:12.881219 ip-10-0-139-59 kubenswrapper[2574]: 
I0420 20:08:12.881171 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" event={"ID":"e7689c3e-44d1-4487-8fd4-e0ab1160ca1d","Type":"ContainerStarted","Data":"7bef3c7a91eab3c219bf7bfc9698ef5678ae8cbd1c2e6cfb29f2cb899d711c29"} Apr 20 20:08:12.882178 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.882147 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" event={"ID":"05d00a46-2c87-449e-b9c2-6274c763b555","Type":"ContainerStarted","Data":"5b3302f964cb83688ea9f80e65aa5a696accedde757c41b24cdcbbab06218d33"} Apr 20 20:08:12.899944 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:12.899896 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" podStartSLOduration=32.899881869 podStartE2EDuration="32.899881869s" podCreationTimestamp="2026-04-20 20:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:12.898970759 +0000 UTC m=+148.008026466" watchObservedRunningTime="2026-04-20 20:08:12.899881869 +0000 UTC m=+148.008937575" Apr 20 20:08:13.447251 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:13.447219 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:08:13.450446 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:13.450423 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:08:13.884977 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:13.884891 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:08:13.886129 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:13.886106 2574 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-57f8d89fbb-xxxkj" Apr 20 20:08:14.888100 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:14.888054 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" event={"ID":"05d00a46-2c87-449e-b9c2-6274c763b555","Type":"ContainerStarted","Data":"d4808385412fc7ddfee2372decbfd1996b4513fa94142d2047a90e7966efa49f"} Apr 20 20:08:14.905145 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:14.905089 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w5vnd" podStartSLOduration=32.777711179 podStartE2EDuration="34.905074079s" podCreationTimestamp="2026-04-20 20:07:40 +0000 UTC" firstStartedPulling="2026-04-20 20:08:12.569996284 +0000 UTC m=+147.679051973" lastFinishedPulling="2026-04-20 20:08:14.697359185 +0000 UTC m=+149.806414873" observedRunningTime="2026-04-20 20:08:14.904189526 +0000 UTC m=+150.013245235" watchObservedRunningTime="2026-04-20 20:08:14.905074079 +0000 UTC m=+150.014129787" Apr 20 20:08:20.944014 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:20.943989 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-q8x64"] Apr 20 20:08:20.947019 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:20.947000 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-q8x64" Apr 20 20:08:20.952250 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:20.952224 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-sv52n\"" Apr 20 20:08:20.952551 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:20.952532 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 20:08:20.952652 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:20.952533 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 20:08:20.966003 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:20.965980 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-q8x64"] Apr 20 20:08:20.983870 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:20.983844 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b5c68b54b-76d47"] Apr 20 20:08:20.985439 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:20.985410 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkjbm\" (UniqueName: \"kubernetes.io/projected/bba62270-eaf0-456e-8beb-0c7e16c16c44-kube-api-access-vkjbm\") pod \"downloads-6bcc868b7-q8x64\" (UID: \"bba62270-eaf0-456e-8beb-0c7e16c16c44\") " pod="openshift-console/downloads-6bcc868b7-q8x64" Apr 20 20:08:21.029859 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.029831 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"] Apr 20 20:08:21.032829 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.032815 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" Apr 20 20:08:21.052984 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.052959 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"] Apr 20 20:08:21.054072 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.054051 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9v4hs"] Apr 20 20:08:21.056986 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.056970 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9v4hs" Apr 20 20:08:21.060111 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.060094 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 20:08:21.060191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.060155 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lxzp7\"" Apr 20 20:08:21.060689 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.060669 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 20:08:21.084598 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.084571 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9v4hs"] Apr 20 20:08:21.086891 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.086871 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0307fec1-4aab-49c3-8677-c695c3f01c6a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " 
pod="openshift-insights/insights-runtime-extractor-9v4hs" Apr 20 20:08:21.086956 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.086905 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/994894c1-acec-4952-8ffb-c3f8e2741554-bound-sa-token\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" Apr 20 20:08:21.086956 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.086946 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/994894c1-acec-4952-8ffb-c3f8e2741554-registry-certificates\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" Apr 20 20:08:21.087031 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.086981 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/994894c1-acec-4952-8ffb-c3f8e2741554-installation-pull-secrets\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" Apr 20 20:08:21.087031 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.087001 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksv2h\" (UniqueName: \"kubernetes.io/projected/0307fec1-4aab-49c3-8677-c695c3f01c6a-kube-api-access-ksv2h\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs" Apr 20 20:08:21.087094 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.087040 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkjbm\" (UniqueName: \"kubernetes.io/projected/bba62270-eaf0-456e-8beb-0c7e16c16c44-kube-api-access-vkjbm\") pod \"downloads-6bcc868b7-q8x64\" (UID: \"bba62270-eaf0-456e-8beb-0c7e16c16c44\") " pod="openshift-console/downloads-6bcc868b7-q8x64" Apr 20 20:08:21.087094 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.087069 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/994894c1-acec-4952-8ffb-c3f8e2741554-image-registry-private-configuration\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" Apr 20 20:08:21.087158 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.087090 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0307fec1-4aab-49c3-8677-c695c3f01c6a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs" Apr 20 20:08:21.087158 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.087148 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0307fec1-4aab-49c3-8677-c695c3f01c6a-crio-socket\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs" Apr 20 20:08:21.087217 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.087167 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/994894c1-acec-4952-8ffb-c3f8e2741554-trusted-ca\") 
pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" Apr 20 20:08:21.087277 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.087231 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0307fec1-4aab-49c3-8677-c695c3f01c6a-data-volume\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs" Apr 20 20:08:21.087330 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.087316 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/994894c1-acec-4952-8ffb-c3f8e2741554-registry-tls\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" Apr 20 20:08:21.087368 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.087342 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl4d6\" (UniqueName: \"kubernetes.io/projected/994894c1-acec-4952-8ffb-c3f8e2741554-kube-api-access-dl4d6\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" Apr 20 20:08:21.087368 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.087361 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/994894c1-acec-4952-8ffb-c3f8e2741554-ca-trust-extracted\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" Apr 20 20:08:21.114277 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:08:21.114246 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkjbm\" (UniqueName: \"kubernetes.io/projected/bba62270-eaf0-456e-8beb-0c7e16c16c44-kube-api-access-vkjbm\") pod \"downloads-6bcc868b7-q8x64\" (UID: \"bba62270-eaf0-456e-8beb-0c7e16c16c44\") " pod="openshift-console/downloads-6bcc868b7-q8x64"
Apr 20 20:08:21.187882 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.187806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/994894c1-acec-4952-8ffb-c3f8e2741554-registry-tls\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.187882 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.187839 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dl4d6\" (UniqueName: \"kubernetes.io/projected/994894c1-acec-4952-8ffb-c3f8e2741554-kube-api-access-dl4d6\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.187882 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.187856 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/994894c1-acec-4952-8ffb-c3f8e2741554-ca-trust-extracted\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.188149 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.187893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0307fec1-4aab-49c3-8677-c695c3f01c6a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs"
Apr 20 20:08:21.188149 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.188002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/994894c1-acec-4952-8ffb-c3f8e2741554-bound-sa-token\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.188149 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.188045 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/994894c1-acec-4952-8ffb-c3f8e2741554-registry-certificates\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.188149 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.188083 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/994894c1-acec-4952-8ffb-c3f8e2741554-installation-pull-secrets\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.188383 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.188224 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksv2h\" (UniqueName: \"kubernetes.io/projected/0307fec1-4aab-49c3-8677-c695c3f01c6a-kube-api-access-ksv2h\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs"
Apr 20 20:08:21.188383 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.188281 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/994894c1-acec-4952-8ffb-c3f8e2741554-image-registry-private-configuration\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.188383 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.188315 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0307fec1-4aab-49c3-8677-c695c3f01c6a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs"
Apr 20 20:08:21.188383 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.188380 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0307fec1-4aab-49c3-8677-c695c3f01c6a-crio-socket\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs"
Apr 20 20:08:21.188571 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.188398 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/994894c1-acec-4952-8ffb-c3f8e2741554-ca-trust-extracted\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.188571 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.188409 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/994894c1-acec-4952-8ffb-c3f8e2741554-trusted-ca\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.188571 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.188475 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0307fec1-4aab-49c3-8677-c695c3f01c6a-data-volume\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs"
Apr 20 20:08:21.189087 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.189061 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/994894c1-acec-4952-8ffb-c3f8e2741554-registry-certificates\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.189171 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.189081 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0307fec1-4aab-49c3-8677-c695c3f01c6a-crio-socket\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs"
Apr 20 20:08:21.189492 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.189416 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0307fec1-4aab-49c3-8677-c695c3f01c6a-data-volume\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs"
Apr 20 20:08:21.189492 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.189455 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/994894c1-acec-4952-8ffb-c3f8e2741554-trusted-ca\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.189770 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.189746 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0307fec1-4aab-49c3-8677-c695c3f01c6a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs"
Apr 20 20:08:21.190706 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.190687 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/994894c1-acec-4952-8ffb-c3f8e2741554-registry-tls\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.190806 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.190744 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0307fec1-4aab-49c3-8677-c695c3f01c6a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs"
Apr 20 20:08:21.190806 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.190772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/994894c1-acec-4952-8ffb-c3f8e2741554-installation-pull-secrets\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.191110 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.191094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/994894c1-acec-4952-8ffb-c3f8e2741554-image-registry-private-configuration\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.200824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.200807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/994894c1-acec-4952-8ffb-c3f8e2741554-bound-sa-token\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.204765 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.204733 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl4d6\" (UniqueName: \"kubernetes.io/projected/994894c1-acec-4952-8ffb-c3f8e2741554-kube-api-access-dl4d6\") pod \"image-registry-6fb58fbcf4-lg4j5\" (UID: \"994894c1-acec-4952-8ffb-c3f8e2741554\") " pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.204929 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.204910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksv2h\" (UniqueName: \"kubernetes.io/projected/0307fec1-4aab-49c3-8677-c695c3f01c6a-kube-api-access-ksv2h\") pod \"insights-runtime-extractor-9v4hs\" (UID: \"0307fec1-4aab-49c3-8677-c695c3f01c6a\") " pod="openshift-insights/insights-runtime-extractor-9v4hs"
Apr 20 20:08:21.255103 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.255078 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-q8x64"
Apr 20 20:08:21.342035 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.341808 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.365315 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.365292 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9v4hs"
Apr 20 20:08:21.384986 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.384950 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-q8x64"]
Apr 20 20:08:21.395642 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:21.395609 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbba62270_eaf0_456e_8beb_0c7e16c16c44.slice/crio-c3c8556bb96336278dda074416c5f7d1adf36ea92c9a097cb1b27691b721aaee WatchSource:0}: Error finding container c3c8556bb96336278dda074416c5f7d1adf36ea92c9a097cb1b27691b721aaee: Status 404 returned error can't find the container with id c3c8556bb96336278dda074416c5f7d1adf36ea92c9a097cb1b27691b721aaee
Apr 20 20:08:21.485937 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.485905 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"]
Apr 20 20:08:21.488554 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:21.488527 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod994894c1_acec_4952_8ffb_c3f8e2741554.slice/crio-13a7497a590fc746073458f5bdb73b0b4c8ad48e8e537a095e0438b4af52e286 WatchSource:0}: Error finding container 13a7497a590fc746073458f5bdb73b0b4c8ad48e8e537a095e0438b4af52e286: Status 404 returned error can't find the container with id 13a7497a590fc746073458f5bdb73b0b4c8ad48e8e537a095e0438b4af52e286
Apr 20 20:08:21.506400 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.506377 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9v4hs"]
Apr 20 20:08:21.517326 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:21.517303 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0307fec1_4aab_49c3_8677_c695c3f01c6a.slice/crio-a6cbc8531a29da4ca75b5b4ccd7e0de344bce76252f2eb653d97826b525aaf50 WatchSource:0}: Error finding container a6cbc8531a29da4ca75b5b4ccd7e0de344bce76252f2eb653d97826b525aaf50: Status 404 returned error can't find the container with id a6cbc8531a29da4ca75b5b4ccd7e0de344bce76252f2eb653d97826b525aaf50
Apr 20 20:08:21.798014 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:21.797906 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-jxlts" podUID="a53459d8-2c1c-4399-801a-d69f56977702"
Apr 20 20:08:21.819159 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:21.819120 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-q92l2" podUID="d2fc277b-80e6-4be4-a366-c742f661aa43"
Apr 20 20:08:21.906664 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.906632 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9v4hs" event={"ID":"0307fec1-4aab-49c3-8677-c695c3f01c6a","Type":"ContainerStarted","Data":"35260551a84272bab8ef42a3a9183547437d742b5304db91bbb57cfb10fbce73"}
Apr 20 20:08:21.906824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.906672 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9v4hs" event={"ID":"0307fec1-4aab-49c3-8677-c695c3f01c6a","Type":"ContainerStarted","Data":"a6cbc8531a29da4ca75b5b4ccd7e0de344bce76252f2eb653d97826b525aaf50"}
Apr 20 20:08:21.907833 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.907808 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" event={"ID":"994894c1-acec-4952-8ffb-c3f8e2741554","Type":"ContainerStarted","Data":"d5d777403c075b701356385f2a62d1e664195f316b2c575a8fa7cc02170b83b4"}
Apr 20 20:08:21.907833 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.907836 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" event={"ID":"994894c1-acec-4952-8ffb-c3f8e2741554","Type":"ContainerStarted","Data":"13a7497a590fc746073458f5bdb73b0b4c8ad48e8e537a095e0438b4af52e286"}
Apr 20 20:08:21.907990 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.907954 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5"
Apr 20 20:08:21.908824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.908805 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-q8x64" event={"ID":"bba62270-eaf0-456e-8beb-0c7e16c16c44","Type":"ContainerStarted","Data":"c3c8556bb96336278dda074416c5f7d1adf36ea92c9a097cb1b27691b721aaee"}
Apr 20 20:08:21.908896 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.908831 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q92l2"
Apr 20 20:08:21.908982 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.908969 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jxlts"
Apr 20 20:08:21.932334 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:21.932298 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" podStartSLOduration=0.932286519 podStartE2EDuration="932.286519ms" podCreationTimestamp="2026-04-20 20:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:21.931529538 +0000 UTC m=+157.040585246" watchObservedRunningTime="2026-04-20 20:08:21.932286519 +0000 UTC m=+157.041342217"
Apr 20 20:08:22.913553 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:22.913520 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9v4hs" event={"ID":"0307fec1-4aab-49c3-8677-c695c3f01c6a","Type":"ContainerStarted","Data":"ed97a612d68cdce38a03a0e6f06c9eaa1a1e7dd8d2af8cc76f9e9b3de2103b01"}
Apr 20 20:08:23.487342 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:23.487295 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-kzfxf" podUID="89492d29-88c3-44e3-adc2-eda0304a1081"
Apr 20 20:08:24.921565 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:24.921528 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9v4hs" event={"ID":"0307fec1-4aab-49c3-8677-c695c3f01c6a","Type":"ContainerStarted","Data":"52ecf18b67b4045a1a6a507be144496264bd5d5d8f9fc2c0ad47cbd3ea53092f"}
Apr 20 20:08:24.942036 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:24.941989 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9v4hs" podStartSLOduration=1.146051952 podStartE2EDuration="3.94197467s" podCreationTimestamp="2026-04-20 20:08:21 +0000 UTC" firstStartedPulling="2026-04-20 20:08:21.559186098 +0000 UTC m=+156.668241783" lastFinishedPulling="2026-04-20 20:08:24.355108812 +0000 UTC m=+159.464164501" observedRunningTime="2026-04-20 20:08:24.940948798 +0000 UTC m=+160.050004506" watchObservedRunningTime="2026-04-20 20:08:24.94197467 +0000 UTC m=+160.051030378"
Apr 20 20:08:26.738473 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:26.738434 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2"
Apr 20 20:08:26.738931 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:26.738530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:08:26.741491 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:26.741468 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a53459d8-2c1c-4399-801a-d69f56977702-metrics-tls\") pod \"dns-default-jxlts\" (UID: \"a53459d8-2c1c-4399-801a-d69f56977702\") " pod="openshift-dns/dns-default-jxlts"
Apr 20 20:08:26.741681 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:26.741659 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fc277b-80e6-4be4-a366-c742f661aa43-cert\") pod \"ingress-canary-q92l2\" (UID: \"d2fc277b-80e6-4be4-a366-c742f661aa43\") " pod="openshift-ingress-canary/ingress-canary-q92l2"
Apr 20 20:08:27.014873 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:27.014780 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wkgqm\""
Apr 20 20:08:27.015044 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:27.014883 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m22kk\""
Apr 20 20:08:27.020168 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:27.020144 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jxlts"
Apr 20 20:08:27.020324 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:27.020205 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q92l2"
Apr 20 20:08:27.189519 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:27.189467 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jxlts"]
Apr 20 20:08:27.192236 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:27.192207 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda53459d8_2c1c_4399_801a_d69f56977702.slice/crio-11a109a954a52e48959399cbac9f94966cf356c5e65888696013dbb5b0ea1bdb WatchSource:0}: Error finding container 11a109a954a52e48959399cbac9f94966cf356c5e65888696013dbb5b0ea1bdb: Status 404 returned error can't find the container with id 11a109a954a52e48959399cbac9f94966cf356c5e65888696013dbb5b0ea1bdb
Apr 20 20:08:27.197862 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:27.197840 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q92l2"]
Apr 20 20:08:27.200556 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:27.200532 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2fc277b_80e6_4be4_a366_c742f661aa43.slice/crio-b713b7273a5edd6ad8b181393ba7da6204c72579f37e9f6af5a8ab70d1769c64 WatchSource:0}: Error finding container b713b7273a5edd6ad8b181393ba7da6204c72579f37e9f6af5a8ab70d1769c64: Status 404 returned error can't find the container with id b713b7273a5edd6ad8b181393ba7da6204c72579f37e9f6af5a8ab70d1769c64
Apr 20 20:08:27.931466 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:27.931430 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q92l2" event={"ID":"d2fc277b-80e6-4be4-a366-c742f661aa43","Type":"ContainerStarted","Data":"b713b7273a5edd6ad8b181393ba7da6204c72579f37e9f6af5a8ab70d1769c64"}
Apr 20 20:08:27.933204 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:27.933079 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jxlts" event={"ID":"a53459d8-2c1c-4399-801a-d69f56977702","Type":"ContainerStarted","Data":"11a109a954a52e48959399cbac9f94966cf356c5e65888696013dbb5b0ea1bdb"}
Apr 20 20:08:28.687341 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.687308 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"]
Apr 20 20:08:28.690948 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.690929 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"
Apr 20 20:08:28.694558 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.694535 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 20 20:08:28.694983 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.694946 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-5kg58\""
Apr 20 20:08:28.695721 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.695702 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 20:08:28.696614 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.696442 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 20 20:08:28.701414 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.701374 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"]
Apr 20 20:08:28.735636 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.735606 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-stbdv"]
Apr 20 20:08:28.739368 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.739344 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:28.740949 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.740927 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tjkrs"]
Apr 20 20:08:28.741616 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.741598 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7dbd8\""
Apr 20 20:08:28.741721 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.741705 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 20:08:28.741932 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.741917 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 20:08:28.742021 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.741930 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 20:08:28.744469 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.744447 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs"
Apr 20 20:08:28.749542 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.749519 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 20 20:08:28.749637 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.749522 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 20 20:08:28.749637 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.749615 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 20 20:08:28.749791 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.749760 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-wn4jc\""
Apr 20 20:08:28.756645 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.756619 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"
Apr 20 20:08:28.756756 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.756698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"
Apr 20 20:08:28.756756 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.756734 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"
Apr 20 20:08:28.756870 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.756763 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppwh7\" (UniqueName: \"kubernetes.io/projected/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-kube-api-access-ppwh7\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"
Apr 20 20:08:28.760571 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.760547 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tjkrs"]
Apr 20 20:08:28.857419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857379 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"
Apr 20 20:08:28.857601 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857434 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54wzt\" (UniqueName: \"kubernetes.io/projected/78330f86-de48-4277-bc19-bbb785bcaddc-kube-api-access-54wzt\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs"
Apr 20 20:08:28.857601 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857473 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/78330f86-de48-4277-bc19-bbb785bcaddc-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs"
Apr 20 20:08:28.857601 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857506 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-accelerators-collector-config\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:28.857601 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:28.857529 2574 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 20 20:08:28.857601 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857542 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:28.857601 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857567 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7f61cc-34e8-4462-a4cb-828621ebb984-metrics-client-ca\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:28.857601 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:28.857596 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-openshift-state-metrics-tls podName:b936bdf7-2a5b-40cc-94f4-ef8998a14ad0 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:29.357573479 +0000 UTC m=+164.466629166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-vn6pg" (UID: "b936bdf7-2a5b-40cc-94f4-ef8998a14ad0") : secret "openshift-state-metrics-tls" not found
Apr 20 20:08:28.857935 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857637 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78330f86-de48-4277-bc19-bbb785bcaddc-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs"
Apr 20 20:08:28.857935 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857669 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"
Apr 20 20:08:28.857935 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857696 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kv4c\" (UniqueName: \"kubernetes.io/projected/6d7f61cc-34e8-4462-a4cb-828621ebb984-kube-api-access-4kv4c\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:28.857935 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857726 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"
Apr 20 20:08:28.857935 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppwh7\" (UniqueName: \"kubernetes.io/projected/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-kube-api-access-ppwh7\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"
Apr 20 20:08:28.857935 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857774 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/78330f86-de48-4277-bc19-bbb785bcaddc-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs"
Apr 20 20:08:28.857935 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.857817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/78330f86-de48-4277-bc19-bbb785bcaddc-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs"
Apr 20 20:08:28.858183 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.858096 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78330f86-de48-4277-bc19-bbb785bcaddc-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs"
Apr 20 20:08:28.858236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.858185 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d7f61cc-34e8-4462-a4cb-828621ebb984-sys\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:28.858305 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.858246 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-textfile\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:28.858415 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.858305 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-wtmp\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:28.858415 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.858337 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d7f61cc-34e8-4462-a4cb-828621ebb984-root\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:28.858415 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.858368 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-tls\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:28.858564 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.858507 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"
Apr 20 20:08:28.861153 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.861126 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"
Apr 20 20:08:28.871040 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.870997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppwh7\" (UniqueName:
\"kubernetes.io/projected/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-kube-api-access-ppwh7\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg" Apr 20 20:08:28.959108 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959029 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-textfile\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.959108 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959076 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-wtmp\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.959108 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d7f61cc-34e8-4462-a4cb-828621ebb984-root\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.959668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-tls\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.959668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959171 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-54wzt\" (UniqueName: \"kubernetes.io/projected/78330f86-de48-4277-bc19-bbb785bcaddc-kube-api-access-54wzt\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:28.959668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/78330f86-de48-4277-bc19-bbb785bcaddc-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:28.959668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959232 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-wtmp\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.959668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959243 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-accelerators-collector-config\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.959668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-stbdv\" (UID: 
\"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.959668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959343 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7f61cc-34e8-4462-a4cb-828621ebb984-metrics-client-ca\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.959668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959371 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78330f86-de48-4277-bc19-bbb785bcaddc-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:28.959668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kv4c\" (UniqueName: \"kubernetes.io/projected/6d7f61cc-34e8-4462-a4cb-828621ebb984-kube-api-access-4kv4c\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.959668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959439 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/78330f86-de48-4277-bc19-bbb785bcaddc-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:28.959668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.959444 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-textfile\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.960160 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.960139 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-accelerators-collector-config\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.960219 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.960200 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d7f61cc-34e8-4462-a4cb-828621ebb984-root\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.960285 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.960246 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/78330f86-de48-4277-bc19-bbb785bcaddc-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:28.960358 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.960326 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/78330f86-de48-4277-bc19-bbb785bcaddc-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:28.960468 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.960444 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78330f86-de48-4277-bc19-bbb785bcaddc-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:28.960529 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.960492 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d7f61cc-34e8-4462-a4cb-828621ebb984-sys\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.961343 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.960984 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/78330f86-de48-4277-bc19-bbb785bcaddc-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:28.961343 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.961044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d7f61cc-34e8-4462-a4cb-828621ebb984-sys\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.961343 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.961160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/78330f86-de48-4277-bc19-bbb785bcaddc-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:28.961543 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:28.961380 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 20:08:28.961543 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:28.961441 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-tls podName:6d7f61cc-34e8-4462-a4cb-828621ebb984 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:29.461422247 +0000 UTC m=+164.570477931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-tls") pod "node-exporter-stbdv" (UID: "6d7f61cc-34e8-4462-a4cb-828621ebb984") : secret "node-exporter-tls" not found Apr 20 20:08:28.961701 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.961675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7f61cc-34e8-4462-a4cb-828621ebb984-metrics-client-ca\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.962960 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.962916 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/78330f86-de48-4277-bc19-bbb785bcaddc-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:28.963072 ip-10-0-139-59 kubenswrapper[2574]: 
I0420 20:08:28.963030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.964327 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.964288 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78330f86-de48-4277-bc19-bbb785bcaddc-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:28.976362 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.976319 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kv4c\" (UniqueName: \"kubernetes.io/projected/6d7f61cc-34e8-4462-a4cb-828621ebb984-kube-api-access-4kv4c\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:28.977592 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:28.977554 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54wzt\" (UniqueName: \"kubernetes.io/projected/78330f86-de48-4277-bc19-bbb785bcaddc-kube-api-access-54wzt\") pod \"kube-state-metrics-69db897b98-tjkrs\" (UID: \"78330f86-de48-4277-bc19-bbb785bcaddc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:29.061428 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.061391 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" Apr 20 20:08:29.366104 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.366017 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg" Apr 20 20:08:29.369450 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.369422 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b936bdf7-2a5b-40cc-94f4-ef8998a14ad0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vn6pg\" (UID: \"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg" Apr 20 20:08:29.468799 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.468222 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-tls\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv" Apr 20 20:08:29.468799 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:29.468490 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 20:08:29.468799 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:29.468556 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-tls podName:6d7f61cc-34e8-4462-a4cb-828621ebb984 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:08:30.468535268 +0000 UTC m=+165.577590957 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-tls") pod "node-exporter-stbdv" (UID: "6d7f61cc-34e8-4462-a4cb-828621ebb984") : secret "node-exporter-tls" not found Apr 20 20:08:29.573499 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.573450 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tjkrs"] Apr 20 20:08:29.575772 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:29.575739 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78330f86_de48_4277_bc19_bbb785bcaddc.slice/crio-946188d821fec32adcf2175a9e2f4662c30b1f3ac4f03de0ef339e400eb08dda WatchSource:0}: Error finding container 946188d821fec32adcf2175a9e2f4662c30b1f3ac4f03de0ef339e400eb08dda: Status 404 returned error can't find the container with id 946188d821fec32adcf2175a9e2f4662c30b1f3ac4f03de0ef339e400eb08dda Apr 20 20:08:29.604517 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.604162 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg" Apr 20 20:08:29.773601 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.773550 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg"] Apr 20 20:08:29.777344 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:29.777316 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb936bdf7_2a5b_40cc_94f4_ef8998a14ad0.slice/crio-18831a14a657647ad703404166203f2a8e373043776b2bd60d3017cee68631f4 WatchSource:0}: Error finding container 18831a14a657647ad703404166203f2a8e373043776b2bd60d3017cee68631f4: Status 404 returned error can't find the container with id 18831a14a657647ad703404166203f2a8e373043776b2bd60d3017cee68631f4 Apr 20 20:08:29.849312 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.849278 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:08:29.853676 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.853656 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:29.856281 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.856243 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 20:08:29.856419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.856401 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-gwhrt\"" Apr 20 20:08:29.856737 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.856717 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 20:08:29.856953 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.856936 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 20:08:29.857706 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.857686 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 20:08:29.857794 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.857738 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 20:08:29.857856 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.857694 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 20:08:29.857940 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.857919 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 20:08:29.857992 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.857969 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 20:08:29.858154 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.858126 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 20:08:29.873240 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.873198 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:08:29.941367 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.941336 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q92l2" event={"ID":"d2fc277b-80e6-4be4-a366-c742f661aa43","Type":"ContainerStarted","Data":"fa1eed36b0d0313fd6cdc5791533581ab68454ee54b83237aac5d8d8dd37baff"} Apr 20 20:08:29.944357 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.944093 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jxlts" event={"ID":"a53459d8-2c1c-4399-801a-d69f56977702","Type":"ContainerStarted","Data":"c3f955223241ab0fdb9d04db93fbef93a4d8ced1fcea004c93988bb8532b000f"} Apr 20 20:08:29.944357 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.944127 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jxlts" event={"ID":"a53459d8-2c1c-4399-801a-d69f56977702","Type":"ContainerStarted","Data":"994baa815113244b0e0333fb1cac1b38f4cc9b2b51cf351836e462605d78b50c"} Apr 20 20:08:29.945087 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.944956 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jxlts" Apr 20 20:08:29.946869 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.946351 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" 
event={"ID":"78330f86-de48-4277-bc19-bbb785bcaddc","Type":"ContainerStarted","Data":"946188d821fec32adcf2175a9e2f4662c30b1f3ac4f03de0ef339e400eb08dda"} Apr 20 20:08:29.950823 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.950782 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg" event={"ID":"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0","Type":"ContainerStarted","Data":"1e30a96358be5431f223db18e68db8d100997f0bbd18e3c444919524237fb4f7"} Apr 20 20:08:29.950823 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.950809 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg" event={"ID":"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0","Type":"ContainerStarted","Data":"18831a14a657647ad703404166203f2a8e373043776b2bd60d3017cee68631f4"} Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.975436 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.975478 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d42cac2c-c2b8-4283-8162-b4b536610ab8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.975528 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-config-volume\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.975550 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.975578 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.975599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d42cac2c-c2b8-4283-8162-b4b536610ab8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.975623 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d42cac2c-c2b8-4283-8162-b4b536610ab8-config-out\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: 
I0420 20:08:29.975643 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d42cac2c-c2b8-4283-8162-b4b536610ab8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.975690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.975732 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-web-config\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.975771 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.975808 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkl2n\" (UniqueName: \"kubernetes.io/projected/d42cac2c-c2b8-4283-8162-b4b536610ab8-kube-api-access-gkl2n\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:29.976302 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.975834 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d42cac2c-c2b8-4283-8162-b4b536610ab8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:29.977480 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:29.977439 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-q92l2" podStartSLOduration=129.757180214 podStartE2EDuration="2m11.977425992s" podCreationTimestamp="2026-04-20 20:06:18 +0000 UTC" firstStartedPulling="2026-04-20 20:08:27.202213201 +0000 UTC m=+162.311268885" lastFinishedPulling="2026-04-20 20:08:29.42245897 +0000 UTC m=+164.531514663" observedRunningTime="2026-04-20 20:08:29.976429382 +0000 UTC m=+165.085485090" watchObservedRunningTime="2026-04-20 20:08:29.977425992 +0000 UTC m=+165.086481698"
Apr 20 20:08:30.076505 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.076505 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d42cac2c-c2b8-4283-8162-b4b536610ab8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.076754 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076556 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-config-volume\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.076754 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076581 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.076754 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076604 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.076754 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076622 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d42cac2c-c2b8-4283-8162-b4b536610ab8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.076754 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076648 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d42cac2c-c2b8-4283-8162-b4b536610ab8-config-out\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.076754 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d42cac2c-c2b8-4283-8162-b4b536610ab8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.076754 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076717 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.076754 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-web-config\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.077134 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076783 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.077134 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkl2n\" (UniqueName: \"kubernetes.io/projected/d42cac2c-c2b8-4283-8162-b4b536610ab8-kube-api-access-gkl2n\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.077134 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.076816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d42cac2c-c2b8-4283-8162-b4b536610ab8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.078307 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.077959 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d42cac2c-c2b8-4283-8162-b4b536610ab8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.078307 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.078221 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d42cac2c-c2b8-4283-8162-b4b536610ab8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.079936 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.079881 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d42cac2c-c2b8-4283-8162-b4b536610ab8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.084508 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.082957 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.084508 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.082992 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-config-volume\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.084508 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.083167 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d42cac2c-c2b8-4283-8162-b4b536610ab8-config-out\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.084508 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.083591 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.084508 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.083884 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-web-config\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.084508 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.084496 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.084871 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.084740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.084871 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.084762 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d42cac2c-c2b8-4283-8162-b4b536610ab8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.085427 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.085406 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.093516 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.093473 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkl2n\" (UniqueName: \"kubernetes.io/projected/d42cac2c-c2b8-4283-8162-b4b536610ab8-kube-api-access-gkl2n\") pod \"alertmanager-main-0\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.208202 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.207702 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:30.382655 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.382596 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jxlts" podStartSLOduration=130.159202701 podStartE2EDuration="2m12.382574559s" podCreationTimestamp="2026-04-20 20:06:18 +0000 UTC" firstStartedPulling="2026-04-20 20:08:27.194489175 +0000 UTC m=+162.303544864" lastFinishedPulling="2026-04-20 20:08:29.417861028 +0000 UTC m=+164.526916722" observedRunningTime="2026-04-20 20:08:30.027053526 +0000 UTC m=+165.136109235" watchObservedRunningTime="2026-04-20 20:08:30.382574559 +0000 UTC m=+165.491630267"
Apr 20 20:08:30.383028 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.383005 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 20:08:30.387448 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:30.387415 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd42cac2c_c2b8_4283_8162_b4b536610ab8.slice/crio-45ab978f446ee95f29be5882c6d64e8fbeaa45fd20ad17c9d1a24ddc8929967c WatchSource:0}: Error finding container 45ab978f446ee95f29be5882c6d64e8fbeaa45fd20ad17c9d1a24ddc8929967c: Status 404 returned error can't find the container with id 45ab978f446ee95f29be5882c6d64e8fbeaa45fd20ad17c9d1a24ddc8929967c
Apr 20 20:08:30.481063 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.480982 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-tls\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:30.484080 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.484018 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d7f61cc-34e8-4462-a4cb-828621ebb984-node-exporter-tls\") pod \"node-exporter-stbdv\" (UID: \"6d7f61cc-34e8-4462-a4cb-828621ebb984\") " pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:30.553766 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.553530 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-stbdv"
Apr 20 20:08:30.955940 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.955899 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerStarted","Data":"45ab978f446ee95f29be5882c6d64e8fbeaa45fd20ad17c9d1a24ddc8929967c"}
Apr 20 20:08:30.958592 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.958542 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg" event={"ID":"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0","Type":"ContainerStarted","Data":"f036f089dbe84459a544a0ade05a7b0b24b365bd340de0cb165edb1dcfde841e"}
Apr 20 20:08:30.990753 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:30.990729 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6b5c68b54b-76d47"
Apr 20 20:08:31.196848 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:31.196814 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d7f61cc_34e8_4462_a4cb_828621ebb984.slice/crio-ecff816e4e918ad8c7712dc1d70e242871b8c25cc9254b432f28536cd23cc628 WatchSource:0}: Error finding container ecff816e4e918ad8c7712dc1d70e242871b8c25cc9254b432f28536cd23cc628: Status 404 returned error can't find the container with id ecff816e4e918ad8c7712dc1d70e242871b8c25cc9254b432f28536cd23cc628
Apr 20 20:08:31.965205 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:31.965111 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" event={"ID":"78330f86-de48-4277-bc19-bbb785bcaddc","Type":"ContainerStarted","Data":"8362318e5a86668a07267bc19cda5f836f4ca2889a73d3861635fc72bd3dd85c"}
Apr 20 20:08:31.965205 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:31.965152 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" event={"ID":"78330f86-de48-4277-bc19-bbb785bcaddc","Type":"ContainerStarted","Data":"150cad425ee65446534aa51ac99c73efc314ea714a744466e6a709c87b913465"}
Apr 20 20:08:31.965205 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:31.965169 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" event={"ID":"78330f86-de48-4277-bc19-bbb785bcaddc","Type":"ContainerStarted","Data":"1c3cd504abffe1a178638fc74e79fb31eba1299303cceed47a1bf3eb96e0f7b2"}
Apr 20 20:08:31.968778 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:31.968666 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg" event={"ID":"b936bdf7-2a5b-40cc-94f4-ef8998a14ad0","Type":"ContainerStarted","Data":"8bc82bdd292ef7737919205c7de69fee1279b09d54a58d28ea67464c310ceb70"}
Apr 20 20:08:31.970307 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:31.970279 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-stbdv" event={"ID":"6d7f61cc-34e8-4462-a4cb-828621ebb984","Type":"ContainerStarted","Data":"ecff816e4e918ad8c7712dc1d70e242871b8c25cc9254b432f28536cd23cc628"}
Apr 20 20:08:31.989440 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:31.989393 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-tjkrs" podStartSLOduration=2.373547605 podStartE2EDuration="3.989376208s" podCreationTimestamp="2026-04-20 20:08:28 +0000 UTC" firstStartedPulling="2026-04-20 20:08:29.577926855 +0000 UTC m=+164.686982552" lastFinishedPulling="2026-04-20 20:08:31.193755455 +0000 UTC m=+166.302811155" observedRunningTime="2026-04-20 20:08:31.987283048 +0000 UTC m=+167.096338756" watchObservedRunningTime="2026-04-20 20:08:31.989376208 +0000 UTC m=+167.098431932"
Apr 20 20:08:32.013761 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:32.013682 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vn6pg" podStartSLOduration=2.763975749 podStartE2EDuration="4.013667132s" podCreationTimestamp="2026-04-20 20:08:28 +0000 UTC" firstStartedPulling="2026-04-20 20:08:29.947467388 +0000 UTC m=+165.056523088" lastFinishedPulling="2026-04-20 20:08:31.197158771 +0000 UTC m=+166.306214471" observedRunningTime="2026-04-20 20:08:32.010733688 +0000 UTC m=+167.119789397" watchObservedRunningTime="2026-04-20 20:08:32.013667132 +0000 UTC m=+167.122722839"
Apr 20 20:08:32.975897 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:32.975723 2574 generic.go:358] "Generic (PLEG): container finished" podID="6d7f61cc-34e8-4462-a4cb-828621ebb984" containerID="75381204f6a885ef728aafce7fd6c0525746a9f53a95a82b93201378b7084f47" exitCode=0
Apr 20 20:08:32.975897 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:32.975842 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-stbdv" event={"ID":"6d7f61cc-34e8-4462-a4cb-828621ebb984","Type":"ContainerDied","Data":"75381204f6a885ef728aafce7fd6c0525746a9f53a95a82b93201378b7084f47"}
Apr 20 20:08:32.977461 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:32.977429 2574 generic.go:358] "Generic (PLEG): container finished" podID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerID="e859c6cef4c9d49e47128b615f9c0e884b8e32d012d582112ca1a0fd168d53c8" exitCode=0
Apr 20 20:08:32.977582 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:32.977509 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerDied","Data":"e859c6cef4c9d49e47128b615f9c0e884b8e32d012d582112ca1a0fd168d53c8"}
Apr 20 20:08:34.018787 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.018755 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5897f45db7-csbkg"]
Apr 20 20:08:34.025517 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.025496 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.030559 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.030536 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 20 20:08:34.030710 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.030682 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 20 20:08:34.030799 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.030744 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 20 20:08:34.030861 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.030826 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 20 20:08:34.031013 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.030997 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-fskr6\""
Apr 20 20:08:34.031598 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.031569 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 20 20:08:34.044562 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.043564 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 20 20:08:34.045580 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.045555 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5897f45db7-csbkg"]
Apr 20 20:08:34.116884 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.116852 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlwmt\" (UniqueName: \"kubernetes.io/projected/8214ca68-6496-4e90-ad00-c38797eedbf1-kube-api-access-mlwmt\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.117042 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.116920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8214ca68-6496-4e90-ad00-c38797eedbf1-telemeter-client-tls\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.117042 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.116950 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8214ca68-6496-4e90-ad00-c38797eedbf1-secret-telemeter-client\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.117042 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.117009 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8214ca68-6496-4e90-ad00-c38797eedbf1-serving-certs-ca-bundle\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.117231 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.117041 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8214ca68-6496-4e90-ad00-c38797eedbf1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.117231 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.117115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8214ca68-6496-4e90-ad00-c38797eedbf1-federate-client-tls\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.117231 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.117147 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8214ca68-6496-4e90-ad00-c38797eedbf1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.117231 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.117179 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8214ca68-6496-4e90-ad00-c38797eedbf1-metrics-client-ca\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.218298 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.218239 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8214ca68-6496-4e90-ad00-c38797eedbf1-federate-client-tls\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.218554 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.218311 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8214ca68-6496-4e90-ad00-c38797eedbf1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.218554 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.218344 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8214ca68-6496-4e90-ad00-c38797eedbf1-metrics-client-ca\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.218554 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.218397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwmt\" (UniqueName: \"kubernetes.io/projected/8214ca68-6496-4e90-ad00-c38797eedbf1-kube-api-access-mlwmt\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.218554 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.218482 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8214ca68-6496-4e90-ad00-c38797eedbf1-telemeter-client-tls\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.218554 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.218512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8214ca68-6496-4e90-ad00-c38797eedbf1-secret-telemeter-client\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.218554 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.218540 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8214ca68-6496-4e90-ad00-c38797eedbf1-serving-certs-ca-bundle\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.218856 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.218579 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8214ca68-6496-4e90-ad00-c38797eedbf1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.219370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.219317 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8214ca68-6496-4e90-ad00-c38797eedbf1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.219370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.219337 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8214ca68-6496-4e90-ad00-c38797eedbf1-metrics-client-ca\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.219844 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.219820 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8214ca68-6496-4e90-ad00-c38797eedbf1-serving-certs-ca-bundle\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.221231 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.221208 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8214ca68-6496-4e90-ad00-c38797eedbf1-federate-client-tls\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.221364 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.221241 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8214ca68-6496-4e90-ad00-c38797eedbf1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.221557 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.221540 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8214ca68-6496-4e90-ad00-c38797eedbf1-secret-telemeter-client\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.222031 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.222011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8214ca68-6496-4e90-ad00-c38797eedbf1-telemeter-client-tls\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.229873 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.229845 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwmt\" (UniqueName: \"kubernetes.io/projected/8214ca68-6496-4e90-ad00-c38797eedbf1-kube-api-access-mlwmt\") pod \"telemeter-client-5897f45db7-csbkg\" (UID: \"8214ca68-6496-4e90-ad00-c38797eedbf1\") " pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.342822 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.342735 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg"
Apr 20 20:08:34.479135 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:34.479101 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf"
Apr 20 20:08:39.610187 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:39.610162 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5897f45db7-csbkg"]
Apr 20 20:08:39.621071 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:08:39.621045 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8214ca68_6496_4e90_ad00_c38797eedbf1.slice/crio-3003e9b757a90646e6c0e6ba3a128e3109fea1aaf99023a4f7f201e86633f42d WatchSource:0}: Error finding container 3003e9b757a90646e6c0e6ba3a128e3109fea1aaf99023a4f7f201e86633f42d: Status 404 returned error can't find the container with id 3003e9b757a90646e6c0e6ba3a128e3109fea1aaf99023a4f7f201e86633f42d
Apr 20 20:08:40.003508 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:40.003427 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-q8x64" event={"ID":"bba62270-eaf0-456e-8beb-0c7e16c16c44","Type":"ContainerStarted","Data":"5685e30107d69a1c339b64bd45117d1faef7dc4ef1030b96d7ed8769321dbf46"}
Apr 20 20:08:40.004113 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:40.003952 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-q8x64"
Apr 20 20:08:40.008379 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:40.008345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-stbdv" event={"ID":"6d7f61cc-34e8-4462-a4cb-828621ebb984","Type":"ContainerStarted","Data":"d56f341701cb48a102b6ece4e203a8e010e6bd5a8babe9796fc5c0aade08782f"}
Apr 20 20:08:40.008489 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:40.008413 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-stbdv" event={"ID":"6d7f61cc-34e8-4462-a4cb-828621ebb984","Type":"ContainerStarted","Data":"36478fbd7b21364a58eddb786662ab3afa85f3723cc191b32777f60ea7bb4d3d"}
Apr 20 20:08:40.009511 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:40.009479 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg" event={"ID":"8214ca68-6496-4e90-ad00-c38797eedbf1","Type":"ContainerStarted","Data":"3003e9b757a90646e6c0e6ba3a128e3109fea1aaf99023a4f7f201e86633f42d"}
Apr 20 20:08:40.024408 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:40.024334 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-q8x64" podStartSLOduration=1.845991194 podStartE2EDuration="20.02429499s" podCreationTimestamp="2026-04-20 20:08:20 +0000 UTC" firstStartedPulling="2026-04-20 20:08:21.397639821 +0000 UTC m=+156.506695521" lastFinishedPulling="2026-04-20 20:08:39.575943625 +0000 UTC m=+174.684999317" observedRunningTime="2026-04-20 20:08:40.022727357 +0000 UTC m=+175.131783066" watchObservedRunningTime="2026-04-20 20:08:40.02429499 +0000 UTC m=+175.133350698"
Apr 20 20:08:40.024888 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:40.024869 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-q8x64"
Apr 20 20:08:40.044625 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:40.043499 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-stbdv" podStartSLOduration=10.96759005 podStartE2EDuration="12.043483674s" podCreationTimestamp="2026-04-20 20:08:28 +0000 UTC" firstStartedPulling="2026-04-20 20:08:31.205925704 +0000 UTC m=+166.314981394" lastFinishedPulling="2026-04-20 20:08:32.28181932 +0000 UTC m=+167.390875018" observedRunningTime="2026-04-20 20:08:40.042047707 +0000 UTC m=+175.151103441" watchObservedRunningTime="2026-04-20 20:08:40.043483674 +0000 UTC m=+175.152539382"
Apr 20
20:08:40.973623 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:40.973534 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jxlts" Apr 20 20:08:41.015874 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:41.015823 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerStarted","Data":"37f69aad6b306341890b411e3e0c09fe9d0f7d087c9e3e7f90614c6a7d78a264"} Apr 20 20:08:42.021330 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:42.021295 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg" event={"ID":"8214ca68-6496-4e90-ad00-c38797eedbf1","Type":"ContainerStarted","Data":"7a0a8d3d8aa9e2477933bbde36477f574a9078ab9f0370a2f36839621d0b3bcd"} Apr 20 20:08:42.024209 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:42.024173 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerStarted","Data":"99404cd2b08572921c4acae03ec7dd9caa9250f494b33dc558bf1e68dd96469b"} Apr 20 20:08:42.024348 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:42.024211 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerStarted","Data":"a81bc525f2c1fcf6f7c81c0b7ea33478935b534ff54fb18b48e45344ec1a0c1d"} Apr 20 20:08:42.024348 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:42.024225 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerStarted","Data":"8dcdbbbda5246b6216152a8cdea90d983631a8fb0738e33db1247c1c937ec0bf"} Apr 20 20:08:42.918553 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:42.918519 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6fb58fbcf4-lg4j5" Apr 20 20:08:43.031077 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:43.031034 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg" event={"ID":"8214ca68-6496-4e90-ad00-c38797eedbf1","Type":"ContainerStarted","Data":"eafe83a9aa4b9255dfc3b05c2eb650dfce16b9ed34eefecb5979baadda478c6b"} Apr 20 20:08:43.031505 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:43.031085 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg" event={"ID":"8214ca68-6496-4e90-ad00-c38797eedbf1","Type":"ContainerStarted","Data":"b2326f0f46bed52d1665fa056d8b745301cbc813c90929f30c05b1b291dd2be5"} Apr 20 20:08:43.034491 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:43.034464 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerStarted","Data":"41b59b6df72550cff3ba0e5f285e7373df948192c461381edeadd10f12c7a8c5"} Apr 20 20:08:43.053878 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:43.053791 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5897f45db7-csbkg" podStartSLOduration=7.881824864 podStartE2EDuration="10.053777259s" podCreationTimestamp="2026-04-20 20:08:33 +0000 UTC" firstStartedPulling="2026-04-20 20:08:39.623122241 +0000 UTC m=+174.732177928" lastFinishedPulling="2026-04-20 20:08:41.795074638 +0000 UTC m=+176.904130323" observedRunningTime="2026-04-20 20:08:43.052134165 +0000 UTC m=+178.161189872" watchObservedRunningTime="2026-04-20 20:08:43.053777259 +0000 UTC m=+178.162832971" Apr 20 20:08:44.042022 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:44.041980 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerStarted","Data":"ffa35def9acb87edf663226a6dd6670b4624bebcec5f16233e6f0faddd528965"} Apr 20 20:08:44.122275 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:44.122180 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.059858571 podStartE2EDuration="15.122160498s" podCreationTimestamp="2026-04-20 20:08:29 +0000 UTC" firstStartedPulling="2026-04-20 20:08:30.390295535 +0000 UTC m=+165.499351236" lastFinishedPulling="2026-04-20 20:08:43.452597464 +0000 UTC m=+178.561653163" observedRunningTime="2026-04-20 20:08:44.119624259 +0000 UTC m=+179.228679978" watchObservedRunningTime="2026-04-20 20:08:44.122160498 +0000 UTC m=+179.231216213" Apr 20 20:08:46.002875 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.002800 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" podUID="6f31ebc7-2994-45d9-9434-7da5efc7d6bc" containerName="registry" containerID="cri-o://a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68" gracePeriod=30 Apr 20 20:08:46.270124 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.270089 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:08:46.445611 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.445579 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-installation-pull-secrets\") pod \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " Apr 20 20:08:46.445824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.445634 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-ca-trust-extracted\") pod \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " Apr 20 20:08:46.445824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.445669 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-certificates\") pod \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " Apr 20 20:08:46.445824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.445730 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-trusted-ca\") pod \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " Apr 20 20:08:46.445824 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.445816 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-bound-sa-token\") pod \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " Apr 20 
20:08:46.446027 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.445851 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls\") pod \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " Apr 20 20:08:46.446027 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.445875 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lwkz\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-kube-api-access-2lwkz\") pod \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " Apr 20 20:08:46.446027 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.445909 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-image-registry-private-configuration\") pod \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\" (UID: \"6f31ebc7-2994-45d9-9434-7da5efc7d6bc\") " Apr 20 20:08:46.446183 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.446106 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6f31ebc7-2994-45d9-9434-7da5efc7d6bc" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:08:46.446243 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.446183 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6f31ebc7-2994-45d9-9434-7da5efc7d6bc" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:08:46.446336 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.446243 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-certificates\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:08:46.448478 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.448446 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6f31ebc7-2994-45d9-9434-7da5efc7d6bc" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:08:46.448478 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.448469 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6f31ebc7-2994-45d9-9434-7da5efc7d6bc" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:08:46.448663 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.448561 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6f31ebc7-2994-45d9-9434-7da5efc7d6bc" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:08:46.449020 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.448981 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "6f31ebc7-2994-45d9-9434-7da5efc7d6bc" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:08:46.449135 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.449109 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-kube-api-access-2lwkz" (OuterVolumeSpecName: "kube-api-access-2lwkz") pod "6f31ebc7-2994-45d9-9434-7da5efc7d6bc" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc"). InnerVolumeSpecName "kube-api-access-2lwkz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:08:46.456464 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.456436 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6f31ebc7-2994-45d9-9434-7da5efc7d6bc" (UID: "6f31ebc7-2994-45d9-9434-7da5efc7d6bc"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:08:46.547214 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.547147 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-trusted-ca\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:08:46.547214 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.547181 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-bound-sa-token\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:08:46.547214 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.547198 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-registry-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:08:46.547214 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.547211 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2lwkz\" (UniqueName: \"kubernetes.io/projected/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-kube-api-access-2lwkz\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:08:46.547516 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.547230 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-image-registry-private-configuration\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:08:46.547516 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.547244 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-installation-pull-secrets\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 
20:08:46.547516 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:46.547253 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6f31ebc7-2994-45d9-9434-7da5efc7d6bc-ca-trust-extracted\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:08:47.053095 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:47.053050 2574 generic.go:358] "Generic (PLEG): container finished" podID="6f31ebc7-2994-45d9-9434-7da5efc7d6bc" containerID="a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68" exitCode=0 Apr 20 20:08:47.053687 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:47.053120 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" Apr 20 20:08:47.053687 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:47.053128 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" event={"ID":"6f31ebc7-2994-45d9-9434-7da5efc7d6bc","Type":"ContainerDied","Data":"a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68"} Apr 20 20:08:47.053687 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:47.053164 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b5c68b54b-76d47" event={"ID":"6f31ebc7-2994-45d9-9434-7da5efc7d6bc","Type":"ContainerDied","Data":"5a3d3dc6d28f7de879642a3e06056e4cf1aff1919040aabb6cf2f085b7aebfe1"} Apr 20 20:08:47.053687 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:47.053184 2574 scope.go:117] "RemoveContainer" containerID="a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68" Apr 20 20:08:47.062879 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:47.062855 2574 scope.go:117] "RemoveContainer" containerID="a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68" Apr 20 20:08:47.063204 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:08:47.063175 2574 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68\": container with ID starting with a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68 not found: ID does not exist" containerID="a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68" Apr 20 20:08:47.063307 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:47.063215 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68"} err="failed to get container status \"a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68\": rpc error: code = NotFound desc = could not find container \"a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68\": container with ID starting with a7497726f5b915630d96d5ab21c45f532d46976c529d243b22837715b5d6ae68 not found: ID does not exist" Apr 20 20:08:47.085320 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:47.085295 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b5c68b54b-76d47"] Apr 20 20:08:47.087951 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:47.087927 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6b5c68b54b-76d47"] Apr 20 20:08:47.483500 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:08:47.483465 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f31ebc7-2994-45d9-9434-7da5efc7d6bc" path="/var/lib/kubelet/pods/6f31ebc7-2994-45d9-9434-7da5efc7d6bc/volumes" Apr 20 20:09:09.119812 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:09.119778 2574 generic.go:358] "Generic (PLEG): container finished" podID="fe6cd082-894a-48c8-a261-a5354bd183fe" containerID="172ab259bb810cef25062413c45c1896083ec2e75dd8b4457c4e9b7effa20678" exitCode=0 Apr 20 20:09:09.120224 ip-10-0-139-59 kubenswrapper[2574]: 
I0420 20:09:09.119843 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq" event={"ID":"fe6cd082-894a-48c8-a261-a5354bd183fe","Type":"ContainerDied","Data":"172ab259bb810cef25062413c45c1896083ec2e75dd8b4457c4e9b7effa20678"} Apr 20 20:09:09.120224 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:09.120133 2574 scope.go:117] "RemoveContainer" containerID="172ab259bb810cef25062413c45c1896083ec2e75dd8b4457c4e9b7effa20678" Apr 20 20:09:10.124925 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:10.124888 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vrmgq" event={"ID":"fe6cd082-894a-48c8-a261-a5354bd183fe","Type":"ContainerStarted","Data":"8fe3d2209467b504695ef489b3bb7af8f113238676e57bf65e2220d655de3a32"} Apr 20 20:09:14.138462 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:14.138382 2574 generic.go:358] "Generic (PLEG): container finished" podID="48d8dc00-2133-4e50-9e06-45cb14a568c8" containerID="397f882571ad01c6ed3fdd5d22c4d554e43494a7b5a07092e76c98b62cfc0030" exitCode=0 Apr 20 20:09:14.138798 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:14.138463 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7pjcj" event={"ID":"48d8dc00-2133-4e50-9e06-45cb14a568c8","Type":"ContainerDied","Data":"397f882571ad01c6ed3fdd5d22c4d554e43494a7b5a07092e76c98b62cfc0030"} Apr 20 20:09:14.138846 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:14.138804 2574 scope.go:117] "RemoveContainer" containerID="397f882571ad01c6ed3fdd5d22c4d554e43494a7b5a07092e76c98b62cfc0030" Apr 20 20:09:15.143240 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:15.143202 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7pjcj" 
event={"ID":"48d8dc00-2133-4e50-9e06-45cb14a568c8","Type":"ContainerStarted","Data":"d0396c351ff4205207d75ec12bcc651bf0501ca7045915d6ae98adc673a370ac"} Apr 20 20:09:49.124710 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.124678 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:09:49.125143 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.125075 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="alertmanager" containerID="cri-o://37f69aad6b306341890b411e3e0c09fe9d0f7d087c9e3e7f90614c6a7d78a264" gracePeriod=120 Apr 20 20:09:49.125230 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.125176 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="kube-rbac-proxy-web" containerID="cri-o://a81bc525f2c1fcf6f7c81c0b7ea33478935b534ff54fb18b48e45344ec1a0c1d" gracePeriod=120 Apr 20 20:09:49.125313 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.125201 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="config-reloader" containerID="cri-o://8dcdbbbda5246b6216152a8cdea90d983631a8fb0738e33db1247c1c937ec0bf" gracePeriod=120 Apr 20 20:09:49.125313 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.125177 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="kube-rbac-proxy-metric" containerID="cri-o://41b59b6df72550cff3ba0e5f285e7373df948192c461381edeadd10f12c7a8c5" gracePeriod=120 Apr 20 20:09:49.125412 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.125213 2574 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="kube-rbac-proxy" containerID="cri-o://99404cd2b08572921c4acae03ec7dd9caa9250f494b33dc558bf1e68dd96469b" gracePeriod=120 Apr 20 20:09:49.125412 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.125230 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="prom-label-proxy" containerID="cri-o://ffa35def9acb87edf663226a6dd6670b4624bebcec5f16233e6f0faddd528965" gracePeriod=120 Apr 20 20:09:49.252811 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.252783 2574 generic.go:358] "Generic (PLEG): container finished" podID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerID="ffa35def9acb87edf663226a6dd6670b4624bebcec5f16233e6f0faddd528965" exitCode=0 Apr 20 20:09:49.252811 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.252812 2574 generic.go:358] "Generic (PLEG): container finished" podID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerID="99404cd2b08572921c4acae03ec7dd9caa9250f494b33dc558bf1e68dd96469b" exitCode=0 Apr 20 20:09:49.252955 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.252821 2574 generic.go:358] "Generic (PLEG): container finished" podID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerID="8dcdbbbda5246b6216152a8cdea90d983631a8fb0738e33db1247c1c937ec0bf" exitCode=0 Apr 20 20:09:49.252955 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.252829 2574 generic.go:358] "Generic (PLEG): container finished" podID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerID="37f69aad6b306341890b411e3e0c09fe9d0f7d087c9e3e7f90614c6a7d78a264" exitCode=0 Apr 20 20:09:49.252955 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.252848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerDied","Data":"ffa35def9acb87edf663226a6dd6670b4624bebcec5f16233e6f0faddd528965"} Apr 20 20:09:49.252955 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.252879 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerDied","Data":"99404cd2b08572921c4acae03ec7dd9caa9250f494b33dc558bf1e68dd96469b"} Apr 20 20:09:49.252955 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.252889 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerDied","Data":"8dcdbbbda5246b6216152a8cdea90d983631a8fb0738e33db1247c1c937ec0bf"} Apr 20 20:09:49.252955 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:49.252899 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerDied","Data":"37f69aad6b306341890b411e3e0c09fe9d0f7d087c9e3e7f90614c6a7d78a264"} Apr 20 20:09:50.259929 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.259903 2574 generic.go:358] "Generic (PLEG): container finished" podID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerID="41b59b6df72550cff3ba0e5f285e7373df948192c461381edeadd10f12c7a8c5" exitCode=0 Apr 20 20:09:50.259929 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.259925 2574 generic.go:358] "Generic (PLEG): container finished" podID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerID="a81bc525f2c1fcf6f7c81c0b7ea33478935b534ff54fb18b48e45344ec1a0c1d" exitCode=0 Apr 20 20:09:50.260312 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.259975 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerDied","Data":"41b59b6df72550cff3ba0e5f285e7373df948192c461381edeadd10f12c7a8c5"}
Apr 20 20:09:50.260312 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.260007 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerDied","Data":"a81bc525f2c1fcf6f7c81c0b7ea33478935b534ff54fb18b48e45344ec1a0c1d"}
Apr 20 20:09:50.366627 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.366605 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:50.460168 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460094 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy-web\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.460168 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460142 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkl2n\" (UniqueName: \"kubernetes.io/projected/d42cac2c-c2b8-4283-8162-b4b536610ab8-kube-api-access-gkl2n\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.460168 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460163 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d42cac2c-c2b8-4283-8162-b4b536610ab8-tls-assets\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.460550 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460212 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-main-tls\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.460550 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460255 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-config-volume\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.460550 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460304 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d42cac2c-c2b8-4283-8162-b4b536610ab8-alertmanager-main-db\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.460550 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460328 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.460550 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460358 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d42cac2c-c2b8-4283-8162-b4b536610ab8-metrics-client-ca\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.460550 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460385 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-cluster-tls-config\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.460550 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460415 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-web-config\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.460550 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460459 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d42cac2c-c2b8-4283-8162-b4b536610ab8-config-out\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.460550 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460518 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.460550 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.460553 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d42cac2c-c2b8-4283-8162-b4b536610ab8-alertmanager-trusted-ca-bundle\") pod \"d42cac2c-c2b8-4283-8162-b4b536610ab8\" (UID: \"d42cac2c-c2b8-4283-8162-b4b536610ab8\") "
Apr 20 20:09:50.461642 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.461355 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d42cac2c-c2b8-4283-8162-b4b536610ab8-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:09:50.462076 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.462053 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d42cac2c-c2b8-4283-8162-b4b536610ab8-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:09:50.462358 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.462332 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d42cac2c-c2b8-4283-8162-b4b536610ab8-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:09:50.463055 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.463025 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-config-volume" (OuterVolumeSpecName: "config-volume") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:50.463531 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.463505 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d42cac2c-c2b8-4283-8162-b4b536610ab8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:09:50.463631 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.463596 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:50.463829 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.463793 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d42cac2c-c2b8-4283-8162-b4b536610ab8-kube-api-access-gkl2n" (OuterVolumeSpecName: "kube-api-access-gkl2n") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "kube-api-access-gkl2n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:09:50.464011 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.463985 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:50.464204 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.464165 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:50.464304 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.464241 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d42cac2c-c2b8-4283-8162-b4b536610ab8-config-out" (OuterVolumeSpecName: "config-out") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:09:50.464916 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.464891 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:50.468647 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.468619 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:50.475387 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.475366 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-web-config" (OuterVolumeSpecName: "web-config") pod "d42cac2c-c2b8-4283-8162-b4b536610ab8" (UID: "d42cac2c-c2b8-4283-8162-b4b536610ab8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:09:50.562328 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562297 2574 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-config-volume\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:50.562328 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562327 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d42cac2c-c2b8-4283-8162-b4b536610ab8-alertmanager-main-db\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:50.562441 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562338 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:50.562441 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562348 2574 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d42cac2c-c2b8-4283-8162-b4b536610ab8-metrics-client-ca\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:50.562441 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562356 2574 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-cluster-tls-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:50.562441 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562365 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-web-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:50.562441 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562373 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d42cac2c-c2b8-4283-8162-b4b536610ab8-config-out\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:50.562441 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562381 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:50.562441 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562390 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d42cac2c-c2b8-4283-8162-b4b536610ab8-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:50.562441 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562399 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:50.562441 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562408 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gkl2n\" (UniqueName: \"kubernetes.io/projected/d42cac2c-c2b8-4283-8162-b4b536610ab8-kube-api-access-gkl2n\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:50.562441 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562416 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d42cac2c-c2b8-4283-8162-b4b536610ab8-tls-assets\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:50.562441 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:50.562425 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d42cac2c-c2b8-4283-8162-b4b536610ab8-secret-alertmanager-main-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:09:51.265283 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.265237 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d42cac2c-c2b8-4283-8162-b4b536610ab8","Type":"ContainerDied","Data":"45ab978f446ee95f29be5882c6d64e8fbeaa45fd20ad17c9d1a24ddc8929967c"}
Apr 20 20:09:51.265622 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.265302 2574 scope.go:117] "RemoveContainer" containerID="ffa35def9acb87edf663226a6dd6670b4624bebcec5f16233e6f0faddd528965"
Apr 20 20:09:51.265622 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.265395 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.272921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.272906 2574 scope.go:117] "RemoveContainer" containerID="41b59b6df72550cff3ba0e5f285e7373df948192c461381edeadd10f12c7a8c5"
Apr 20 20:09:51.280032 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.280016 2574 scope.go:117] "RemoveContainer" containerID="99404cd2b08572921c4acae03ec7dd9caa9250f494b33dc558bf1e68dd96469b"
Apr 20 20:09:51.287395 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.287379 2574 scope.go:117] "RemoveContainer" containerID="a81bc525f2c1fcf6f7c81c0b7ea33478935b534ff54fb18b48e45344ec1a0c1d"
Apr 20 20:09:51.287647 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.287627 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 20:09:51.291370 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.291349 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 20:09:51.294159 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.294144 2574 scope.go:117] "RemoveContainer" containerID="8dcdbbbda5246b6216152a8cdea90d983631a8fb0738e33db1247c1c937ec0bf"
Apr 20 20:09:51.300383 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.300368 2574 scope.go:117] "RemoveContainer" containerID="37f69aad6b306341890b411e3e0c09fe9d0f7d087c9e3e7f90614c6a7d78a264"
Apr 20 20:09:51.306500 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.306485 2574 scope.go:117] "RemoveContainer" containerID="e859c6cef4c9d49e47128b615f9c0e884b8e32d012d582112ca1a0fd168d53c8"
Apr 20 20:09:51.319936 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.319917 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 20:09:51.320205 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320192 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="kube-rbac-proxy-web"
Apr 20 20:09:51.320250 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320207 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="kube-rbac-proxy-web"
Apr 20 20:09:51.320250 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320217 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="alertmanager"
Apr 20 20:09:51.320250 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320223 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="alertmanager"
Apr 20 20:09:51.320250 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320230 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="config-reloader"
Apr 20 20:09:51.320250 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320235 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="config-reloader"
Apr 20 20:09:51.320250 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320247 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f31ebc7-2994-45d9-9434-7da5efc7d6bc" containerName="registry"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320254 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f31ebc7-2994-45d9-9434-7da5efc7d6bc" containerName="registry"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320342 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="init-config-reloader"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320349 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="init-config-reloader"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320360 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="prom-label-proxy"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320365 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="prom-label-proxy"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320372 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="kube-rbac-proxy-metric"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320377 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="kube-rbac-proxy-metric"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320384 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="kube-rbac-proxy"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320390 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="kube-rbac-proxy"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320447 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="prom-label-proxy"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320455 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="kube-rbac-proxy-metric"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320462 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="alertmanager"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320468 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="kube-rbac-proxy"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320473 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="kube-rbac-proxy-web"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320479 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" containerName="config-reloader"
Apr 20 20:09:51.320484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.320485 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f31ebc7-2994-45d9-9434-7da5efc7d6bc" containerName="registry"
Apr 20 20:09:51.324768 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.324753 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.327003 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.326970 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-gwhrt\""
Apr 20 20:09:51.327003 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.326970 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 20:09:51.327194 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.326974 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 20:09:51.327194 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.326974 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 20:09:51.327429 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.327412 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 20:09:51.327429 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.327420 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 20:09:51.327547 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.327428 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 20:09:51.327547 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.327412 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 20:09:51.327547 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.327424 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 20:09:51.334199 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.334176 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 20:09:51.338135 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.338112 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 20:09:51.468620 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468534 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/efc7d93f-75ee-4556-990b-07aebff23d49-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.468620 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.468620 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468615 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.468832 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468639 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.468832 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468663 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/efc7d93f-75ee-4556-990b-07aebff23d49-tls-assets\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.468832 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/efc7d93f-75ee-4556-990b-07aebff23d49-config-out\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.468832 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468733 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc7d93f-75ee-4556-990b-07aebff23d49-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.468832 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-config-volume\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.468832 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.469006 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468859 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62hdj\" (UniqueName: \"kubernetes.io/projected/efc7d93f-75ee-4556-990b-07aebff23d49-kube-api-access-62hdj\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.469006 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468900 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-web-config\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.469006 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468921 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/efc7d93f-75ee-4556-990b-07aebff23d49-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.469006 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.468953 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.482758 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.482730 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d42cac2c-c2b8-4283-8162-b4b536610ab8" path="/var/lib/kubelet/pods/d42cac2c-c2b8-4283-8162-b4b536610ab8/volumes"
Apr 20 20:09:51.569915 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.569890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570064 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.569921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/efc7d93f-75ee-4556-990b-07aebff23d49-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570064 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.569941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570064 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.569992 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570242 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.570141 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570242 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.570190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/efc7d93f-75ee-4556-990b-07aebff23d49-tls-assets\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570242 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.570223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/efc7d93f-75ee-4556-990b-07aebff23d49-config-out\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570434 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.570295 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc7d93f-75ee-4556-990b-07aebff23d49-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570434 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.570396 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-config-volume\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570535 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.570446 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570535 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.570481 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62hdj\" (UniqueName: \"kubernetes.io/projected/efc7d93f-75ee-4556-990b-07aebff23d49-kube-api-access-62hdj\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570630 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.570558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-web-config\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570630 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.570584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/efc7d93f-75ee-4556-990b-07aebff23d49-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570787 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.570737 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/efc7d93f-75ee-4556-990b-07aebff23d49-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.570894 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.570874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/efc7d93f-75ee-4556-990b-07aebff23d49-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.572733 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.572685 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc7d93f-75ee-4556-990b-07aebff23d49-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.573679 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.573633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.573796 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.573736 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.573796 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.573771 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.573796 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.573767 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-config-volume\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.573932 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.573838 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/efc7d93f-75ee-4556-990b-07aebff23d49-config-out\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.574310 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.574292 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/efc7d93f-75ee-4556-990b-07aebff23d49-tls-assets\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:09:51.574656 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.574637 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID:
\"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:51.575070 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.575052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:51.575141 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.575082 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/efc7d93f-75ee-4556-990b-07aebff23d49-web-config\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:51.579615 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.579596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62hdj\" (UniqueName: \"kubernetes.io/projected/efc7d93f-75ee-4556-990b-07aebff23d49-kube-api-access-62hdj\") pod \"alertmanager-main-0\" (UID: \"efc7d93f-75ee-4556-990b-07aebff23d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:51.636779 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.636755 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:51.765136 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:51.765067 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:09:51.768290 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:09:51.768245 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefc7d93f_75ee_4556_990b_07aebff23d49.slice/crio-a10d8fe5a322d90f36ca64a9a83e18b172d76807796319223b1c3a5652d4819d WatchSource:0}: Error finding container a10d8fe5a322d90f36ca64a9a83e18b172d76807796319223b1c3a5652d4819d: Status 404 returned error can't find the container with id a10d8fe5a322d90f36ca64a9a83e18b172d76807796319223b1c3a5652d4819d Apr 20 20:09:52.269603 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:52.269566 2574 generic.go:358] "Generic (PLEG): container finished" podID="efc7d93f-75ee-4556-990b-07aebff23d49" containerID="e540ee509845c0e17b48bc1330a8f65abefc7fa4329f78e57ea6f77aef37facb" exitCode=0 Apr 20 20:09:52.269950 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:52.269613 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"efc7d93f-75ee-4556-990b-07aebff23d49","Type":"ContainerDied","Data":"e540ee509845c0e17b48bc1330a8f65abefc7fa4329f78e57ea6f77aef37facb"} Apr 20 20:09:52.269950 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:52.269631 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"efc7d93f-75ee-4556-990b-07aebff23d49","Type":"ContainerStarted","Data":"a10d8fe5a322d90f36ca64a9a83e18b172d76807796319223b1c3a5652d4819d"} Apr 20 20:09:53.275814 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:53.275783 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"efc7d93f-75ee-4556-990b-07aebff23d49","Type":"ContainerStarted","Data":"d366fcdcf9748cdfeee5a984451b7d98f513558db61027806a2529354bd474e5"} Apr 20 20:09:53.275814 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:53.275815 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"efc7d93f-75ee-4556-990b-07aebff23d49","Type":"ContainerStarted","Data":"7c9d6406900e784475434d5b96174b177611a8626043ea3459068f0958d23f68"} Apr 20 20:09:53.276385 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:53.275825 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"efc7d93f-75ee-4556-990b-07aebff23d49","Type":"ContainerStarted","Data":"09839f946076c0eb65adb625621453fb5b84ed811dd33e43c041794711e843eb"} Apr 20 20:09:53.276385 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:53.275834 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"efc7d93f-75ee-4556-990b-07aebff23d49","Type":"ContainerStarted","Data":"249b95c09fdd64e9640646fa14f19af4b09c7d9319f2b40283537265cdc6f022"} Apr 20 20:09:53.276385 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:53.275844 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"efc7d93f-75ee-4556-990b-07aebff23d49","Type":"ContainerStarted","Data":"d14252461c2d2f8c5d6e36ada82ddb836c7eaa76ed8e9d5bbf4e0776856354e5"} Apr 20 20:09:53.276385 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:53.275852 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"efc7d93f-75ee-4556-990b-07aebff23d49","Type":"ContainerStarted","Data":"1a3a94a89c54e80001eee01bec5c6e680210409a4df94c1d8a6b0c2eb220c631"} Apr 20 20:09:53.303060 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:53.303020 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.303004765 podStartE2EDuration="2.303004765s" podCreationTimestamp="2026-04-20 20:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:09:53.300618944 +0000 UTC m=+248.409674650" watchObservedRunningTime="2026-04-20 20:09:53.303004765 +0000 UTC m=+248.412060471" Apr 20 20:09:57.214603 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:57.214562 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:09:57.216929 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:57.216909 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89492d29-88c3-44e3-adc2-eda0304a1081-metrics-certs\") pod \"network-metrics-daemon-kzfxf\" (UID: \"89492d29-88c3-44e3-adc2-eda0304a1081\") " pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:09:57.282083 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:57.282063 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gwc22\"" Apr 20 20:09:57.290597 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:57.290581 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kzfxf" Apr 20 20:09:57.407233 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:57.407201 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kzfxf"] Apr 20 20:09:57.409883 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:09:57.409858 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89492d29_88c3_44e3_adc2_eda0304a1081.slice/crio-4a071d890e5480ca316cc2ac66c9f769d64c8f8f557303e62fd9662c894eda0c WatchSource:0}: Error finding container 4a071d890e5480ca316cc2ac66c9f769d64c8f8f557303e62fd9662c894eda0c: Status 404 returned error can't find the container with id 4a071d890e5480ca316cc2ac66c9f769d64c8f8f557303e62fd9662c894eda0c Apr 20 20:09:58.293474 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:58.293439 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kzfxf" event={"ID":"89492d29-88c3-44e3-adc2-eda0304a1081","Type":"ContainerStarted","Data":"4a071d890e5480ca316cc2ac66c9f769d64c8f8f557303e62fd9662c894eda0c"} Apr 20 20:09:59.299207 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:59.299167 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kzfxf" event={"ID":"89492d29-88c3-44e3-adc2-eda0304a1081","Type":"ContainerStarted","Data":"2bcc585c0c8fbf55a7b8efea402cb713cd31de3debfd089ca0b91fde7f36b8ab"} Apr 20 20:09:59.299207 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:59.299204 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kzfxf" event={"ID":"89492d29-88c3-44e3-adc2-eda0304a1081","Type":"ContainerStarted","Data":"12951f78d6253dabf705dea39c17680ee5286be7824ccda6c7d11d54ed109787"} Apr 20 20:09:59.314636 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:09:59.314587 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-kzfxf" podStartSLOduration=253.317181803 podStartE2EDuration="4m14.314573004s" podCreationTimestamp="2026-04-20 20:05:45 +0000 UTC" firstStartedPulling="2026-04-20 20:09:57.411674621 +0000 UTC m=+252.520730306" lastFinishedPulling="2026-04-20 20:09:58.40906582 +0000 UTC m=+253.518121507" observedRunningTime="2026-04-20 20:09:59.313495977 +0000 UTC m=+254.422551684" watchObservedRunningTime="2026-04-20 20:09:59.314573004 +0000 UTC m=+254.423628711" Apr 20 20:10:44.387967 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.387934 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-kfprs"] Apr 20 20:10:44.391389 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.391366 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kfprs" Apr 20 20:10:44.393599 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.393580 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 20:10:44.402027 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.402005 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kfprs"] Apr 20 20:10:44.455913 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.455888 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/82247d4d-96be-48e7-adfa-641c9ae39250-kubelet-config\") pod \"global-pull-secret-syncer-kfprs\" (UID: \"82247d4d-96be-48e7-adfa-641c9ae39250\") " pod="kube-system/global-pull-secret-syncer-kfprs" Apr 20 20:10:44.456033 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.455921 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/82247d4d-96be-48e7-adfa-641c9ae39250-dbus\") pod 
\"global-pull-secret-syncer-kfprs\" (UID: \"82247d4d-96be-48e7-adfa-641c9ae39250\") " pod="kube-system/global-pull-secret-syncer-kfprs" Apr 20 20:10:44.456033 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.455946 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/82247d4d-96be-48e7-adfa-641c9ae39250-original-pull-secret\") pod \"global-pull-secret-syncer-kfprs\" (UID: \"82247d4d-96be-48e7-adfa-641c9ae39250\") " pod="kube-system/global-pull-secret-syncer-kfprs" Apr 20 20:10:44.557051 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.557022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/82247d4d-96be-48e7-adfa-641c9ae39250-dbus\") pod \"global-pull-secret-syncer-kfprs\" (UID: \"82247d4d-96be-48e7-adfa-641c9ae39250\") " pod="kube-system/global-pull-secret-syncer-kfprs" Apr 20 20:10:44.557214 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.557061 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/82247d4d-96be-48e7-adfa-641c9ae39250-original-pull-secret\") pod \"global-pull-secret-syncer-kfprs\" (UID: \"82247d4d-96be-48e7-adfa-641c9ae39250\") " pod="kube-system/global-pull-secret-syncer-kfprs" Apr 20 20:10:44.557214 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.557127 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/82247d4d-96be-48e7-adfa-641c9ae39250-kubelet-config\") pod \"global-pull-secret-syncer-kfprs\" (UID: \"82247d4d-96be-48e7-adfa-641c9ae39250\") " pod="kube-system/global-pull-secret-syncer-kfprs" Apr 20 20:10:44.557214 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.557198 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/82247d4d-96be-48e7-adfa-641c9ae39250-kubelet-config\") pod \"global-pull-secret-syncer-kfprs\" (UID: \"82247d4d-96be-48e7-adfa-641c9ae39250\") " pod="kube-system/global-pull-secret-syncer-kfprs" Apr 20 20:10:44.557378 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.557236 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/82247d4d-96be-48e7-adfa-641c9ae39250-dbus\") pod \"global-pull-secret-syncer-kfprs\" (UID: \"82247d4d-96be-48e7-adfa-641c9ae39250\") " pod="kube-system/global-pull-secret-syncer-kfprs" Apr 20 20:10:44.559412 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.559397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/82247d4d-96be-48e7-adfa-641c9ae39250-original-pull-secret\") pod \"global-pull-secret-syncer-kfprs\" (UID: \"82247d4d-96be-48e7-adfa-641c9ae39250\") " pod="kube-system/global-pull-secret-syncer-kfprs" Apr 20 20:10:44.700612 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.700576 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kfprs" Apr 20 20:10:44.819511 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:44.819483 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kfprs"] Apr 20 20:10:44.823657 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:10:44.823628 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82247d4d_96be_48e7_adfa_641c9ae39250.slice/crio-2087a9b786da5116883cd67ab00806afe0ca129a67e73a6f4696079c5039d2db WatchSource:0}: Error finding container 2087a9b786da5116883cd67ab00806afe0ca129a67e73a6f4696079c5039d2db: Status 404 returned error can't find the container with id 2087a9b786da5116883cd67ab00806afe0ca129a67e73a6f4696079c5039d2db Apr 20 20:10:45.357511 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:45.357477 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:10:45.357952 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:45.357926 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:10:45.374037 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:45.374014 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:10:45.374561 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:45.374540 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:10:45.378031 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:45.378013 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 
20:10:45.436108 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:45.435977 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kfprs" event={"ID":"82247d4d-96be-48e7-adfa-641c9ae39250","Type":"ContainerStarted","Data":"2087a9b786da5116883cd67ab00806afe0ca129a67e73a6f4696079c5039d2db"} Apr 20 20:10:49.449219 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:49.449182 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kfprs" event={"ID":"82247d4d-96be-48e7-adfa-641c9ae39250","Type":"ContainerStarted","Data":"951e23ca70ad2ad024b36918bc0d217bdfb8f6cf3fb43634279d24bfccb563c8"} Apr 20 20:10:49.464857 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:10:49.464816 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kfprs" podStartSLOduration=1.352724974 podStartE2EDuration="5.464802917s" podCreationTimestamp="2026-04-20 20:10:44 +0000 UTC" firstStartedPulling="2026-04-20 20:10:44.825227708 +0000 UTC m=+299.934283397" lastFinishedPulling="2026-04-20 20:10:48.937305643 +0000 UTC m=+304.046361340" observedRunningTime="2026-04-20 20:10:49.463095199 +0000 UTC m=+304.572150906" watchObservedRunningTime="2026-04-20 20:10:49.464802917 +0000 UTC m=+304.573858624" Apr 20 20:13:09.963337 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:09.963303 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-7rvkf"] Apr 20 20:13:09.966698 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:09.966682 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" Apr 20 20:13:09.969569 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:09.969545 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 20 20:13:09.969713 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:09.969574 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-v855k\"" Apr 20 20:13:09.969940 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:09.969919 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 20 20:13:09.970150 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:09.970136 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 20 20:13:09.970813 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:09.970790 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc"] Apr 20 20:13:09.973898 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:09.973878 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" Apr 20 20:13:09.976065 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:09.976044 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-7rvkf"] Apr 20 20:13:09.976641 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:09.976626 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 20 20:13:09.976727 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:09.976654 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-5tgsg\"" Apr 20 20:13:09.987850 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:09.987825 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc"] Apr 20 20:13:10.090744 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.090712 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecdac342-be1e-4f85-96c2-05c18f648c1f-cert\") pod \"kserve-controller-manager-6f655776dd-7rvkf\" (UID: \"ecdac342-be1e-4f85-96c2-05c18f648c1f\") " pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" Apr 20 20:13:10.090883 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.090751 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8f9fc\" (UID: \"6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" Apr 20 20:13:10.090883 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.090783 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2t4s\" (UniqueName: 
\"kubernetes.io/projected/ecdac342-be1e-4f85-96c2-05c18f648c1f-kube-api-access-r2t4s\") pod \"kserve-controller-manager-6f655776dd-7rvkf\" (UID: \"ecdac342-be1e-4f85-96c2-05c18f648c1f\") " pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" Apr 20 20:13:10.090883 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.090836 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-685b7\" (UniqueName: \"kubernetes.io/projected/6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f-kube-api-access-685b7\") pod \"llmisvc-controller-manager-68cc5db7c4-8f9fc\" (UID: \"6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" Apr 20 20:13:10.191228 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.191197 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecdac342-be1e-4f85-96c2-05c18f648c1f-cert\") pod \"kserve-controller-manager-6f655776dd-7rvkf\" (UID: \"ecdac342-be1e-4f85-96c2-05c18f648c1f\") " pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" Apr 20 20:13:10.191375 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.191234 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8f9fc\" (UID: \"6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" Apr 20 20:13:10.191375 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.191303 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2t4s\" (UniqueName: \"kubernetes.io/projected/ecdac342-be1e-4f85-96c2-05c18f648c1f-kube-api-access-r2t4s\") pod \"kserve-controller-manager-6f655776dd-7rvkf\" (UID: \"ecdac342-be1e-4f85-96c2-05c18f648c1f\") " pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" Apr 20 
20:13:10.191375 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.191326 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-685b7\" (UniqueName: \"kubernetes.io/projected/6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f-kube-api-access-685b7\") pod \"llmisvc-controller-manager-68cc5db7c4-8f9fc\" (UID: \"6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" Apr 20 20:13:10.191523 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:13:10.191502 2574 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 20 20:13:10.191601 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:13:10.191590 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f-cert podName:6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f nodeName:}" failed. No retries permitted until 2026-04-20 20:13:10.691569084 +0000 UTC m=+445.800624772 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f-cert") pod "llmisvc-controller-manager-68cc5db7c4-8f9fc" (UID: "6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f") : secret "llmisvc-webhook-server-cert" not found Apr 20 20:13:10.193761 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.193736 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecdac342-be1e-4f85-96c2-05c18f648c1f-cert\") pod \"kserve-controller-manager-6f655776dd-7rvkf\" (UID: \"ecdac342-be1e-4f85-96c2-05c18f648c1f\") " pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" Apr 20 20:13:10.200253 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.200227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2t4s\" (UniqueName: \"kubernetes.io/projected/ecdac342-be1e-4f85-96c2-05c18f648c1f-kube-api-access-r2t4s\") pod \"kserve-controller-manager-6f655776dd-7rvkf\" (UID: \"ecdac342-be1e-4f85-96c2-05c18f648c1f\") " pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" Apr 20 20:13:10.200727 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.200706 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-685b7\" (UniqueName: \"kubernetes.io/projected/6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f-kube-api-access-685b7\") pod \"llmisvc-controller-manager-68cc5db7c4-8f9fc\" (UID: \"6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" Apr 20 20:13:10.279506 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.279451 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" Apr 20 20:13:10.404660 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.404637 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-7rvkf"] Apr 20 20:13:10.406513 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:13:10.406488 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecdac342_be1e_4f85_96c2_05c18f648c1f.slice/crio-b5f24ec95058df0134a3731eed7bba7dada0de22b57e86ba107f332cd2536c7e WatchSource:0}: Error finding container b5f24ec95058df0134a3731eed7bba7dada0de22b57e86ba107f332cd2536c7e: Status 404 returned error can't find the container with id b5f24ec95058df0134a3731eed7bba7dada0de22b57e86ba107f332cd2536c7e Apr 20 20:13:10.407749 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.407734 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:13:10.695831 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.695788 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8f9fc\" (UID: \"6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" Apr 20 20:13:10.698191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.698171 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8f9fc\" (UID: \"6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" Apr 20 20:13:10.846056 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.846025 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" event={"ID":"ecdac342-be1e-4f85-96c2-05c18f648c1f","Type":"ContainerStarted","Data":"b5f24ec95058df0134a3731eed7bba7dada0de22b57e86ba107f332cd2536c7e"} Apr 20 20:13:10.887374 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:10.887349 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" Apr 20 20:13:11.011016 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:11.010982 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc"] Apr 20 20:13:11.015717 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:13:11.015678 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6fef7ec2_36fc_4184_b7f1_0b2c5cbbc91f.slice/crio-52f165edf1c713b1c22b9274c80194996cf18a4ecb9459272ab9f95cadad03e6 WatchSource:0}: Error finding container 52f165edf1c713b1c22b9274c80194996cf18a4ecb9459272ab9f95cadad03e6: Status 404 returned error can't find the container with id 52f165edf1c713b1c22b9274c80194996cf18a4ecb9459272ab9f95cadad03e6 Apr 20 20:13:11.853473 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:11.853409 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" event={"ID":"6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f","Type":"ContainerStarted","Data":"52f165edf1c713b1c22b9274c80194996cf18a4ecb9459272ab9f95cadad03e6"} Apr 20 20:13:14.865848 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:14.865811 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" event={"ID":"6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f","Type":"ContainerStarted","Data":"a4f35118b76522b8cea7eb872e921c48a2eb3aba7339d8779145608b765bd130"} Apr 20 20:13:14.866250 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:14.865938 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" Apr 20 20:13:14.867081 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:14.867065 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" event={"ID":"ecdac342-be1e-4f85-96c2-05c18f648c1f","Type":"ContainerStarted","Data":"e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383"} Apr 20 20:13:14.867208 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:14.867197 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" Apr 20 20:13:14.884567 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:14.884519 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" podStartSLOduration=2.8380264029999998 podStartE2EDuration="5.884476365s" podCreationTimestamp="2026-04-20 20:13:09 +0000 UTC" firstStartedPulling="2026-04-20 20:13:11.017911249 +0000 UTC m=+446.126966950" lastFinishedPulling="2026-04-20 20:13:14.064361227 +0000 UTC m=+449.173416912" observedRunningTime="2026-04-20 20:13:14.883178204 +0000 UTC m=+449.992233910" watchObservedRunningTime="2026-04-20 20:13:14.884476365 +0000 UTC m=+449.993532077" Apr 20 20:13:14.898890 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:14.898854 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" podStartSLOduration=2.294151984 podStartE2EDuration="5.898841683s" podCreationTimestamp="2026-04-20 20:13:09 +0000 UTC" firstStartedPulling="2026-04-20 20:13:10.407855418 +0000 UTC m=+445.516911104" lastFinishedPulling="2026-04-20 20:13:14.012545119 +0000 UTC m=+449.121600803" observedRunningTime="2026-04-20 20:13:14.898637588 +0000 UTC m=+450.007693311" watchObservedRunningTime="2026-04-20 20:13:14.898841683 +0000 UTC m=+450.007897390" Apr 20 20:13:45.872153 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:13:45.872106 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8f9fc" Apr 20 20:13:45.875236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:45.875216 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" Apr 20 20:13:47.060684 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.060654 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-7rvkf"] Apr 20 20:13:47.061055 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.060840 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" podUID="ecdac342-be1e-4f85-96c2-05c18f648c1f" containerName="manager" containerID="cri-o://e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383" gracePeriod=10 Apr 20 20:13:47.089276 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.089242 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-d9cwf"] Apr 20 20:13:47.092400 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.092382 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" Apr 20 20:13:47.103696 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.103675 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-d9cwf"] Apr 20 20:13:47.185289 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.184573 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc6rb\" (UniqueName: \"kubernetes.io/projected/fcacfdd2-50a1-4d88-b72f-c1de2da9cad6-kube-api-access-rc6rb\") pod \"kserve-controller-manager-6f655776dd-d9cwf\" (UID: \"fcacfdd2-50a1-4d88-b72f-c1de2da9cad6\") " pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" Apr 20 20:13:47.185289 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.184634 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcacfdd2-50a1-4d88-b72f-c1de2da9cad6-cert\") pod \"kserve-controller-manager-6f655776dd-d9cwf\" (UID: \"fcacfdd2-50a1-4d88-b72f-c1de2da9cad6\") " pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" Apr 20 20:13:47.285524 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.285495 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rc6rb\" (UniqueName: \"kubernetes.io/projected/fcacfdd2-50a1-4d88-b72f-c1de2da9cad6-kube-api-access-rc6rb\") pod \"kserve-controller-manager-6f655776dd-d9cwf\" (UID: \"fcacfdd2-50a1-4d88-b72f-c1de2da9cad6\") " pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" Apr 20 20:13:47.285644 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.285536 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcacfdd2-50a1-4d88-b72f-c1de2da9cad6-cert\") pod \"kserve-controller-manager-6f655776dd-d9cwf\" (UID: \"fcacfdd2-50a1-4d88-b72f-c1de2da9cad6\") " 
pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" Apr 20 20:13:47.288085 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.288052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcacfdd2-50a1-4d88-b72f-c1de2da9cad6-cert\") pod \"kserve-controller-manager-6f655776dd-d9cwf\" (UID: \"fcacfdd2-50a1-4d88-b72f-c1de2da9cad6\") " pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" Apr 20 20:13:47.295094 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.295065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc6rb\" (UniqueName: \"kubernetes.io/projected/fcacfdd2-50a1-4d88-b72f-c1de2da9cad6-kube-api-access-rc6rb\") pod \"kserve-controller-manager-6f655776dd-d9cwf\" (UID: \"fcacfdd2-50a1-4d88-b72f-c1de2da9cad6\") " pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" Apr 20 20:13:47.305663 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.305640 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" Apr 20 20:13:47.386707 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.386640 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2t4s\" (UniqueName: \"kubernetes.io/projected/ecdac342-be1e-4f85-96c2-05c18f648c1f-kube-api-access-r2t4s\") pod \"ecdac342-be1e-4f85-96c2-05c18f648c1f\" (UID: \"ecdac342-be1e-4f85-96c2-05c18f648c1f\") " Apr 20 20:13:47.386707 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.386698 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecdac342-be1e-4f85-96c2-05c18f648c1f-cert\") pod \"ecdac342-be1e-4f85-96c2-05c18f648c1f\" (UID: \"ecdac342-be1e-4f85-96c2-05c18f648c1f\") " Apr 20 20:13:47.388877 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.388846 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdac342-be1e-4f85-96c2-05c18f648c1f-cert" (OuterVolumeSpecName: "cert") pod "ecdac342-be1e-4f85-96c2-05c18f648c1f" (UID: "ecdac342-be1e-4f85-96c2-05c18f648c1f"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:13:47.388877 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.388849 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecdac342-be1e-4f85-96c2-05c18f648c1f-kube-api-access-r2t4s" (OuterVolumeSpecName: "kube-api-access-r2t4s") pod "ecdac342-be1e-4f85-96c2-05c18f648c1f" (UID: "ecdac342-be1e-4f85-96c2-05c18f648c1f"). InnerVolumeSpecName "kube-api-access-r2t4s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:13:47.442057 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.442031 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" Apr 20 20:13:47.487867 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.487838 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r2t4s\" (UniqueName: \"kubernetes.io/projected/ecdac342-be1e-4f85-96c2-05c18f648c1f-kube-api-access-r2t4s\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:13:47.488049 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.488034 2574 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecdac342-be1e-4f85-96c2-05c18f648c1f-cert\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:13:47.575805 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.575778 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-d9cwf"] Apr 20 20:13:47.578058 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:13:47.578034 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcacfdd2_50a1_4d88_b72f_c1de2da9cad6.slice/crio-aecacb0828fbd05d6c8a8beed0c792ef10e650a0bf6b50ea6f16067bdd37f2de WatchSource:0}: Error finding container aecacb0828fbd05d6c8a8beed0c792ef10e650a0bf6b50ea6f16067bdd37f2de: Status 404 returned error can't find the container with id aecacb0828fbd05d6c8a8beed0c792ef10e650a0bf6b50ea6f16067bdd37f2de Apr 20 20:13:47.973392 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.973356 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" event={"ID":"fcacfdd2-50a1-4d88-b72f-c1de2da9cad6","Type":"ContainerStarted","Data":"aecacb0828fbd05d6c8a8beed0c792ef10e650a0bf6b50ea6f16067bdd37f2de"} Apr 20 20:13:47.974324 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.974298 2574 generic.go:358] "Generic (PLEG): container finished" podID="ecdac342-be1e-4f85-96c2-05c18f648c1f" 
containerID="e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383" exitCode=0 Apr 20 20:13:47.974388 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.974336 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" event={"ID":"ecdac342-be1e-4f85-96c2-05c18f648c1f","Type":"ContainerDied","Data":"e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383"} Apr 20 20:13:47.974388 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.974354 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" event={"ID":"ecdac342-be1e-4f85-96c2-05c18f648c1f","Type":"ContainerDied","Data":"b5f24ec95058df0134a3731eed7bba7dada0de22b57e86ba107f332cd2536c7e"} Apr 20 20:13:47.974388 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.974360 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-7rvkf" Apr 20 20:13:47.974388 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.974371 2574 scope.go:117] "RemoveContainer" containerID="e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383" Apr 20 20:13:47.985586 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.985564 2574 scope.go:117] "RemoveContainer" containerID="e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383" Apr 20 20:13:47.985952 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:13:47.985934 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383\": container with ID starting with e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383 not found: ID does not exist" containerID="e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383" Apr 20 20:13:47.986024 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.985959 2574 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383"} err="failed to get container status \"e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383\": rpc error: code = NotFound desc = could not find container \"e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383\": container with ID starting with e36ff0da8f04016eb2dcd61505d9f89ffdfe5350a030648377d073d6ae11e383 not found: ID does not exist" Apr 20 20:13:47.996297 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:47.996237 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-7rvkf"] Apr 20 20:13:48.003276 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:48.003240 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-7rvkf"] Apr 20 20:13:48.978676 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:48.978642 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" event={"ID":"fcacfdd2-50a1-4d88-b72f-c1de2da9cad6","Type":"ContainerStarted","Data":"e46777131022c72e5a899e87d844985cd896f0f59baa957236b87682ea90016c"} Apr 20 20:13:48.979070 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:48.978721 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" Apr 20 20:13:48.997724 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:48.997675 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" podStartSLOduration=1.584953485 podStartE2EDuration="1.997662165s" podCreationTimestamp="2026-04-20 20:13:47 +0000 UTC" firstStartedPulling="2026-04-20 20:13:47.579358517 +0000 UTC m=+482.688414201" lastFinishedPulling="2026-04-20 20:13:47.992067194 +0000 UTC m=+483.101122881" observedRunningTime="2026-04-20 20:13:48.996812396 +0000 UTC m=+484.105868102" 
watchObservedRunningTime="2026-04-20 20:13:48.997662165 +0000 UTC m=+484.106717930" Apr 20 20:13:49.482341 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:13:49.482304 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecdac342-be1e-4f85-96c2-05c18f648c1f" path="/var/lib/kubelet/pods/ecdac342-be1e-4f85-96c2-05c18f648c1f/volumes" Apr 20 20:14:19.987096 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:14:19.987066 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-d9cwf" Apr 20 20:15:06.249814 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.249783 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-659fp"] Apr 20 20:15:06.250251 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.250114 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecdac342-be1e-4f85-96c2-05c18f648c1f" containerName="manager" Apr 20 20:15:06.250251 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.250125 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdac342-be1e-4f85-96c2-05c18f648c1f" containerName="manager" Apr 20 20:15:06.250251 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.250170 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ecdac342-be1e-4f85-96c2-05c18f648c1f" containerName="manager" Apr 20 20:15:06.253011 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.252995 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-659fp" Apr 20 20:15:06.255277 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.255241 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 20 20:15:06.255277 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.255255 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-pxbpx\"" Apr 20 20:15:06.260242 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.260224 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-659fp"] Apr 20 20:15:06.332650 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.332619 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sckv\" (UniqueName: \"kubernetes.io/projected/01fd8c5f-b99c-42a6-b588-dd208b4c22de-kube-api-access-6sckv\") pod \"s3-tls-init-serving-659fp\" (UID: \"01fd8c5f-b99c-42a6-b588-dd208b4c22de\") " pod="kserve/s3-tls-init-serving-659fp" Apr 20 20:15:06.433570 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.433540 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6sckv\" (UniqueName: \"kubernetes.io/projected/01fd8c5f-b99c-42a6-b588-dd208b4c22de-kube-api-access-6sckv\") pod \"s3-tls-init-serving-659fp\" (UID: \"01fd8c5f-b99c-42a6-b588-dd208b4c22de\") " pod="kserve/s3-tls-init-serving-659fp" Apr 20 20:15:06.442394 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.442362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sckv\" (UniqueName: \"kubernetes.io/projected/01fd8c5f-b99c-42a6-b588-dd208b4c22de-kube-api-access-6sckv\") pod \"s3-tls-init-serving-659fp\" (UID: \"01fd8c5f-b99c-42a6-b588-dd208b4c22de\") " pod="kserve/s3-tls-init-serving-659fp" Apr 20 20:15:06.576739 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.576651 2574 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-659fp" Apr 20 20:15:06.692699 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:06.692579 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-659fp"] Apr 20 20:15:06.695723 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:15:06.695699 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01fd8c5f_b99c_42a6_b588_dd208b4c22de.slice/crio-a35d914890f896b1976b313e42a48259091d274eee8c842a15fd32ba5534171b WatchSource:0}: Error finding container a35d914890f896b1976b313e42a48259091d274eee8c842a15fd32ba5534171b: Status 404 returned error can't find the container with id a35d914890f896b1976b313e42a48259091d274eee8c842a15fd32ba5534171b Apr 20 20:15:07.211629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:07.211555 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-659fp" event={"ID":"01fd8c5f-b99c-42a6-b588-dd208b4c22de","Type":"ContainerStarted","Data":"a35d914890f896b1976b313e42a48259091d274eee8c842a15fd32ba5534171b"} Apr 20 20:15:11.225852 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:11.225825 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-659fp" event={"ID":"01fd8c5f-b99c-42a6-b588-dd208b4c22de","Type":"ContainerStarted","Data":"445384afe0f3f94dfd576794f73b6156bffb0c736d38be6c09cd1b13bcbe3bdd"} Apr 20 20:15:11.240794 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:11.240741 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-659fp" podStartSLOduration=0.7910938 podStartE2EDuration="5.240723316s" podCreationTimestamp="2026-04-20 20:15:06 +0000 UTC" firstStartedPulling="2026-04-20 20:15:06.697745584 +0000 UTC m=+561.806801269" lastFinishedPulling="2026-04-20 20:15:11.147375098 +0000 UTC m=+566.256430785" observedRunningTime="2026-04-20 
20:15:11.239206292 +0000 UTC m=+566.348262001" watchObservedRunningTime="2026-04-20 20:15:11.240723316 +0000 UTC m=+566.349779024" Apr 20 20:15:16.724616 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:15:16.724577 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01fd8c5f_b99c_42a6_b588_dd208b4c22de.slice/crio-445384afe0f3f94dfd576794f73b6156bffb0c736d38be6c09cd1b13bcbe3bdd.scope\": RecentStats: unable to find data in memory cache]" Apr 20 20:15:17.245676 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:17.245639 2574 generic.go:358] "Generic (PLEG): container finished" podID="01fd8c5f-b99c-42a6-b588-dd208b4c22de" containerID="445384afe0f3f94dfd576794f73b6156bffb0c736d38be6c09cd1b13bcbe3bdd" exitCode=0 Apr 20 20:15:17.245843 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:17.245718 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-659fp" event={"ID":"01fd8c5f-b99c-42a6-b588-dd208b4c22de","Type":"ContainerDied","Data":"445384afe0f3f94dfd576794f73b6156bffb0c736d38be6c09cd1b13bcbe3bdd"} Apr 20 20:15:18.373685 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:18.373661 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-659fp" Apr 20 20:15:18.448633 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:18.448598 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sckv\" (UniqueName: \"kubernetes.io/projected/01fd8c5f-b99c-42a6-b588-dd208b4c22de-kube-api-access-6sckv\") pod \"01fd8c5f-b99c-42a6-b588-dd208b4c22de\" (UID: \"01fd8c5f-b99c-42a6-b588-dd208b4c22de\") " Apr 20 20:15:18.450832 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:18.450796 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01fd8c5f-b99c-42a6-b588-dd208b4c22de-kube-api-access-6sckv" (OuterVolumeSpecName: "kube-api-access-6sckv") pod "01fd8c5f-b99c-42a6-b588-dd208b4c22de" (UID: "01fd8c5f-b99c-42a6-b588-dd208b4c22de"). InnerVolumeSpecName "kube-api-access-6sckv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:15:18.549239 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:18.549152 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6sckv\" (UniqueName: \"kubernetes.io/projected/01fd8c5f-b99c-42a6-b588-dd208b4c22de-kube-api-access-6sckv\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:15:19.252962 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:19.252925 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-659fp" Apr 20 20:15:19.252962 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:19.252946 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-659fp" event={"ID":"01fd8c5f-b99c-42a6-b588-dd208b4c22de","Type":"ContainerDied","Data":"a35d914890f896b1976b313e42a48259091d274eee8c842a15fd32ba5534171b"} Apr 20 20:15:19.253163 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:19.252972 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a35d914890f896b1976b313e42a48259091d274eee8c842a15fd32ba5534171b" Apr 20 20:15:45.390769 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:45.390740 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:15:45.392776 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:45.392752 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:15:45.396756 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:45.396734 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:15:45.398985 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:15:45.398971 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:20:45.412469 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:20:45.412436 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:20:45.416429 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:20:45.416408 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:20:45.419557 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:20:45.419533 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:20:45.425750 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:20:45.424406 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:25:45.436826 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:25:45.436789 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:25:45.443354 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:25:45.443329 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:25:45.443967 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:25:45.443943 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:25:45.449207 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:25:45.449189 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:30:27.690980 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.690943 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7"] Apr 20 20:30:27.691460 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:30:27.691284 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01fd8c5f-b99c-42a6-b588-dd208b4c22de" containerName="s3-tls-init-serving" Apr 20 20:30:27.691460 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.691295 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fd8c5f-b99c-42a6-b588-dd208b4c22de" containerName="s3-tls-init-serving" Apr 20 20:30:27.691460 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.691383 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="01fd8c5f-b99c-42a6-b588-dd208b4c22de" containerName="s3-tls-init-serving" Apr 20 20:30:27.694558 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.694537 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:27.696800 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.696779 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\"" Apr 20 20:30:27.696898 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.696830 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\"" Apr 20 20:30:27.696945 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.696892 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 20 20:30:27.697614 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.697598 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\"" Apr 20 20:30:27.697687 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.697607 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 20:30:27.704511 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:30:27.704486 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7"] Apr 20 20:30:27.787158 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.787104 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a899a23-4f4d-4ee2-924d-226945e8cf76-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:27.787358 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.787172 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a899a23-4f4d-4ee2-924d-226945e8cf76-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:27.787358 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.787202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqcvh\" (UniqueName: \"kubernetes.io/projected/4a899a23-4f4d-4ee2-924d-226945e8cf76-kube-api-access-kqcvh\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:27.787358 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.787228 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a899a23-4f4d-4ee2-924d-226945e8cf76-kserve-provision-location\") pod 
\"isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:27.887744 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.887710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a899a23-4f4d-4ee2-924d-226945e8cf76-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:27.887920 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.887770 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a899a23-4f4d-4ee2-924d-226945e8cf76-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:27.887920 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.887831 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a899a23-4f4d-4ee2-924d-226945e8cf76-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:27.887920 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.887850 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqcvh\" (UniqueName: \"kubernetes.io/projected/4a899a23-4f4d-4ee2-924d-226945e8cf76-kube-api-access-kqcvh\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:27.888191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.888169 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a899a23-4f4d-4ee2-924d-226945e8cf76-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:27.888480 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.888461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a899a23-4f4d-4ee2-924d-226945e8cf76-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:27.890471 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.890449 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a899a23-4f4d-4ee2-924d-226945e8cf76-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:27.896388 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:27.896363 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqcvh\" (UniqueName: \"kubernetes.io/projected/4a899a23-4f4d-4ee2-924d-226945e8cf76-kube-api-access-kqcvh\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:28.005440 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:30:28.005349 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:28.127858 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:28.127833 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7"] Apr 20 20:30:28.130466 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:30:28.130440 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a899a23_4f4d_4ee2_924d_226945e8cf76.slice/crio-31b3649ec4880e5c028a775d1d51482ec7c3d9c6d6fac6460b1e4e18f87e47fb WatchSource:0}: Error finding container 31b3649ec4880e5c028a775d1d51482ec7c3d9c6d6fac6460b1e4e18f87e47fb: Status 404 returned error can't find the container with id 31b3649ec4880e5c028a775d1d51482ec7c3d9c6d6fac6460b1e4e18f87e47fb Apr 20 20:30:28.132411 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:28.132395 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:30:28.959657 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:28.959619 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" event={"ID":"4a899a23-4f4d-4ee2-924d-226945e8cf76","Type":"ContainerStarted","Data":"31b3649ec4880e5c028a775d1d51482ec7c3d9c6d6fac6460b1e4e18f87e47fb"} Apr 20 20:30:32.975651 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:32.975608 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" event={"ID":"4a899a23-4f4d-4ee2-924d-226945e8cf76","Type":"ContainerStarted","Data":"4a458755f4c8c9ec642a15a5da287f1a248a2a6f5d8b382f340f01a480f51545"} Apr 20 20:30:36.988739 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:36.988700 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerID="4a458755f4c8c9ec642a15a5da287f1a248a2a6f5d8b382f340f01a480f51545" exitCode=0 Apr 20 20:30:36.989100 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:36.988781 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" event={"ID":"4a899a23-4f4d-4ee2-924d-226945e8cf76","Type":"ContainerDied","Data":"4a458755f4c8c9ec642a15a5da287f1a248a2a6f5d8b382f340f01a480f51545"} Apr 20 20:30:48.357940 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:48.357910 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:30:48.358430 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:48.358320 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:30:48.364177 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:48.364157 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:30:48.364623 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:48.364603 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:30:49.032798 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:49.032762 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" event={"ID":"4a899a23-4f4d-4ee2-924d-226945e8cf76","Type":"ContainerStarted","Data":"3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381"} Apr 20 20:30:52.044966 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:52.044929 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" event={"ID":"4a899a23-4f4d-4ee2-924d-226945e8cf76","Type":"ContainerStarted","Data":"14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861"} Apr 20 20:30:52.045372 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:52.045118 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:52.045372 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:52.045294 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:52.046527 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:52.046487 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 20 20:30:52.062714 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:52.062668 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" podStartSLOduration=1.955649528 podStartE2EDuration="25.062657231s" podCreationTimestamp="2026-04-20 20:30:27 +0000 UTC" firstStartedPulling="2026-04-20 20:30:28.132521939 +0000 UTC m=+1483.241577624" lastFinishedPulling="2026-04-20 20:30:51.239529638 +0000 UTC m=+1506.348585327" observedRunningTime="2026-04-20 20:30:52.061118147 +0000 UTC m=+1507.170173854" watchObservedRunningTime="2026-04-20 20:30:52.062657231 +0000 UTC m=+1507.171712937" Apr 20 20:30:53.048836 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:53.048796 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" 
podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 20 20:30:58.053597 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:58.053569 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:30:58.054224 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:30:58.054199 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 20 20:31:08.055102 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:08.055056 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 20 20:31:18.054552 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:18.054511 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 20 20:31:28.054860 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:28.054820 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 20 20:31:38.055111 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:38.055082 2574 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:31:49.138156 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.138120 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7"] Apr 20 20:31:49.138703 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.138563 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kserve-container" containerID="cri-o://3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381" gracePeriod=30 Apr 20 20:31:49.138703 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.138646 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kube-rbac-proxy" containerID="cri-o://14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861" gracePeriod=30 Apr 20 20:31:49.241134 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.241101 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl"] Apr 20 20:31:49.244932 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.244909 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.247060 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.247039 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\"" Apr 20 20:31:49.247150 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.247089 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 20 20:31:49.255083 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.255061 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl"] Apr 20 20:31:49.369707 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.369664 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95701d92-4aa3-4231-9631-f465984585d3-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.369856 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.369768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95701d92-4aa3-4231-9631-f465984585d3-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.369856 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.369827 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95701d92-4aa3-4231-9631-f465984585d3-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.369856 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.369854 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dffd\" (UniqueName: \"kubernetes.io/projected/95701d92-4aa3-4231-9631-f465984585d3-kube-api-access-5dffd\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.470484 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.470453 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95701d92-4aa3-4231-9631-f465984585d3-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.470653 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.470507 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95701d92-4aa3-4231-9631-f465984585d3-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.470653 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.470529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/95701d92-4aa3-4231-9631-f465984585d3-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.470653 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.470545 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dffd\" (UniqueName: \"kubernetes.io/projected/95701d92-4aa3-4231-9631-f465984585d3-kube-api-access-5dffd\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.470868 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.470846 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95701d92-4aa3-4231-9631-f465984585d3-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.471226 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.471200 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95701d92-4aa3-4231-9631-f465984585d3-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.473038 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.473019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95701d92-4aa3-4231-9631-f465984585d3-proxy-tls\") pod 
\"isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.478654 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.478632 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dffd\" (UniqueName: \"kubernetes.io/projected/95701d92-4aa3-4231-9631-f465984585d3-kube-api-access-5dffd\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.555808 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.555784 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" Apr 20 20:31:49.676038 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:49.676007 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl"] Apr 20 20:31:49.679660 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:31:49.679637 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95701d92_4aa3_4231_9631_f465984585d3.slice/crio-ec71488baef527587444cd1610005a7ea9d8599e96bacefcabca078482aae9b9 WatchSource:0}: Error finding container ec71488baef527587444cd1610005a7ea9d8599e96bacefcabca078482aae9b9: Status 404 returned error can't find the container with id ec71488baef527587444cd1610005a7ea9d8599e96bacefcabca078482aae9b9 Apr 20 20:31:50.224701 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:50.224663 2574 generic.go:358] "Generic (PLEG): container finished" podID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerID="14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861" exitCode=2 Apr 20 20:31:50.225127 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:50.224736 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" event={"ID":"4a899a23-4f4d-4ee2-924d-226945e8cf76","Type":"ContainerDied","Data":"14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861"} Apr 20 20:31:50.226151 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:50.226113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" event={"ID":"95701d92-4aa3-4231-9631-f465984585d3","Type":"ContainerStarted","Data":"2f5050ae1688ca4ab16c59069f707beda55c437b6acbce4565d154f01a7c9468"} Apr 20 20:31:50.226151 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:50.226149 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" event={"ID":"95701d92-4aa3-4231-9631-f465984585d3","Type":"ContainerStarted","Data":"ec71488baef527587444cd1610005a7ea9d8599e96bacefcabca078482aae9b9"} Apr 20 20:31:51.788605 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.788574 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:31:51.890085 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.890020 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqcvh\" (UniqueName: \"kubernetes.io/projected/4a899a23-4f4d-4ee2-924d-226945e8cf76-kube-api-access-kqcvh\") pod \"4a899a23-4f4d-4ee2-924d-226945e8cf76\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " Apr 20 20:31:51.890232 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.890096 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a899a23-4f4d-4ee2-924d-226945e8cf76-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"4a899a23-4f4d-4ee2-924d-226945e8cf76\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " Apr 20 20:31:51.890232 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.890119 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a899a23-4f4d-4ee2-924d-226945e8cf76-kserve-provision-location\") pod \"4a899a23-4f4d-4ee2-924d-226945e8cf76\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " Apr 20 20:31:51.890232 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.890148 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a899a23-4f4d-4ee2-924d-226945e8cf76-proxy-tls\") pod \"4a899a23-4f4d-4ee2-924d-226945e8cf76\" (UID: \"4a899a23-4f4d-4ee2-924d-226945e8cf76\") " Apr 20 20:31:51.890535 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.890511 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a899a23-4f4d-4ee2-924d-226945e8cf76-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "4a899a23-4f4d-4ee2-924d-226945e8cf76" (UID: "4a899a23-4f4d-4ee2-924d-226945e8cf76"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:31:51.892397 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.892373 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a899a23-4f4d-4ee2-924d-226945e8cf76-kube-api-access-kqcvh" (OuterVolumeSpecName: "kube-api-access-kqcvh") pod "4a899a23-4f4d-4ee2-924d-226945e8cf76" (UID: "4a899a23-4f4d-4ee2-924d-226945e8cf76"). InnerVolumeSpecName "kube-api-access-kqcvh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:31:51.892488 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.892400 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a899a23-4f4d-4ee2-924d-226945e8cf76-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4a899a23-4f4d-4ee2-924d-226945e8cf76" (UID: "4a899a23-4f4d-4ee2-924d-226945e8cf76"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:31:51.898122 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.898101 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a899a23-4f4d-4ee2-924d-226945e8cf76-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4a899a23-4f4d-4ee2-924d-226945e8cf76" (UID: "4a899a23-4f4d-4ee2-924d-226945e8cf76"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:31:51.990997 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.990967 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kqcvh\" (UniqueName: \"kubernetes.io/projected/4a899a23-4f4d-4ee2-924d-226945e8cf76-kube-api-access-kqcvh\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:31:51.990997 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.990993 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a899a23-4f4d-4ee2-924d-226945e8cf76-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:31:51.991158 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.991005 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a899a23-4f4d-4ee2-924d-226945e8cf76-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:31:51.991158 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:51.991016 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a899a23-4f4d-4ee2-924d-226945e8cf76-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:31:52.235137 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.235101 2574 generic.go:358] "Generic (PLEG): container finished" podID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerID="3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381" exitCode=0 Apr 20 20:31:52.235314 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.235145 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" 
event={"ID":"4a899a23-4f4d-4ee2-924d-226945e8cf76","Type":"ContainerDied","Data":"3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381"} Apr 20 20:31:52.235314 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.235169 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" event={"ID":"4a899a23-4f4d-4ee2-924d-226945e8cf76","Type":"ContainerDied","Data":"31b3649ec4880e5c028a775d1d51482ec7c3d9c6d6fac6460b1e4e18f87e47fb"} Apr 20 20:31:52.235314 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.235184 2574 scope.go:117] "RemoveContainer" containerID="14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861" Apr 20 20:31:52.235314 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.235182 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7" Apr 20 20:31:52.243734 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.243716 2574 scope.go:117] "RemoveContainer" containerID="3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381" Apr 20 20:31:52.250660 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.250644 2574 scope.go:117] "RemoveContainer" containerID="4a458755f4c8c9ec642a15a5da287f1a248a2a6f5d8b382f340f01a480f51545" Apr 20 20:31:52.256353 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.256320 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7"] Apr 20 20:31:52.258209 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.258192 2574 scope.go:117] "RemoveContainer" containerID="14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861" Apr 20 20:31:52.258494 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:31:52.258475 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861\": container with ID starting with 14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861 not found: ID does not exist" containerID="14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861" Apr 20 20:31:52.258561 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.258503 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861"} err="failed to get container status \"14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861\": rpc error: code = NotFound desc = could not find container \"14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861\": container with ID starting with 14abbc4bb39121bc69c8c53303ab460d5dd6c54e71829e8b3c96c8d27fb07861 not found: ID does not exist" Apr 20 20:31:52.258561 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.258525 2574 scope.go:117] "RemoveContainer" containerID="3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381" Apr 20 20:31:52.258751 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:31:52.258734 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381\": container with ID starting with 3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381 not found: ID does not exist" containerID="3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381" Apr 20 20:31:52.258795 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.258753 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381"} err="failed to get container status \"3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381\": rpc error: code = NotFound desc = could not find container 
\"3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381\": container with ID starting with 3742d9bd0c1cdfffeab6a67d9c98b8dc585282a07bdc4387285253797a6e6381 not found: ID does not exist"
Apr 20 20:31:52.258795 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.258765 2574 scope.go:117] "RemoveContainer" containerID="4a458755f4c8c9ec642a15a5da287f1a248a2a6f5d8b382f340f01a480f51545"
Apr 20 20:31:52.259015 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:31:52.258991 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a458755f4c8c9ec642a15a5da287f1a248a2a6f5d8b382f340f01a480f51545\": container with ID starting with 4a458755f4c8c9ec642a15a5da287f1a248a2a6f5d8b382f340f01a480f51545 not found: ID does not exist" containerID="4a458755f4c8c9ec642a15a5da287f1a248a2a6f5d8b382f340f01a480f51545"
Apr 20 20:31:52.259054 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.259026 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a458755f4c8c9ec642a15a5da287f1a248a2a6f5d8b382f340f01a480f51545"} err="failed to get container status \"4a458755f4c8c9ec642a15a5da287f1a248a2a6f5d8b382f340f01a480f51545\": rpc error: code = NotFound desc = could not find container \"4a458755f4c8c9ec642a15a5da287f1a248a2a6f5d8b382f340f01a480f51545\": container with ID starting with 4a458755f4c8c9ec642a15a5da287f1a248a2a6f5d8b382f340f01a480f51545 not found: ID does not exist"
Apr 20 20:31:52.261333 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:52.261309 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-x4gj7"]
Apr 20 20:31:53.483738 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:53.483695 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" path="/var/lib/kubelet/pods/4a899a23-4f4d-4ee2-924d-226945e8cf76/volumes"
Apr 20 20:31:54.243357 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:54.243256 2574 generic.go:358] "Generic (PLEG): container finished" podID="95701d92-4aa3-4231-9631-f465984585d3" containerID="2f5050ae1688ca4ab16c59069f707beda55c437b6acbce4565d154f01a7c9468" exitCode=0
Apr 20 20:31:54.243357 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:54.243337 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" event={"ID":"95701d92-4aa3-4231-9631-f465984585d3","Type":"ContainerDied","Data":"2f5050ae1688ca4ab16c59069f707beda55c437b6acbce4565d154f01a7c9468"}
Apr 20 20:31:55.247830 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:55.247792 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" event={"ID":"95701d92-4aa3-4231-9631-f465984585d3","Type":"ContainerStarted","Data":"edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f"}
Apr 20 20:31:55.247830 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:55.247834 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" event={"ID":"95701d92-4aa3-4231-9631-f465984585d3","Type":"ContainerStarted","Data":"e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5"}
Apr 20 20:31:55.248389 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:55.248051 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl"
Apr 20 20:31:55.265814 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:55.265761 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" podStartSLOduration=6.265743497 podStartE2EDuration="6.265743497s" podCreationTimestamp="2026-04-20 20:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:31:55.264175857 +0000 UTC m=+1570.373231564" watchObservedRunningTime="2026-04-20 20:31:55.265743497 +0000 UTC m=+1570.374799206"
Apr 20 20:31:56.251438 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:56.251411 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl"
Apr 20 20:31:56.252734 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:56.252705 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 20 20:31:57.254348 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:31:57.254310 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 20 20:32:02.259322 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:02.259286 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl"
Apr 20 20:32:02.259844 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:02.259819 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 20 20:32:12.260819 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:12.260773 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 20 20:32:22.260649 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:22.260564 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 20 20:32:32.259868 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:32.259823 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 20 20:32:42.261139 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:42.261112 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl"
Apr 20 20:32:50.951437 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:50.951400 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl"]
Apr 20 20:32:50.951866 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:50.951731 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kserve-container" containerID="cri-o://e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5" gracePeriod=30
Apr 20 20:32:50.951866 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:50.951802 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kube-rbac-proxy" containerID="cri-o://edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f" gracePeriod=30
Apr 20 20:32:51.427544 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:51.427510 2574 generic.go:358] "Generic (PLEG): container finished" podID="95701d92-4aa3-4231-9631-f465984585d3" containerID="edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f" exitCode=2
Apr 20 20:32:51.427728 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:51.427584 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" event={"ID":"95701d92-4aa3-4231-9631-f465984585d3","Type":"ContainerDied","Data":"edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f"}
Apr 20 20:32:52.254836 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:52.254795 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused"
Apr 20 20:32:52.260213 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:52.260188 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 20 20:32:53.707523 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.707495 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl"
Apr 20 20:32:53.811570 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.811477 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95701d92-4aa3-4231-9631-f465984585d3-proxy-tls\") pod \"95701d92-4aa3-4231-9631-f465984585d3\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") "
Apr 20 20:32:53.811570 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.811517 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dffd\" (UniqueName: \"kubernetes.io/projected/95701d92-4aa3-4231-9631-f465984585d3-kube-api-access-5dffd\") pod \"95701d92-4aa3-4231-9631-f465984585d3\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") "
Apr 20 20:32:53.811570 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.811562 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95701d92-4aa3-4231-9631-f465984585d3-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"95701d92-4aa3-4231-9631-f465984585d3\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") "
Apr 20 20:32:53.811865 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.811595 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95701d92-4aa3-4231-9631-f465984585d3-kserve-provision-location\") pod \"95701d92-4aa3-4231-9631-f465984585d3\" (UID: \"95701d92-4aa3-4231-9631-f465984585d3\") "
Apr 20 20:32:53.811992 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.811967 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95701d92-4aa3-4231-9631-f465984585d3-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "95701d92-4aa3-4231-9631-f465984585d3" (UID: "95701d92-4aa3-4231-9631-f465984585d3"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:32:53.813759 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.813731 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95701d92-4aa3-4231-9631-f465984585d3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "95701d92-4aa3-4231-9631-f465984585d3" (UID: "95701d92-4aa3-4231-9631-f465984585d3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:32:53.813759 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.813749 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95701d92-4aa3-4231-9631-f465984585d3-kube-api-access-5dffd" (OuterVolumeSpecName: "kube-api-access-5dffd") pod "95701d92-4aa3-4231-9631-f465984585d3" (UID: "95701d92-4aa3-4231-9631-f465984585d3"). InnerVolumeSpecName "kube-api-access-5dffd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:32:53.821246 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.821217 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95701d92-4aa3-4231-9631-f465984585d3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "95701d92-4aa3-4231-9631-f465984585d3" (UID: "95701d92-4aa3-4231-9631-f465984585d3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:32:53.912575 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.912531 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95701d92-4aa3-4231-9631-f465984585d3-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:32:53.912575 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.912573 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95701d92-4aa3-4231-9631-f465984585d3-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:32:53.912575 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.912584 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95701d92-4aa3-4231-9631-f465984585d3-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:32:53.912808 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:53.912595 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5dffd\" (UniqueName: \"kubernetes.io/projected/95701d92-4aa3-4231-9631-f465984585d3-kube-api-access-5dffd\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:32:54.438251 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.438217 2574 generic.go:358] "Generic (PLEG): container finished" podID="95701d92-4aa3-4231-9631-f465984585d3" containerID="e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5" exitCode=0
Apr 20 20:32:54.438450 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.438289 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" event={"ID":"95701d92-4aa3-4231-9631-f465984585d3","Type":"ContainerDied","Data":"e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5"}
Apr 20 20:32:54.438450 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.438319 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl" event={"ID":"95701d92-4aa3-4231-9631-f465984585d3","Type":"ContainerDied","Data":"ec71488baef527587444cd1610005a7ea9d8599e96bacefcabca078482aae9b9"}
Apr 20 20:32:54.438450 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.438327 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl"
Apr 20 20:32:54.438450 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.438334 2574 scope.go:117] "RemoveContainer" containerID="edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f"
Apr 20 20:32:54.449961 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.449939 2574 scope.go:117] "RemoveContainer" containerID="e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5"
Apr 20 20:32:54.457703 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.457682 2574 scope.go:117] "RemoveContainer" containerID="2f5050ae1688ca4ab16c59069f707beda55c437b6acbce4565d154f01a7c9468"
Apr 20 20:32:54.461917 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.461890 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl"]
Apr 20 20:32:54.465880 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.465843 2574 scope.go:117] "RemoveContainer" containerID="edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f"
Apr 20 20:32:54.466152 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:32:54.466134 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f\": container with ID starting with edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f not found: ID does not exist" containerID="edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f"
Apr 20 20:32:54.466240 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.466163 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f"} err="failed to get container status \"edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f\": rpc error: code = NotFound desc = could not find container \"edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f\": container with ID starting with edb874d5c93e9fe3f82ff7bf1ce02c8b4f66805e137242969e11c7925e75ef1f not found: ID does not exist"
Apr 20 20:32:54.466240 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.466190 2574 scope.go:117] "RemoveContainer" containerID="e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5"
Apr 20 20:32:54.466476 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:32:54.466458 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5\": container with ID starting with e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5 not found: ID does not exist" containerID="e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5"
Apr 20 20:32:54.466531 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.466483 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5"} err="failed to get container status \"e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5\": rpc error: code = NotFound desc = could not find container \"e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5\": container with ID starting with e36706328487a8eab5c27a5dbd74e5c3656c895bbec5d5c07defdf11e31453c5 not found: ID does not exist"
Apr 20 20:32:54.466531 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.466500 2574 scope.go:117] "RemoveContainer" containerID="2f5050ae1688ca4ab16c59069f707beda55c437b6acbce4565d154f01a7c9468"
Apr 20 20:32:54.466738 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:32:54.466720 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f5050ae1688ca4ab16c59069f707beda55c437b6acbce4565d154f01a7c9468\": container with ID starting with 2f5050ae1688ca4ab16c59069f707beda55c437b6acbce4565d154f01a7c9468 not found: ID does not exist" containerID="2f5050ae1688ca4ab16c59069f707beda55c437b6acbce4565d154f01a7c9468"
Apr 20 20:32:54.466793 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.466748 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f5050ae1688ca4ab16c59069f707beda55c437b6acbce4565d154f01a7c9468"} err="failed to get container status \"2f5050ae1688ca4ab16c59069f707beda55c437b6acbce4565d154f01a7c9468\": rpc error: code = NotFound desc = could not find container \"2f5050ae1688ca4ab16c59069f707beda55c437b6acbce4565d154f01a7c9468\": container with ID starting with 2f5050ae1688ca4ab16c59069f707beda55c437b6acbce4565d154f01a7c9468 not found: ID does not exist"
Apr 20 20:32:54.467292 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:54.467253 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-4vfcl"]
Apr 20 20:32:55.482327 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:32:55.482297 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95701d92-4aa3-4231-9631-f465984585d3" path="/var/lib/kubelet/pods/95701d92-4aa3-4231-9631-f465984585d3/volumes"
Apr 20 20:34:22.103427 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103380 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"]
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103735 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kube-rbac-proxy"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103745 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kube-rbac-proxy"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103753 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kube-rbac-proxy"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103760 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kube-rbac-proxy"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103769 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kserve-container"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103774 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kserve-container"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103786 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="storage-initializer"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103792 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="storage-initializer"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103803 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="storage-initializer"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103808 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="storage-initializer"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103815 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kserve-container"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103819 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kserve-container"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103866 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kube-rbac-proxy"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103874 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="95701d92-4aa3-4231-9631-f465984585d3" containerName="kserve-container"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103880 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kube-rbac-proxy"
Apr 20 20:34:22.103921 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.103889 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a899a23-4f4d-4ee2-924d-226945e8cf76" containerName="kserve-container"
Apr 20 20:34:22.107136 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.107117 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.109587 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.109561 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\""
Apr 20 20:34:22.109587 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.109581 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 20 20:34:22.109833 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.109564 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\""
Apr 20 20:34:22.110152 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.110116 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\""
Apr 20 20:34:22.110239 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.110154 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 20 20:34:22.116281 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.116237 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"]
Apr 20 20:34:22.135209 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.135175 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/532f694b-81c1-4c96-a31e-5c93dfa93257-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.135345 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.135237 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/532f694b-81c1-4c96-a31e-5c93dfa93257-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.135402 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.135349 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x67j\" (UniqueName: \"kubernetes.io/projected/532f694b-81c1-4c96-a31e-5c93dfa93257-kube-api-access-2x67j\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.135471 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.135454 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/532f694b-81c1-4c96-a31e-5c93dfa93257-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.236101 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.236059 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/532f694b-81c1-4c96-a31e-5c93dfa93257-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.236301 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.236111 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/532f694b-81c1-4c96-a31e-5c93dfa93257-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.236301 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.236143 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2x67j\" (UniqueName: \"kubernetes.io/projected/532f694b-81c1-4c96-a31e-5c93dfa93257-kube-api-access-2x67j\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.236301 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.236194 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/532f694b-81c1-4c96-a31e-5c93dfa93257-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.236455 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:34:22.236326 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-runtime-predictor-serving-cert: secret "isvc-pmml-runtime-predictor-serving-cert" not found
Apr 20 20:34:22.236455 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:34:22.236397 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532f694b-81c1-4c96-a31e-5c93dfa93257-proxy-tls podName:532f694b-81c1-4c96-a31e-5c93dfa93257 nodeName:}" failed. No retries permitted until 2026-04-20 20:34:22.736375659 +0000 UTC m=+1717.845431344 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/532f694b-81c1-4c96-a31e-5c93dfa93257-proxy-tls") pod "isvc-pmml-runtime-predictor-67bc544947-4vnbh" (UID: "532f694b-81c1-4c96-a31e-5c93dfa93257") : secret "isvc-pmml-runtime-predictor-serving-cert" not found
Apr 20 20:34:22.236554 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.236534 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/532f694b-81c1-4c96-a31e-5c93dfa93257-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.236858 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.236842 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/532f694b-81c1-4c96-a31e-5c93dfa93257-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.244634 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.244614 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x67j\" (UniqueName: \"kubernetes.io/projected/532f694b-81c1-4c96-a31e-5c93dfa93257-kube-api-access-2x67j\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.740670 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.740635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/532f694b-81c1-4c96-a31e-5c93dfa93257-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:22.743290 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:22.743243 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/532f694b-81c1-4c96-a31e-5c93dfa93257-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-4vnbh\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:23.019942 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:23.019840 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:23.145007 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:23.144982 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"]
Apr 20 20:34:23.147684 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:34:23.147656 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod532f694b_81c1_4c96_a31e_5c93dfa93257.slice/crio-a39aab2656a0a4c5880cdb066e4b90fdb20920af39fd3c371161e18c33b3ec11 WatchSource:0}: Error finding container a39aab2656a0a4c5880cdb066e4b90fdb20920af39fd3c371161e18c33b3ec11: Status 404 returned error can't find the container with id a39aab2656a0a4c5880cdb066e4b90fdb20920af39fd3c371161e18c33b3ec11
Apr 20 20:34:23.706834 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:23.706796 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" event={"ID":"532f694b-81c1-4c96-a31e-5c93dfa93257","Type":"ContainerStarted","Data":"972102c2c53a0e955ccbf8e50579d2055fba2766c474cc29513efcdf4b048b82"}
Apr 20 20:34:23.706834 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:23.706841 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" event={"ID":"532f694b-81c1-4c96-a31e-5c93dfa93257","Type":"ContainerStarted","Data":"a39aab2656a0a4c5880cdb066e4b90fdb20920af39fd3c371161e18c33b3ec11"}
Apr 20 20:34:27.720089 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:27.720048 2574 generic.go:358] "Generic (PLEG): container finished" podID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerID="972102c2c53a0e955ccbf8e50579d2055fba2766c474cc29513efcdf4b048b82" exitCode=0
Apr 20 20:34:27.720522 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:27.720124 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" event={"ID":"532f694b-81c1-4c96-a31e-5c93dfa93257","Type":"ContainerDied","Data":"972102c2c53a0e955ccbf8e50579d2055fba2766c474cc29513efcdf4b048b82"}
Apr 20 20:34:34.751987 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:34.751946 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" event={"ID":"532f694b-81c1-4c96-a31e-5c93dfa93257","Type":"ContainerStarted","Data":"e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d"}
Apr 20 20:34:34.752345 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:34.751997 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" event={"ID":"532f694b-81c1-4c96-a31e-5c93dfa93257","Type":"ContainerStarted","Data":"3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6"}
Apr 20 20:34:34.752345 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:34.752230 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:34.768810 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:34.768755 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podStartSLOduration=5.894209922 podStartE2EDuration="12.768737809s" podCreationTimestamp="2026-04-20 20:34:22 +0000 UTC" firstStartedPulling="2026-04-20 20:34:27.721313563 +0000 UTC m=+1722.830369251" lastFinishedPulling="2026-04-20 20:34:34.595841438 +0000 UTC m=+1729.704897138" observedRunningTime="2026-04-20 20:34:34.767756543 +0000 UTC m=+1729.876812260" watchObservedRunningTime="2026-04-20 20:34:34.768737809 +0000 UTC m=+1729.877793517"
Apr 20 20:34:35.754951 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:35.754923 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:35.756174 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:35.756141 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 20 20:34:36.758071 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:36.758037 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 20 20:34:41.762497 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:41.762469 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"
Apr 20 20:34:41.763048 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:41.763021 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 20 20:34:51.763687 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:34:51.763635 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 20 20:35:01.763531 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:35:01.763489 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 20 20:35:11.763964 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:35:11.763925 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 20 20:35:21.763236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:35:21.763150 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 20 20:35:31.763013 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:35:31.762968 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.30:8080: connect: connection refused" Apr 20 20:35:41.763072 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:35:41.763030 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 20 20:35:48.385320 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:35:48.385293 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:35:48.385857 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:35:48.385716 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:35:48.391728 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:35:48.391708 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:35:48.392274 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:35:48.392244 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:35:51.763774 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:35:51.763732 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 20 20:36:01.763419 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:01.763390 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" Apr 20 20:36:02.944580 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:02.944549 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"] Apr 20 20:36:02.944990 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:02.944844 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" containerID="cri-o://3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6" gracePeriod=30 Apr 20 20:36:02.944990 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:02.944878 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kube-rbac-proxy" containerID="cri-o://e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d" gracePeriod=30 Apr 20 20:36:04.031469 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:04.031431 2574 generic.go:358] "Generic (PLEG): container finished" podID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerID="e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d" exitCode=2 Apr 20 20:36:04.031837 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:04.031509 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" event={"ID":"532f694b-81c1-4c96-a31e-5c93dfa93257","Type":"ContainerDied","Data":"e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d"} Apr 20 20:36:06.783526 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.783503 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" Apr 20 20:36:06.827758 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.827727 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x67j\" (UniqueName: \"kubernetes.io/projected/532f694b-81c1-4c96-a31e-5c93dfa93257-kube-api-access-2x67j\") pod \"532f694b-81c1-4c96-a31e-5c93dfa93257\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " Apr 20 20:36:06.827941 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.827778 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/532f694b-81c1-4c96-a31e-5c93dfa93257-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"532f694b-81c1-4c96-a31e-5c93dfa93257\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " Apr 20 20:36:06.827941 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.827805 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/532f694b-81c1-4c96-a31e-5c93dfa93257-kserve-provision-location\") pod \"532f694b-81c1-4c96-a31e-5c93dfa93257\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " Apr 20 20:36:06.828064 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.827944 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/532f694b-81c1-4c96-a31e-5c93dfa93257-proxy-tls\") pod \"532f694b-81c1-4c96-a31e-5c93dfa93257\" (UID: \"532f694b-81c1-4c96-a31e-5c93dfa93257\") " Apr 20 20:36:06.828133 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.828109 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/532f694b-81c1-4c96-a31e-5c93dfa93257-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"532f694b-81c1-4c96-a31e-5c93dfa93257" (UID: "532f694b-81c1-4c96-a31e-5c93dfa93257"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:36:06.828191 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.828167 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532f694b-81c1-4c96-a31e-5c93dfa93257-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "532f694b-81c1-4c96-a31e-5c93dfa93257" (UID: "532f694b-81c1-4c96-a31e-5c93dfa93257"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:36:06.828327 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.828309 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/532f694b-81c1-4c96-a31e-5c93dfa93257-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:36:06.828397 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.828329 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/532f694b-81c1-4c96-a31e-5c93dfa93257-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:36:06.830177 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.830149 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532f694b-81c1-4c96-a31e-5c93dfa93257-kube-api-access-2x67j" (OuterVolumeSpecName: "kube-api-access-2x67j") pod "532f694b-81c1-4c96-a31e-5c93dfa93257" (UID: "532f694b-81c1-4c96-a31e-5c93dfa93257"). InnerVolumeSpecName "kube-api-access-2x67j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:36:06.830327 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.830179 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532f694b-81c1-4c96-a31e-5c93dfa93257-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "532f694b-81c1-4c96-a31e-5c93dfa93257" (UID: "532f694b-81c1-4c96-a31e-5c93dfa93257"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:36:06.928750 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.928665 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2x67j\" (UniqueName: \"kubernetes.io/projected/532f694b-81c1-4c96-a31e-5c93dfa93257-kube-api-access-2x67j\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:36:06.928750 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:06.928697 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/532f694b-81c1-4c96-a31e-5c93dfa93257-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:36:07.044990 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.044960 2574 generic.go:358] "Generic (PLEG): container finished" podID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerID="3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6" exitCode=0 Apr 20 20:36:07.045148 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.045040 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" Apr 20 20:36:07.045148 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.045053 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" event={"ID":"532f694b-81c1-4c96-a31e-5c93dfa93257","Type":"ContainerDied","Data":"3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6"} Apr 20 20:36:07.045148 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.045090 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" event={"ID":"532f694b-81c1-4c96-a31e-5c93dfa93257","Type":"ContainerDied","Data":"a39aab2656a0a4c5880cdb066e4b90fdb20920af39fd3c371161e18c33b3ec11"} Apr 20 20:36:07.045148 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.045107 2574 scope.go:117] "RemoveContainer" containerID="e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d" Apr 20 20:36:07.053773 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.053755 2574 scope.go:117] "RemoveContainer" containerID="3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6" Apr 20 20:36:07.061029 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.061015 2574 scope.go:117] "RemoveContainer" containerID="972102c2c53a0e955ccbf8e50579d2055fba2766c474cc29513efcdf4b048b82" Apr 20 20:36:07.064858 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.064835 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"] Apr 20 20:36:07.068457 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.068436 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh"] Apr 20 20:36:07.069003 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.068990 2574 scope.go:117] "RemoveContainer" 
containerID="e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d" Apr 20 20:36:07.069302 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:36:07.069282 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d\": container with ID starting with e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d not found: ID does not exist" containerID="e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d" Apr 20 20:36:07.069372 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.069310 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d"} err="failed to get container status \"e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d\": rpc error: code = NotFound desc = could not find container \"e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d\": container with ID starting with e26342691b90eb669de31a6adecd31fa180a5d01fac141bcdaec7333dcc6d56d not found: ID does not exist" Apr 20 20:36:07.069372 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.069328 2574 scope.go:117] "RemoveContainer" containerID="3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6" Apr 20 20:36:07.069576 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:36:07.069558 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6\": container with ID starting with 3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6 not found: ID does not exist" containerID="3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6" Apr 20 20:36:07.069620 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.069580 2574 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6"} err="failed to get container status \"3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6\": rpc error: code = NotFound desc = could not find container \"3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6\": container with ID starting with 3d83ff893cf71abab32c72b393a64be6c7c1acd7055833e68bb920b2dca866a6 not found: ID does not exist" Apr 20 20:36:07.069620 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.069592 2574 scope.go:117] "RemoveContainer" containerID="972102c2c53a0e955ccbf8e50579d2055fba2766c474cc29513efcdf4b048b82" Apr 20 20:36:07.069818 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:36:07.069800 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972102c2c53a0e955ccbf8e50579d2055fba2766c474cc29513efcdf4b048b82\": container with ID starting with 972102c2c53a0e955ccbf8e50579d2055fba2766c474cc29513efcdf4b048b82 not found: ID does not exist" containerID="972102c2c53a0e955ccbf8e50579d2055fba2766c474cc29513efcdf4b048b82" Apr 20 20:36:07.069859 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.069822 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972102c2c53a0e955ccbf8e50579d2055fba2766c474cc29513efcdf4b048b82"} err="failed to get container status \"972102c2c53a0e955ccbf8e50579d2055fba2766c474cc29513efcdf4b048b82\": rpc error: code = NotFound desc = could not find container \"972102c2c53a0e955ccbf8e50579d2055fba2766c474cc29513efcdf4b048b82\": container with ID starting with 972102c2c53a0e955ccbf8e50579d2055fba2766c474cc29513efcdf4b048b82 not found: ID does not exist" Apr 20 20:36:07.482369 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.482332 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" 
path="/var/lib/kubelet/pods/532f694b-81c1-4c96-a31e-5c93dfa93257/volumes" Apr 20 20:36:07.759116 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:36:07.759017 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4vnbh" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 20 20:39:18.102637 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.102603 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx"] Apr 20 20:39:18.103081 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.102964 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kube-rbac-proxy" Apr 20 20:39:18.103081 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.102976 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kube-rbac-proxy" Apr 20 20:39:18.103081 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.102986 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="storage-initializer" Apr 20 20:39:18.103081 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.102991 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="storage-initializer" Apr 20 20:39:18.103081 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.102999 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" Apr 20 20:39:18.103081 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.103005 2574 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" Apr 20 20:39:18.103081 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.103057 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kube-rbac-proxy" Apr 20 20:39:18.103081 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.103067 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="532f694b-81c1-4c96-a31e-5c93dfa93257" containerName="kserve-container" Apr 20 20:39:18.106314 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.106295 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.108636 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.108614 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 20:39:18.108768 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.108677 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 20 20:39:18.108768 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.108696 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 20 20:39:18.108768 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.108701 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\"" Apr 20 20:39:18.108768 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.108701 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 20 20:39:18.115131 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.115107 2574 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx"] Apr 20 20:39:18.198589 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.198550 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg7ls\" (UniqueName: \"kubernetes.io/projected/4551802b-8bcf-4904-85e5-fb5533355a94-kube-api-access-bg7ls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-26kcx\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.198756 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.198599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4551802b-8bcf-4904-85e5-fb5533355a94-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-26kcx\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.198756 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.198649 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4551802b-8bcf-4904-85e5-fb5533355a94-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-26kcx\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.198829 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.198759 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4551802b-8bcf-4904-85e5-fb5533355a94-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-26kcx\" (UID: 
\"4551802b-8bcf-4904-85e5-fb5533355a94\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.299510 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.299456 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4551802b-8bcf-4904-85e5-fb5533355a94-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-26kcx\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.299712 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.299547 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4551802b-8bcf-4904-85e5-fb5533355a94-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-26kcx\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.299712 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.299628 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bg7ls\" (UniqueName: \"kubernetes.io/projected/4551802b-8bcf-4904-85e5-fb5533355a94-kube-api-access-bg7ls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-26kcx\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.299712 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.299671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4551802b-8bcf-4904-85e5-fb5533355a94-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-26kcx\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.299896 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.299856 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4551802b-8bcf-4904-85e5-fb5533355a94-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-26kcx\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.300146 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.300125 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4551802b-8bcf-4904-85e5-fb5533355a94-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-26kcx\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.302198 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.302176 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4551802b-8bcf-4904-85e5-fb5533355a94-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-26kcx\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.307915 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.307894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg7ls\" (UniqueName: \"kubernetes.io/projected/4551802b-8bcf-4904-85e5-fb5533355a94-kube-api-access-bg7ls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-26kcx\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.417704 
ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.417677 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:18.546481 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.546448 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx"] Apr 20 20:39:18.548194 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:39:18.548164 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4551802b_8bcf_4904_85e5_fb5533355a94.slice/crio-668a321c0957bedc9ac712c1433bc60ade51cdd8ad05e2fe34944c8164cb6d33 WatchSource:0}: Error finding container 668a321c0957bedc9ac712c1433bc60ade51cdd8ad05e2fe34944c8164cb6d33: Status 404 returned error can't find the container with id 668a321c0957bedc9ac712c1433bc60ade51cdd8ad05e2fe34944c8164cb6d33 Apr 20 20:39:18.550170 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.550151 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:39:18.610533 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:18.610505 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" event={"ID":"4551802b-8bcf-4904-85e5-fb5533355a94","Type":"ContainerStarted","Data":"668a321c0957bedc9ac712c1433bc60ade51cdd8ad05e2fe34944c8164cb6d33"} Apr 20 20:39:19.615716 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:19.615683 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" event={"ID":"4551802b-8bcf-4904-85e5-fb5533355a94","Type":"ContainerStarted","Data":"d3bae8a8324cb5c9f28b9968914e077d3c0a1a310167663c898dc38f01fdc86c"} Apr 20 20:39:22.626180 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:22.626106 2574 generic.go:358] "Generic 
(PLEG): container finished" podID="4551802b-8bcf-4904-85e5-fb5533355a94" containerID="d3bae8a8324cb5c9f28b9968914e077d3c0a1a310167663c898dc38f01fdc86c" exitCode=0 Apr 20 20:39:22.626627 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:22.626179 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" event={"ID":"4551802b-8bcf-4904-85e5-fb5533355a94","Type":"ContainerDied","Data":"d3bae8a8324cb5c9f28b9968914e077d3c0a1a310167663c898dc38f01fdc86c"} Apr 20 20:39:44.709583 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:44.709497 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" event={"ID":"4551802b-8bcf-4904-85e5-fb5533355a94","Type":"ContainerStarted","Data":"6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53"} Apr 20 20:39:44.709583 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:44.709543 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" event={"ID":"4551802b-8bcf-4904-85e5-fb5533355a94","Type":"ContainerStarted","Data":"5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e"} Apr 20 20:39:44.710102 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:44.709773 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:44.728958 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:44.728915 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podStartSLOduration=4.941800245 podStartE2EDuration="26.728902737s" podCreationTimestamp="2026-04-20 20:39:18 +0000 UTC" firstStartedPulling="2026-04-20 20:39:22.627429331 +0000 UTC m=+2017.736485019" lastFinishedPulling="2026-04-20 20:39:44.414531814 +0000 UTC m=+2039.523587511" 
observedRunningTime="2026-04-20 20:39:44.727748497 +0000 UTC m=+2039.836804204" watchObservedRunningTime="2026-04-20 20:39:44.728902737 +0000 UTC m=+2039.837958444" Apr 20 20:39:45.713181 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:45.713150 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:45.714245 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:45.714216 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 20 20:39:46.715894 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:46.715856 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 20 20:39:51.719964 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:51.719892 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:39:51.720517 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:39:51.720490 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 20 20:40:01.721037 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:40:01.720993 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" 
podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 20 20:40:11.721196 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:40:11.721157 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 20 20:40:21.720435 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:40:21.720393 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 20 20:40:31.721084 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:40:31.721038 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 20 20:40:41.721171 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:40:41.721129 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 20 20:40:48.408415 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:40:48.408380 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:40:48.409522 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:40:48.409502 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:40:48.414676 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:40:48.414657 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:40:48.415928 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:40:48.415909 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:40:51.720474 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:40:51.720434 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 20 20:41:01.483410 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:01.483374 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:41:07.643242 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.643207 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx"] Apr 20 20:41:07.643725 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.643694 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container" containerID="cri-o://5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e" gracePeriod=30 Apr 20 20:41:07.643825 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:41:07.643775 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kube-rbac-proxy" containerID="cri-o://6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53" gracePeriod=30 Apr 20 20:41:07.784163 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.784129 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg"] Apr 20 20:41:07.787472 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.787455 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:07.789873 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.789846 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 20 20:41:07.790013 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.789889 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 20 20:41:07.798356 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.798334 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg"] Apr 20 20:41:07.848169 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.848146 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjdz4\" (UniqueName: \"kubernetes.io/projected/ad3a4fef-e087-471e-9ae7-879f0864939a-kube-api-access-mjdz4\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" 
Apr 20 20:41:07.848327 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.848186 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad3a4fef-e087-471e-9ae7-879f0864939a-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:07.848327 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.848251 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad3a4fef-e087-471e-9ae7-879f0864939a-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:07.848327 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.848310 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad3a4fef-e087-471e-9ae7-879f0864939a-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:07.949609 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.949582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad3a4fef-e087-471e-9ae7-879f0864939a-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:07.949774 
ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.949620 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad3a4fef-e087-471e-9ae7-879f0864939a-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:07.949774 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.949665 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjdz4\" (UniqueName: \"kubernetes.io/projected/ad3a4fef-e087-471e-9ae7-879f0864939a-kube-api-access-mjdz4\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:07.949774 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.949690 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad3a4fef-e087-471e-9ae7-879f0864939a-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:07.949774 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:41:07.949739 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-serving-cert: secret "isvc-predictive-xgboost-predictor-serving-cert" not found Apr 20 20:41:07.949938 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:41:07.949825 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad3a4fef-e087-471e-9ae7-879f0864939a-proxy-tls podName:ad3a4fef-e087-471e-9ae7-879f0864939a nodeName:}" failed. 
No retries permitted until 2026-04-20 20:41:08.449802444 +0000 UTC m=+2123.558858133 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ad3a4fef-e087-471e-9ae7-879f0864939a-proxy-tls") pod "isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" (UID: "ad3a4fef-e087-471e-9ae7-879f0864939a") : secret "isvc-predictive-xgboost-predictor-serving-cert" not found Apr 20 20:41:07.950116 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.950096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad3a4fef-e087-471e-9ae7-879f0864939a-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:07.950366 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.950349 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad3a4fef-e087-471e-9ae7-879f0864939a-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:07.957394 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.957371 2574 generic.go:358] "Generic (PLEG): container finished" podID="4551802b-8bcf-4904-85e5-fb5533355a94" containerID="6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53" exitCode=2 Apr 20 20:41:07.957509 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.957421 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" 
event={"ID":"4551802b-8bcf-4904-85e5-fb5533355a94","Type":"ContainerDied","Data":"6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53"} Apr 20 20:41:07.960278 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:07.960247 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjdz4\" (UniqueName: \"kubernetes.io/projected/ad3a4fef-e087-471e-9ae7-879f0864939a-kube-api-access-mjdz4\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:08.453975 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:08.453944 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad3a4fef-e087-471e-9ae7-879f0864939a-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:08.456495 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:08.456477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad3a4fef-e087-471e-9ae7-879f0864939a-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:08.697765 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:08.697729 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" Apr 20 20:41:08.818204 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:08.818181 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg"] Apr 20 20:41:08.821032 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:41:08.821004 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad3a4fef_e087_471e_9ae7_879f0864939a.slice/crio-3161a8555589cf18ff7be805842557e6ec0eba94dcfc209c9496aae337793b77 WatchSource:0}: Error finding container 3161a8555589cf18ff7be805842557e6ec0eba94dcfc209c9496aae337793b77: Status 404 returned error can't find the container with id 3161a8555589cf18ff7be805842557e6ec0eba94dcfc209c9496aae337793b77 Apr 20 20:41:08.961741 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:08.961658 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" event={"ID":"ad3a4fef-e087-471e-9ae7-879f0864939a","Type":"ContainerStarted","Data":"25613abc51d4be1c1f00d83d785bc98be1578960bb8d39dd1c65aa9eecf9ee0b"} Apr 20 20:41:08.961741 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:08.961694 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" event={"ID":"ad3a4fef-e087-471e-9ae7-879f0864939a","Type":"ContainerStarted","Data":"3161a8555589cf18ff7be805842557e6ec0eba94dcfc209c9496aae337793b77"} Apr 20 20:41:11.483369 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:11.483321 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 20 20:41:11.716835 
ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:11.716790 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 20 20:41:12.279899 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.279874 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:41:12.387127 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.387047 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4551802b-8bcf-4904-85e5-fb5533355a94-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"4551802b-8bcf-4904-85e5-fb5533355a94\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " Apr 20 20:41:12.387312 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.387155 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg7ls\" (UniqueName: \"kubernetes.io/projected/4551802b-8bcf-4904-85e5-fb5533355a94-kube-api-access-bg7ls\") pod \"4551802b-8bcf-4904-85e5-fb5533355a94\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " Apr 20 20:41:12.387312 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.387221 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4551802b-8bcf-4904-85e5-fb5533355a94-kserve-provision-location\") pod \"4551802b-8bcf-4904-85e5-fb5533355a94\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " Apr 20 20:41:12.387430 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.387328 2574 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4551802b-8bcf-4904-85e5-fb5533355a94-proxy-tls\") pod \"4551802b-8bcf-4904-85e5-fb5533355a94\" (UID: \"4551802b-8bcf-4904-85e5-fb5533355a94\") " Apr 20 20:41:12.387467 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.387441 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4551802b-8bcf-4904-85e5-fb5533355a94-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "4551802b-8bcf-4904-85e5-fb5533355a94" (UID: "4551802b-8bcf-4904-85e5-fb5533355a94"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:41:12.387539 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.387517 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4551802b-8bcf-4904-85e5-fb5533355a94-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4551802b-8bcf-4904-85e5-fb5533355a94" (UID: "4551802b-8bcf-4904-85e5-fb5533355a94"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:41:12.387692 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.387669 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4551802b-8bcf-4904-85e5-fb5533355a94-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:41:12.387692 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.387689 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4551802b-8bcf-4904-85e5-fb5533355a94-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:41:12.389433 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.389409 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4551802b-8bcf-4904-85e5-fb5533355a94-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4551802b-8bcf-4904-85e5-fb5533355a94" (UID: "4551802b-8bcf-4904-85e5-fb5533355a94"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:41:12.389523 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.389465 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4551802b-8bcf-4904-85e5-fb5533355a94-kube-api-access-bg7ls" (OuterVolumeSpecName: "kube-api-access-bg7ls") pod "4551802b-8bcf-4904-85e5-fb5533355a94" (UID: "4551802b-8bcf-4904-85e5-fb5533355a94"). InnerVolumeSpecName "kube-api-access-bg7ls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:41:12.488211 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.488178 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4551802b-8bcf-4904-85e5-fb5533355a94-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:41:12.488211 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.488206 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bg7ls\" (UniqueName: \"kubernetes.io/projected/4551802b-8bcf-4904-85e5-fb5533355a94-kube-api-access-bg7ls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:41:12.978732 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.978695 2574 generic.go:358] "Generic (PLEG): container finished" podID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerID="25613abc51d4be1c1f00d83d785bc98be1578960bb8d39dd1c65aa9eecf9ee0b" exitCode=0 Apr 20 20:41:12.978999 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.978772 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" event={"ID":"ad3a4fef-e087-471e-9ae7-879f0864939a","Type":"ContainerDied","Data":"25613abc51d4be1c1f00d83d785bc98be1578960bb8d39dd1c65aa9eecf9ee0b"} Apr 20 20:41:12.980401 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.980376 2574 generic.go:358] "Generic (PLEG): container finished" podID="4551802b-8bcf-4904-85e5-fb5533355a94" containerID="5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e" exitCode=0 Apr 20 20:41:12.980582 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.980444 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" event={"ID":"4551802b-8bcf-4904-85e5-fb5533355a94","Type":"ContainerDied","Data":"5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e"} Apr 20 20:41:12.980582 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:41:12.980459 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" Apr 20 20:41:12.980582 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.980481 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx" event={"ID":"4551802b-8bcf-4904-85e5-fb5533355a94","Type":"ContainerDied","Data":"668a321c0957bedc9ac712c1433bc60ade51cdd8ad05e2fe34944c8164cb6d33"} Apr 20 20:41:12.980582 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.980498 2574 scope.go:117] "RemoveContainer" containerID="6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53" Apr 20 20:41:12.989142 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.989127 2574 scope.go:117] "RemoveContainer" containerID="5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e" Apr 20 20:41:12.998052 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:12.998035 2574 scope.go:117] "RemoveContainer" containerID="d3bae8a8324cb5c9f28b9968914e077d3c0a1a310167663c898dc38f01fdc86c" Apr 20 20:41:13.007443 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.007425 2574 scope.go:117] "RemoveContainer" containerID="6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53" Apr 20 20:41:13.007737 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:41:13.007719 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53\": container with ID starting with 6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53 not found: ID does not exist" containerID="6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53" Apr 20 20:41:13.007830 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.007749 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53"} err="failed to get container status \"6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53\": rpc error: code = NotFound desc = could not find container \"6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53\": container with ID starting with 6eae934abf46283839df6873cdc4efae94726e7cfae0114934f52c4435067c53 not found: ID does not exist" Apr 20 20:41:13.007830 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.007773 2574 scope.go:117] "RemoveContainer" containerID="5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e" Apr 20 20:41:13.007830 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.007799 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx"] Apr 20 20:41:13.008078 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:41:13.008026 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e\": container with ID starting with 5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e not found: ID does not exist" containerID="5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e" Apr 20 20:41:13.008078 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.008053 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e"} err="failed to get container status \"5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e\": rpc error: code = NotFound desc = could not find container \"5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e\": container with ID starting with 5c37df23aa505af653d7c2a05d2c39b37d2f10dc05ff09578fbde7811d42ac4e not found: ID does not exist" Apr 20 20:41:13.008078 
ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.008078 2574 scope.go:117] "RemoveContainer" containerID="d3bae8a8324cb5c9f28b9968914e077d3c0a1a310167663c898dc38f01fdc86c"
Apr 20 20:41:13.008365 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:41:13.008340 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3bae8a8324cb5c9f28b9968914e077d3c0a1a310167663c898dc38f01fdc86c\": container with ID starting with d3bae8a8324cb5c9f28b9968914e077d3c0a1a310167663c898dc38f01fdc86c not found: ID does not exist" containerID="d3bae8a8324cb5c9f28b9968914e077d3c0a1a310167663c898dc38f01fdc86c"
Apr 20 20:41:13.008483 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.008371 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3bae8a8324cb5c9f28b9968914e077d3c0a1a310167663c898dc38f01fdc86c"} err="failed to get container status \"d3bae8a8324cb5c9f28b9968914e077d3c0a1a310167663c898dc38f01fdc86c\": rpc error: code = NotFound desc = could not find container \"d3bae8a8324cb5c9f28b9968914e077d3c0a1a310167663c898dc38f01fdc86c\": container with ID starting with d3bae8a8324cb5c9f28b9968914e077d3c0a1a310167663c898dc38f01fdc86c not found: ID does not exist"
Apr 20 20:41:13.009436 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.009417 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-26kcx"]
Apr 20 20:41:13.483173 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.483129 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" path="/var/lib/kubelet/pods/4551802b-8bcf-4904-85e5-fb5533355a94/volumes"
Apr 20 20:41:13.985237 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.985200 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" event={"ID":"ad3a4fef-e087-471e-9ae7-879f0864939a","Type":"ContainerStarted","Data":"4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526"}
Apr 20 20:41:13.985237 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.985238 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" event={"ID":"ad3a4fef-e087-471e-9ae7-879f0864939a","Type":"ContainerStarted","Data":"78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a"}
Apr 20 20:41:13.985762 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.985563 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg"
Apr 20 20:41:13.985762 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.985734 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg"
Apr 20 20:41:13.986805 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:13.986776 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 20 20:41:14.002186 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:14.002127 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podStartSLOduration=7.002111106 podStartE2EDuration="7.002111106s" podCreationTimestamp="2026-04-20 20:41:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:41:14.001401867 +0000 UTC m=+2129.110457597" watchObservedRunningTime="2026-04-20 20:41:14.002111106 +0000 UTC m=+2129.111166813"
Apr 20 20:41:14.989255 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:14.989207 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 20 20:41:19.994050 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:19.994017 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg"
Apr 20 20:41:19.994523 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:19.994487 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 20 20:41:29.994856 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:29.994758 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 20 20:41:39.994925 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:39.994877 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 20 20:41:49.995047 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:49.995006 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 20 20:41:59.995190 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:41:59.995149 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 20 20:42:09.994893 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:09.994856 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 20 20:42:19.994600 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:19.994561 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 20 20:42:29.995672 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:29.995644 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg"
Apr 20 20:42:37.882120 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:37.882087 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg"]
Apr 20 20:42:37.882694 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:37.882516 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" containerID="cri-o://78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a" gracePeriod=30
Apr 20 20:42:37.882694 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:37.882599 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kube-rbac-proxy" containerID="cri-o://4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526" gracePeriod=30
Apr 20 20:42:37.999095 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:37.999055 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"]
Apr 20 20:42:37.999499 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:37.999480 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="storage-initializer"
Apr 20 20:42:37.999597 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:37.999500 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="storage-initializer"
Apr 20 20:42:37.999597 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:37.999543 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kube-rbac-proxy"
Apr 20 20:42:37.999597 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:37.999551 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kube-rbac-proxy"
Apr 20 20:42:37.999597 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:37.999563 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container"
Apr 20 20:42:37.999597 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:37.999571 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container"
Apr 20 20:42:37.999837 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:37.999667 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kserve-container"
Apr 20 20:42:37.999837 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:37.999686 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4551802b-8bcf-4904-85e5-fb5533355a94" containerName="kube-rbac-proxy"
Apr 20 20:42:38.002766 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.002736 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.005086 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.005061 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\""
Apr 20 20:42:38.005249 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.005166 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\""
Apr 20 20:42:38.014621 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.011383 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"]
Apr 20 20:42:38.191335 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.191302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.191517 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.191352 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkhvd\" (UniqueName: \"kubernetes.io/projected/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-kube-api-access-wkhvd\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.191517 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.191481 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.191643 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.191589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.254491 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.254458 2574 generic.go:358] "Generic (PLEG): container finished" podID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerID="4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526" exitCode=2
Apr 20 20:42:38.254636 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.254539 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" event={"ID":"ad3a4fef-e087-471e-9ae7-879f0864939a","Type":"ContainerDied","Data":"4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526"}
Apr 20 20:42:38.293024 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.292981 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.293220 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.293053 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.293220 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.293077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkhvd\" (UniqueName: \"kubernetes.io/projected/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-kube-api-access-wkhvd\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.293220 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.293110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.293599 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.293574 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.293876 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.293843 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.295828 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.295808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.300999 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.300977 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkhvd\" (UniqueName: \"kubernetes.io/projected/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-kube-api-access-wkhvd\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.319897 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.319874 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:38.447119 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:38.447094 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"]
Apr 20 20:42:38.449256 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:42:38.449230 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bbbaa06_fdd4_4d95_b0be_29283f8c708e.slice/crio-b6672664fbc40787b3ef4bcd1e1a4b624a9b58b21c67fc018452541fdb5646f1 WatchSource:0}: Error finding container b6672664fbc40787b3ef4bcd1e1a4b624a9b58b21c67fc018452541fdb5646f1: Status 404 returned error can't find the container with id b6672664fbc40787b3ef4bcd1e1a4b624a9b58b21c67fc018452541fdb5646f1
Apr 20 20:42:39.259528 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:39.259494 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" event={"ID":"7bbbaa06-fdd4-4d95-b0be-29283f8c708e","Type":"ContainerStarted","Data":"23aa84d354e2e76604037b9891d5318cc0ce3cf585077634a84988042ab4dd40"}
Apr 20 20:42:39.259895 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:39.259535 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" event={"ID":"7bbbaa06-fdd4-4d95-b0be-29283f8c708e","Type":"ContainerStarted","Data":"b6672664fbc40787b3ef4bcd1e1a4b624a9b58b21c67fc018452541fdb5646f1"}
Apr 20 20:42:39.990079 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:39.990043 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused"
Apr 20 20:42:39.995281 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:39.995234 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 20 20:42:42.536120 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.534118 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg"
Apr 20 20:42:42.625616 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.625590 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjdz4\" (UniqueName: \"kubernetes.io/projected/ad3a4fef-e087-471e-9ae7-879f0864939a-kube-api-access-mjdz4\") pod \"ad3a4fef-e087-471e-9ae7-879f0864939a\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") "
Apr 20 20:42:42.625779 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.625641 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad3a4fef-e087-471e-9ae7-879f0864939a-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"ad3a4fef-e087-471e-9ae7-879f0864939a\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") "
Apr 20 20:42:42.625779 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.625703 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad3a4fef-e087-471e-9ae7-879f0864939a-kserve-provision-location\") pod \"ad3a4fef-e087-471e-9ae7-879f0864939a\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") "
Apr 20 20:42:42.625779 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.625734 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad3a4fef-e087-471e-9ae7-879f0864939a-proxy-tls\") pod \"ad3a4fef-e087-471e-9ae7-879f0864939a\" (UID: \"ad3a4fef-e087-471e-9ae7-879f0864939a\") "
Apr 20 20:42:42.626021 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.625994 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad3a4fef-e087-471e-9ae7-879f0864939a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ad3a4fef-e087-471e-9ae7-879f0864939a" (UID: "ad3a4fef-e087-471e-9ae7-879f0864939a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:42:42.626021 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.626004 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad3a4fef-e087-471e-9ae7-879f0864939a-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "ad3a4fef-e087-471e-9ae7-879f0864939a" (UID: "ad3a4fef-e087-471e-9ae7-879f0864939a"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:42:42.627859 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.627834 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3a4fef-e087-471e-9ae7-879f0864939a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ad3a4fef-e087-471e-9ae7-879f0864939a" (UID: "ad3a4fef-e087-471e-9ae7-879f0864939a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:42:42.627859 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.627846 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad3a4fef-e087-471e-9ae7-879f0864939a-kube-api-access-mjdz4" (OuterVolumeSpecName: "kube-api-access-mjdz4") pod "ad3a4fef-e087-471e-9ae7-879f0864939a" (UID: "ad3a4fef-e087-471e-9ae7-879f0864939a"). InnerVolumeSpecName "kube-api-access-mjdz4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:42:42.727117 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.727085 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad3a4fef-e087-471e-9ae7-879f0864939a-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:42:42.727117 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.727110 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad3a4fef-e087-471e-9ae7-879f0864939a-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:42:42.727117 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.727122 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mjdz4\" (UniqueName: \"kubernetes.io/projected/ad3a4fef-e087-471e-9ae7-879f0864939a-kube-api-access-mjdz4\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:42:42.727354 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:42.727131 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad3a4fef-e087-471e-9ae7-879f0864939a-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:42:43.273846 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.273803 2574 generic.go:358] "Generic (PLEG): container finished" podID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerID="23aa84d354e2e76604037b9891d5318cc0ce3cf585077634a84988042ab4dd40" exitCode=0
Apr 20 20:42:43.274042 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.273897 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" event={"ID":"7bbbaa06-fdd4-4d95-b0be-29283f8c708e","Type":"ContainerDied","Data":"23aa84d354e2e76604037b9891d5318cc0ce3cf585077634a84988042ab4dd40"}
Apr 20 20:42:43.275987 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.275801 2574 generic.go:358] "Generic (PLEG): container finished" podID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerID="78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a" exitCode=0
Apr 20 20:42:43.275987 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.275864 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" event={"ID":"ad3a4fef-e087-471e-9ae7-879f0864939a","Type":"ContainerDied","Data":"78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a"}
Apr 20 20:42:43.275987 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.275890 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg"
Apr 20 20:42:43.275987 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.275908 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg" event={"ID":"ad3a4fef-e087-471e-9ae7-879f0864939a","Type":"ContainerDied","Data":"3161a8555589cf18ff7be805842557e6ec0eba94dcfc209c9496aae337793b77"}
Apr 20 20:42:43.275987 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.275928 2574 scope.go:117] "RemoveContainer" containerID="4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526"
Apr 20 20:42:43.288237 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.288223 2574 scope.go:117] "RemoveContainer" containerID="78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a"
Apr 20 20:42:43.302624 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.302604 2574 scope.go:117] "RemoveContainer" containerID="25613abc51d4be1c1f00d83d785bc98be1578960bb8d39dd1c65aa9eecf9ee0b"
Apr 20 20:42:43.303923 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.303908 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg"]
Apr 20 20:42:43.308724 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.308703 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-blfvg"]
Apr 20 20:42:43.316775 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.316756 2574 scope.go:117] "RemoveContainer" containerID="4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526"
Apr 20 20:42:43.317037 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:42:43.317017 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526\": container with ID starting with 4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526 not found: ID does not exist" containerID="4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526"
Apr 20 20:42:43.317113 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.317049 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526"} err="failed to get container status \"4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526\": rpc error: code = NotFound desc = could not find container \"4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526\": container with ID starting with 4988ec24e97027b1f0a987d00406d93e27ae5490fbc7632542dc06e253522526 not found: ID does not exist"
Apr 20 20:42:43.317113 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.317073 2574 scope.go:117] "RemoveContainer" containerID="78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a"
Apr 20 20:42:43.317351 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:42:43.317333 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a\": container with ID starting with 78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a not found: ID does not exist" containerID="78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a"
Apr 20 20:42:43.317429 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.317360 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a"} err="failed to get container status \"78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a\": rpc error: code = NotFound desc = could not find container \"78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a\": container with ID starting with 78a63335505a845c259e5bda5637e07c994cfa5df8502420a67c908ef12ec61a not found: ID does not exist"
Apr 20 20:42:43.317429 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.317380 2574 scope.go:117] "RemoveContainer" containerID="25613abc51d4be1c1f00d83d785bc98be1578960bb8d39dd1c65aa9eecf9ee0b"
Apr 20 20:42:43.317649 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:42:43.317621 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25613abc51d4be1c1f00d83d785bc98be1578960bb8d39dd1c65aa9eecf9ee0b\": container with ID starting with 25613abc51d4be1c1f00d83d785bc98be1578960bb8d39dd1c65aa9eecf9ee0b not found: ID does not exist" containerID="25613abc51d4be1c1f00d83d785bc98be1578960bb8d39dd1c65aa9eecf9ee0b"
Apr 20 20:42:43.317709 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.317656 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25613abc51d4be1c1f00d83d785bc98be1578960bb8d39dd1c65aa9eecf9ee0b"} err="failed to get container status \"25613abc51d4be1c1f00d83d785bc98be1578960bb8d39dd1c65aa9eecf9ee0b\": rpc error: code = NotFound desc = could not find container \"25613abc51d4be1c1f00d83d785bc98be1578960bb8d39dd1c65aa9eecf9ee0b\": container with ID starting with 25613abc51d4be1c1f00d83d785bc98be1578960bb8d39dd1c65aa9eecf9ee0b not found: ID does not exist"
Apr 20 20:42:43.483944 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:43.483903 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" path="/var/lib/kubelet/pods/ad3a4fef-e087-471e-9ae7-879f0864939a/volumes"
Apr 20 20:42:44.281054 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:44.281023 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" event={"ID":"7bbbaa06-fdd4-4d95-b0be-29283f8c708e","Type":"ContainerStarted","Data":"7a141b630dc3612b0784e232c8d01edc5225f7ff694c76069e4ba5987b6242c9"}
Apr 20 20:42:44.281054 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:44.281057 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" event={"ID":"7bbbaa06-fdd4-4d95-b0be-29283f8c708e","Type":"ContainerStarted","Data":"845f3c0f91bf996160e0e3059285af83d8ccf92a99d3a5e3e746fe75a7322336"}
Apr 20 20:42:44.281597 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:44.281359 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:44.281597 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:44.281384 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:44.282567 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:44.282542 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 20 20:42:44.301593 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:44.301506 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podStartSLOduration=7.301481094 podStartE2EDuration="7.301481094s" podCreationTimestamp="2026-04-20 20:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:42:44.297989744 +0000 UTC m=+2219.407045451" watchObservedRunningTime="2026-04-20 20:42:44.301481094 +0000 UTC m=+2219.410536802"
Apr 20 20:42:45.285171 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:45.285134 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 20 20:42:50.291144 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:50.291113 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:42:50.291710 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:42:50.291686 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 20 20:43:00.292657 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:43:00.292614 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 20 20:43:10.292329 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:43:10.292288 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 20 20:43:20.291958 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:43:20.291919 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 20 20:43:30.291766 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:43:30.291722 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 20 20:43:40.292450 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:43:40.292413 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 20 20:43:50.291708 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:43:50.291665 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 20 20:43:53.479463 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:43:53.479412 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 20 20:44:03.482610 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:03.482580 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"
Apr 20 20:44:08.151377 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.151341 2574 kubelet.go:2553] "SyncLoop DELETE" source="api"
pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"] Apr 20 20:44:08.151744 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.151667 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container" containerID="cri-o://845f3c0f91bf996160e0e3059285af83d8ccf92a99d3a5e3e746fe75a7322336" gracePeriod=30 Apr 20 20:44:08.151744 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.151682 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kube-rbac-proxy" containerID="cri-o://7a141b630dc3612b0784e232c8d01edc5225f7ff694c76069e4ba5987b6242c9" gracePeriod=30 Apr 20 20:44:08.263927 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.263894 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5"] Apr 20 20:44:08.264372 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.264339 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="storage-initializer" Apr 20 20:44:08.264372 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.264373 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="storage-initializer" Apr 20 20:44:08.264580 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.264381 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" Apr 20 20:44:08.264580 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.264387 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" 
containerName="kserve-container" Apr 20 20:44:08.264580 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.264393 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kube-rbac-proxy" Apr 20 20:44:08.264580 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.264399 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kube-rbac-proxy" Apr 20 20:44:08.264580 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.264457 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kserve-container" Apr 20 20:44:08.264580 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.264465 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad3a4fef-e087-471e-9ae7-879f0864939a" containerName="kube-rbac-proxy" Apr 20 20:44:08.267477 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.267458 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.269830 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.269814 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 20 20:44:08.269896 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.269853 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 20 20:44:08.277834 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.277810 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5"] Apr 20 20:44:08.343973 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.343934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp2h5\" (UniqueName: \"kubernetes.io/projected/75e87178-7bd1-427a-b037-cac46cca9f3f-kube-api-access-sp2h5\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.343973 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.343977 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75e87178-7bd1-427a-b037-cac46cca9f3f-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.344203 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.343995 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/75e87178-7bd1-427a-b037-cac46cca9f3f-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.344203 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.344083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75e87178-7bd1-427a-b037-cac46cca9f3f-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.445461 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.445372 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp2h5\" (UniqueName: \"kubernetes.io/projected/75e87178-7bd1-427a-b037-cac46cca9f3f-kube-api-access-sp2h5\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.445461 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.445417 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75e87178-7bd1-427a-b037-cac46cca9f3f-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.445461 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.445442 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/75e87178-7bd1-427a-b037-cac46cca9f3f-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.445697 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.445494 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75e87178-7bd1-427a-b037-cac46cca9f3f-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.445697 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:44:08.445544 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-serving-cert: secret "isvc-predictive-sklearn-v2-predictor-serving-cert" not found Apr 20 20:44:08.445697 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:44:08.445614 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75e87178-7bd1-427a-b037-cac46cca9f3f-proxy-tls podName:75e87178-7bd1-427a-b037-cac46cca9f3f nodeName:}" failed. No retries permitted until 2026-04-20 20:44:08.94559192 +0000 UTC m=+2304.054647608 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/75e87178-7bd1-427a-b037-cac46cca9f3f-proxy-tls") pod "isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" (UID: "75e87178-7bd1-427a-b037-cac46cca9f3f") : secret "isvc-predictive-sklearn-v2-predictor-serving-cert" not found Apr 20 20:44:08.445877 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.445856 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75e87178-7bd1-427a-b037-cac46cca9f3f-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.446180 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.446160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75e87178-7bd1-427a-b037-cac46cca9f3f-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.456222 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.456197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp2h5\" (UniqueName: \"kubernetes.io/projected/75e87178-7bd1-427a-b037-cac46cca9f3f-kube-api-access-sp2h5\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.540620 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.540587 2574 generic.go:358] "Generic (PLEG): container finished" podID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" 
containerID="7a141b630dc3612b0784e232c8d01edc5225f7ff694c76069e4ba5987b6242c9" exitCode=2 Apr 20 20:44:08.540771 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.540659 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" event={"ID":"7bbbaa06-fdd4-4d95-b0be-29283f8c708e","Type":"ContainerDied","Data":"7a141b630dc3612b0784e232c8d01edc5225f7ff694c76069e4ba5987b6242c9"} Apr 20 20:44:08.949323 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.949290 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75e87178-7bd1-427a-b037-cac46cca9f3f-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:08.952047 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:08.952019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75e87178-7bd1-427a-b037-cac46cca9f3f-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:09.178437 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:09.178396 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:09.302465 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:09.302442 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5"] Apr 20 20:44:09.304799 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:44:09.304771 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75e87178_7bd1_427a_b037_cac46cca9f3f.slice/crio-0cb8973a98ddd8b0b3db93c31c3fc3ef3ee55ecb3cee753114098994681a9311 WatchSource:0}: Error finding container 0cb8973a98ddd8b0b3db93c31c3fc3ef3ee55ecb3cee753114098994681a9311: Status 404 returned error can't find the container with id 0cb8973a98ddd8b0b3db93c31c3fc3ef3ee55ecb3cee753114098994681a9311 Apr 20 20:44:09.545754 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:09.545667 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" event={"ID":"75e87178-7bd1-427a-b037-cac46cca9f3f","Type":"ContainerStarted","Data":"525da373b440a31071d356123793d6d1219464aaf74ed30993889c296195b586"} Apr 20 20:44:09.545754 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:09.545711 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" event={"ID":"75e87178-7bd1-427a-b037-cac46cca9f3f","Type":"ContainerStarted","Data":"0cb8973a98ddd8b0b3db93c31c3fc3ef3ee55ecb3cee753114098994681a9311"} Apr 20 20:44:10.285752 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:10.285704 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: 
connect: connection refused" Apr 20 20:44:13.479937 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:13.479901 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 20 20:44:13.559756 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:13.559720 2574 generic.go:358] "Generic (PLEG): container finished" podID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerID="525da373b440a31071d356123793d6d1219464aaf74ed30993889c296195b586" exitCode=0 Apr 20 20:44:13.559896 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:13.559764 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" event={"ID":"75e87178-7bd1-427a-b037-cac46cca9f3f","Type":"ContainerDied","Data":"525da373b440a31071d356123793d6d1219464aaf74ed30993889c296195b586"} Apr 20 20:44:14.564534 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:14.564435 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" event={"ID":"75e87178-7bd1-427a-b037-cac46cca9f3f","Type":"ContainerStarted","Data":"814ef44bd4c5e85f3736b22b5c542bc48bc8d9b31bd43c9f0a3b505e7e60ccff"} Apr 20 20:44:14.564534 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:14.564484 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" event={"ID":"75e87178-7bd1-427a-b037-cac46cca9f3f","Type":"ContainerStarted","Data":"813dd37f621a516745a4bfa838545301ed5d324295a1bfe670499b26e1f3f5be"} Apr 20 20:44:14.564990 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:14.564716 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:14.564990 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:14.564746 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:14.566362 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:14.566337 2574 generic.go:358] "Generic (PLEG): container finished" podID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerID="845f3c0f91bf996160e0e3059285af83d8ccf92a99d3a5e3e746fe75a7322336" exitCode=0 Apr 20 20:44:14.566449 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:14.566404 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" event={"ID":"7bbbaa06-fdd4-4d95-b0be-29283f8c708e","Type":"ContainerDied","Data":"845f3c0f91bf996160e0e3059285af83d8ccf92a99d3a5e3e746fe75a7322336"} Apr 20 20:44:14.582157 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:14.582107 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" podStartSLOduration=6.582089857 podStartE2EDuration="6.582089857s" podCreationTimestamp="2026-04-20 20:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:44:14.580975845 +0000 UTC m=+2309.690031552" watchObservedRunningTime="2026-04-20 20:44:14.582089857 +0000 UTC m=+2309.691145566" Apr 20 20:44:14.893760 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:14.893738 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" Apr 20 20:44:15.003187 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.003150 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " Apr 20 20:44:15.003353 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.003211 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-proxy-tls\") pod \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " Apr 20 20:44:15.003353 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.003253 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-kserve-provision-location\") pod \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " Apr 20 20:44:15.003444 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.003360 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkhvd\" (UniqueName: \"kubernetes.io/projected/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-kube-api-access-wkhvd\") pod \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\" (UID: \"7bbbaa06-fdd4-4d95-b0be-29283f8c708e\") " Apr 20 20:44:15.003636 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.003610 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "7bbbaa06-fdd4-4d95-b0be-29283f8c708e" (UID: "7bbbaa06-fdd4-4d95-b0be-29283f8c708e"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:44:15.003708 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.003680 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7bbbaa06-fdd4-4d95-b0be-29283f8c708e" (UID: "7bbbaa06-fdd4-4d95-b0be-29283f8c708e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:44:15.005555 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.005527 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-kube-api-access-wkhvd" (OuterVolumeSpecName: "kube-api-access-wkhvd") pod "7bbbaa06-fdd4-4d95-b0be-29283f8c708e" (UID: "7bbbaa06-fdd4-4d95-b0be-29283f8c708e"). InnerVolumeSpecName "kube-api-access-wkhvd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:44:15.005687 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.005599 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7bbbaa06-fdd4-4d95-b0be-29283f8c708e" (UID: "7bbbaa06-fdd4-4d95-b0be-29283f8c708e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:44:15.104087 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.104044 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:44:15.104087 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.104088 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:44:15.104323 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.104105 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wkhvd\" (UniqueName: \"kubernetes.io/projected/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-kube-api-access-wkhvd\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:44:15.104323 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.104117 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7bbbaa06-fdd4-4d95-b0be-29283f8c708e-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:44:15.570763 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.570723 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" event={"ID":"7bbbaa06-fdd4-4d95-b0be-29283f8c708e","Type":"ContainerDied","Data":"b6672664fbc40787b3ef4bcd1e1a4b624a9b58b21c67fc018452541fdb5646f1"} Apr 20 20:44:15.570763 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.570753 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb" Apr 20 20:44:15.571277 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.570776 2574 scope.go:117] "RemoveContainer" containerID="7a141b630dc3612b0784e232c8d01edc5225f7ff694c76069e4ba5987b6242c9" Apr 20 20:44:15.579206 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.579188 2574 scope.go:117] "RemoveContainer" containerID="845f3c0f91bf996160e0e3059285af83d8ccf92a99d3a5e3e746fe75a7322336" Apr 20 20:44:15.586939 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.586913 2574 scope.go:117] "RemoveContainer" containerID="23aa84d354e2e76604037b9891d5318cc0ce3cf585077634a84988042ab4dd40" Apr 20 20:44:15.588608 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.588567 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"] Apr 20 20:44:15.590208 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:15.590185 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-mzsrb"] Apr 20 20:44:17.483299 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:17.483250 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" path="/var/lib/kubelet/pods/7bbbaa06-fdd4-4d95-b0be-29283f8c708e/volumes" Apr 20 20:44:20.576527 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:20.576494 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:44:50.577635 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:44:50.577596 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.134.0.34:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.34:8080: connect: connection refused" Apr 20 20:45:00.577860 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:00.577818 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.34:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.34:8080: connect: connection refused" Apr 20 20:45:10.577350 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:10.577295 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.34:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.34:8080: connect: connection refused" Apr 20 20:45:20.577546 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:20.577507 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.34:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.34:8080: connect: connection refused" Apr 20 20:45:30.580714 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:30.580679 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" Apr 20 20:45:38.388024 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.387980 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5"] Apr 20 20:45:38.389006 ip-10-0-139-59 
kubenswrapper[2574]: I0420 20:45:38.388938 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kserve-container" containerID="cri-o://813dd37f621a516745a4bfa838545301ed5d324295a1bfe670499b26e1f3f5be" gracePeriod=30
Apr 20 20:45:38.389158 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.388985 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kube-rbac-proxy" containerID="cri-o://814ef44bd4c5e85f3736b22b5c542bc48bc8d9b31bd43c9f0a3b505e7e60ccff" gracePeriod=30
Apr 20 20:45:38.486277 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.486230 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"]
Apr 20 20:45:38.486685 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.486667 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kube-rbac-proxy"
Apr 20 20:45:38.486750 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.486687 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kube-rbac-proxy"
Apr 20 20:45:38.486750 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.486698 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container"
Apr 20 20:45:38.486750 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.486703 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container"
Apr 20 20:45:38.486750 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.486713 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="storage-initializer"
Apr 20 20:45:38.486750 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.486721 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="storage-initializer"
Apr 20 20:45:38.486927 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.486799 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kube-rbac-proxy"
Apr 20 20:45:38.486927 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.486808 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bbbaa06-fdd4-4d95-b0be-29283f8c708e" containerName="kserve-container"
Apr 20 20:45:38.490028 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.490001 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:38.492114 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.492090 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\""
Apr 20 20:45:38.492222 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.492120 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\""
Apr 20 20:45:38.500533 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.500512 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"]
Apr 20 20:45:38.595173 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.595139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33896813-d990-4343-8b09-55a064b81ba3-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:38.595353 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.595227 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx2kf\" (UniqueName: \"kubernetes.io/projected/33896813-d990-4343-8b09-55a064b81ba3-kube-api-access-rx2kf\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:38.595353 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.595275 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33896813-d990-4343-8b09-55a064b81ba3-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:38.595353 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.595328 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33896813-d990-4343-8b09-55a064b81ba3-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:38.696555 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.696527 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33896813-d990-4343-8b09-55a064b81ba3-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:38.696711 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.696582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx2kf\" (UniqueName: \"kubernetes.io/projected/33896813-d990-4343-8b09-55a064b81ba3-kube-api-access-rx2kf\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:38.696711 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.696603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33896813-d990-4343-8b09-55a064b81ba3-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:38.696711 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.696624 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33896813-d990-4343-8b09-55a064b81ba3-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:38.696711 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:45:38.696695 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-serving-cert: secret "isvc-predictive-xgboost-v2-predictor-serving-cert" not found
Apr 20 20:45:38.696911 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:45:38.696764 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33896813-d990-4343-8b09-55a064b81ba3-proxy-tls podName:33896813-d990-4343-8b09-55a064b81ba3 nodeName:}" failed. No retries permitted until 2026-04-20 20:45:39.196742677 +0000 UTC m=+2394.305798365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/33896813-d990-4343-8b09-55a064b81ba3-proxy-tls") pod "isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" (UID: "33896813-d990-4343-8b09-55a064b81ba3") : secret "isvc-predictive-xgboost-v2-predictor-serving-cert" not found
Apr 20 20:45:38.696955 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.696933 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33896813-d990-4343-8b09-55a064b81ba3-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:38.697240 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.697221 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33896813-d990-4343-8b09-55a064b81ba3-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:38.704813 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.704784 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx2kf\" (UniqueName: \"kubernetes.io/projected/33896813-d990-4343-8b09-55a064b81ba3-kube-api-access-rx2kf\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:38.828804 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.828771 2574 generic.go:358] "Generic (PLEG): container finished" podID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerID="814ef44bd4c5e85f3736b22b5c542bc48bc8d9b31bd43c9f0a3b505e7e60ccff" exitCode=2
Apr 20 20:45:38.828949 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:38.828837 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" event={"ID":"75e87178-7bd1-427a-b037-cac46cca9f3f","Type":"ContainerDied","Data":"814ef44bd4c5e85f3736b22b5c542bc48bc8d9b31bd43c9f0a3b505e7e60ccff"}
Apr 20 20:45:39.201146 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:39.201113 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33896813-d990-4343-8b09-55a064b81ba3-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:39.203603 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:39.203583 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33896813-d990-4343-8b09-55a064b81ba3-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:39.402926 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:39.402893 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:39.530373 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:39.530314 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"]
Apr 20 20:45:39.533081 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:45:39.533048 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33896813_d990_4343_8b09_55a064b81ba3.slice/crio-94e464b6f0029eb02ae5bdd2b17b2cabc987c56d064e68b4af43067bb606bdbd WatchSource:0}: Error finding container 94e464b6f0029eb02ae5bdd2b17b2cabc987c56d064e68b4af43067bb606bdbd: Status 404 returned error can't find the container with id 94e464b6f0029eb02ae5bdd2b17b2cabc987c56d064e68b4af43067bb606bdbd
Apr 20 20:45:39.535299 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:39.535255 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:45:39.833695 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:39.833607 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" event={"ID":"33896813-d990-4343-8b09-55a064b81ba3","Type":"ContainerStarted","Data":"66ef5a634a7322c251b58e8d9e8d53db5dcbec4421261c891869ea1cf49e53cf"}
Apr 20 20:45:39.833695 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:39.833639 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" event={"ID":"33896813-d990-4343-8b09-55a064b81ba3","Type":"ContainerStarted","Data":"94e464b6f0029eb02ae5bdd2b17b2cabc987c56d064e68b4af43067bb606bdbd"}
Apr 20 20:45:40.572483 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:40.572441 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused"
Apr 20 20:45:40.577923 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:40.577892 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.34:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 20 20:45:42.844448 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:42.844415 2574 generic.go:358] "Generic (PLEG): container finished" podID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerID="813dd37f621a516745a4bfa838545301ed5d324295a1bfe670499b26e1f3f5be" exitCode=0
Apr 20 20:45:42.844782 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:42.844495 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" event={"ID":"75e87178-7bd1-427a-b037-cac46cca9f3f","Type":"ContainerDied","Data":"813dd37f621a516745a4bfa838545301ed5d324295a1bfe670499b26e1f3f5be"}
Apr 20 20:45:42.923752 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:42.923730 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5"
Apr 20 20:45:43.031908 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.031869 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75e87178-7bd1-427a-b037-cac46cca9f3f-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"75e87178-7bd1-427a-b037-cac46cca9f3f\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") "
Apr 20 20:45:43.032076 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.031929 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp2h5\" (UniqueName: \"kubernetes.io/projected/75e87178-7bd1-427a-b037-cac46cca9f3f-kube-api-access-sp2h5\") pod \"75e87178-7bd1-427a-b037-cac46cca9f3f\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") "
Apr 20 20:45:43.032076 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.031959 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75e87178-7bd1-427a-b037-cac46cca9f3f-proxy-tls\") pod \"75e87178-7bd1-427a-b037-cac46cca9f3f\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") "
Apr 20 20:45:43.032076 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.031977 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75e87178-7bd1-427a-b037-cac46cca9f3f-kserve-provision-location\") pod \"75e87178-7bd1-427a-b037-cac46cca9f3f\" (UID: \"75e87178-7bd1-427a-b037-cac46cca9f3f\") "
Apr 20 20:45:43.032249 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.032216 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e87178-7bd1-427a-b037-cac46cca9f3f-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "75e87178-7bd1-427a-b037-cac46cca9f3f" (UID: "75e87178-7bd1-427a-b037-cac46cca9f3f"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:45:43.032401 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.032381 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e87178-7bd1-427a-b037-cac46cca9f3f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "75e87178-7bd1-427a-b037-cac46cca9f3f" (UID: "75e87178-7bd1-427a-b037-cac46cca9f3f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:45:43.034178 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.034128 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e87178-7bd1-427a-b037-cac46cca9f3f-kube-api-access-sp2h5" (OuterVolumeSpecName: "kube-api-access-sp2h5") pod "75e87178-7bd1-427a-b037-cac46cca9f3f" (UID: "75e87178-7bd1-427a-b037-cac46cca9f3f"). InnerVolumeSpecName "kube-api-access-sp2h5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:45:43.034178 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.034157 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e87178-7bd1-427a-b037-cac46cca9f3f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "75e87178-7bd1-427a-b037-cac46cca9f3f" (UID: "75e87178-7bd1-427a-b037-cac46cca9f3f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:45:43.132629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.132595 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75e87178-7bd1-427a-b037-cac46cca9f3f-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:45:43.132629 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.132630 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sp2h5\" (UniqueName: \"kubernetes.io/projected/75e87178-7bd1-427a-b037-cac46cca9f3f-kube-api-access-sp2h5\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:45:43.132825 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.132645 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75e87178-7bd1-427a-b037-cac46cca9f3f-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:45:43.132825 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.132661 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75e87178-7bd1-427a-b037-cac46cca9f3f-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 20:45:43.848423 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.848381 2574 generic.go:358] "Generic (PLEG): container finished" podID="33896813-d990-4343-8b09-55a064b81ba3" containerID="66ef5a634a7322c251b58e8d9e8d53db5dcbec4421261c891869ea1cf49e53cf" exitCode=0
Apr 20 20:45:43.848753 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.848452 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" event={"ID":"33896813-d990-4343-8b09-55a064b81ba3","Type":"ContainerDied","Data":"66ef5a634a7322c251b58e8d9e8d53db5dcbec4421261c891869ea1cf49e53cf"}
Apr 20 20:45:43.850324 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.850299 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5" event={"ID":"75e87178-7bd1-427a-b037-cac46cca9f3f","Type":"ContainerDied","Data":"0cb8973a98ddd8b0b3db93c31c3fc3ef3ee55ecb3cee753114098994681a9311"}
Apr 20 20:45:43.850440 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.850342 2574 scope.go:117] "RemoveContainer" containerID="814ef44bd4c5e85f3736b22b5c542bc48bc8d9b31bd43c9f0a3b505e7e60ccff"
Apr 20 20:45:43.850440 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.850353 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5"
Apr 20 20:45:43.858442 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.858426 2574 scope.go:117] "RemoveContainer" containerID="813dd37f621a516745a4bfa838545301ed5d324295a1bfe670499b26e1f3f5be"
Apr 20 20:45:43.868424 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.868391 2574 scope.go:117] "RemoveContainer" containerID="525da373b440a31071d356123793d6d1219464aaf74ed30993889c296195b586"
Apr 20 20:45:43.881379 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.881360 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5"]
Apr 20 20:45:43.884910 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:43.884889 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-cfmz5"]
Apr 20 20:45:44.856343 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:44.856305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" event={"ID":"33896813-d990-4343-8b09-55a064b81ba3","Type":"ContainerStarted","Data":"34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa"}
Apr 20 20:45:44.856343 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:44.856344 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" event={"ID":"33896813-d990-4343-8b09-55a064b81ba3","Type":"ContainerStarted","Data":"57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f"}
Apr 20 20:45:44.856740 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:44.856585 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:44.856740 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:44.856640 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:45:44.877943 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:44.877897 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" podStartSLOduration=6.877884615 podStartE2EDuration="6.877884615s" podCreationTimestamp="2026-04-20 20:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:45:44.875578864 +0000 UTC m=+2399.984634581" watchObservedRunningTime="2026-04-20 20:45:44.877884615 +0000 UTC m=+2399.986940316"
Apr 20 20:45:45.482924 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:45.482857 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" path="/var/lib/kubelet/pods/75e87178-7bd1-427a-b037-cac46cca9f3f/volumes"
Apr 20 20:45:48.432245 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:48.432217 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log"
Apr 20 20:45:48.434496 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:48.434473 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log"
Apr 20 20:45:48.438714 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:48.438696 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log"
Apr 20 20:45:48.440570 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:48.440547 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log"
Apr 20 20:45:50.864537 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:45:50.864462 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:46:20.865710 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:46:20.865665 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.35:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 20 20:46:30.865277 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:46:30.865223 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.35:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 20 20:46:40.865558 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:46:40.865516 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.35:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 20 20:46:50.865613 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:46:50.865571 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.35:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 20 20:46:58.482144 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:46:58.482111 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"
Apr 20 20:47:08.603245 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.603212 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"]
Apr 20 20:47:08.603782 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.603751 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="kserve-container" containerID="cri-o://57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f" gracePeriod=30
Apr 20 20:47:08.603947 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.603901 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="kube-rbac-proxy" containerID="cri-o://34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa" gracePeriod=30
Apr 20 20:47:08.704231 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.704195 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"]
Apr 20 20:47:08.704562 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.704550 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kserve-container"
Apr 20 20:47:08.704562 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.704563 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kserve-container"
Apr 20 20:47:08.704653 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.704572 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kube-rbac-proxy"
Apr 20 20:47:08.704653 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.704578 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kube-rbac-proxy"
Apr 20 20:47:08.704653 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.704584 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="storage-initializer"
Apr 20 20:47:08.704653 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.704590 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="storage-initializer"
Apr 20 20:47:08.704653 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.704649 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kube-rbac-proxy"
Apr 20 20:47:08.704831 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.704663 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="75e87178-7bd1-427a-b037-cac46cca9f3f" containerName="kserve-container"
Apr 20 20:47:08.707808 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.707793 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:08.710108 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.710083 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\""
Apr 20 20:47:08.710108 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.710095 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\""
Apr 20 20:47:08.715776 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.715726 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"]
Apr 20 20:47:08.844922 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.844880 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7f7d20c-da73-41a9-9168-87119a84f4be-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:08.844922 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.844917 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8tfj\" (UniqueName: \"kubernetes.io/projected/d7f7d20c-da73-41a9-9168-87119a84f4be-kube-api-access-j8tfj\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:08.845156 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.844989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7f7d20c-da73-41a9-9168-87119a84f4be-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:08.845156 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.845028 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d7f7d20c-da73-41a9-9168-87119a84f4be-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:08.945672 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.945639 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7f7d20c-da73-41a9-9168-87119a84f4be-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:08.945878 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.945718 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d7f7d20c-da73-41a9-9168-87119a84f4be-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:08.945878 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.945778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7f7d20c-da73-41a9-9168-87119a84f4be-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:08.945878 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.945805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8tfj\" (UniqueName: \"kubernetes.io/projected/d7f7d20c-da73-41a9-9168-87119a84f4be-kube-api-access-j8tfj\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:08.946123 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.946095 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7f7d20c-da73-41a9-9168-87119a84f4be-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:08.946426 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.946404 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d7f7d20c-da73-41a9-9168-87119a84f4be-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:08.948381 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.948359 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7f7d20c-da73-41a9-9168-87119a84f4be-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:08.953795 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:08.953773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8tfj\" (UniqueName: \"kubernetes.io/projected/d7f7d20c-da73-41a9-9168-87119a84f4be-kube-api-access-j8tfj\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:09.020540 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:09.020505 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"
Apr 20 20:47:09.106406 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:09.106374 2574 generic.go:358] "Generic (PLEG): container finished" podID="33896813-d990-4343-8b09-55a064b81ba3" containerID="34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa" exitCode=2
Apr 20 20:47:09.106406 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:09.106397 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" event={"ID":"33896813-d990-4343-8b09-55a064b81ba3","Type":"ContainerDied","Data":"34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa"}
Apr 20 20:47:09.150624 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:09.150601 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"]
Apr 20 20:47:09.152655 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:47:09.152630 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7f7d20c_da73_41a9_9168_87119a84f4be.slice/crio-d4cf8a692e41f40daa5dcb78f884581f6ba583e6527488517f8e97d20972544d WatchSource:0}: Error finding container d4cf8a692e41f40daa5dcb78f884581f6ba583e6527488517f8e97d20972544d: Status 404 returned error can't find the container with id d4cf8a692e41f40daa5dcb78f884581f6ba583e6527488517f8e97d20972544d
Apr 20 20:47:10.110954 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:10.110918 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" event={"ID":"d7f7d20c-da73-41a9-9168-87119a84f4be","Type":"ContainerStarted","Data":"ead22034a66a279c55935781da0fbc8e29ce4539067e00fc73494894f66d3486"}
Apr 20 20:47:10.110954 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:10.110955 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" event={"ID":"d7f7d20c-da73-41a9-9168-87119a84f4be","Type":"ContainerStarted","Data":"d4cf8a692e41f40daa5dcb78f884581f6ba583e6527488517f8e97d20972544d"} Apr 20 20:47:10.860280 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:10.860234 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.35:8643/healthz\": dial tcp 10.134.0.35:8643: connect: connection refused" Apr 20 20:47:13.121240 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.121206 2574 generic.go:358] "Generic (PLEG): container finished" podID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerID="ead22034a66a279c55935781da0fbc8e29ce4539067e00fc73494894f66d3486" exitCode=0 Apr 20 20:47:13.121669 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.121308 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" event={"ID":"d7f7d20c-da73-41a9-9168-87119a84f4be","Type":"ContainerDied","Data":"ead22034a66a279c55935781da0fbc8e29ce4539067e00fc73494894f66d3486"} Apr 20 20:47:13.638076 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.638052 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" Apr 20 20:47:13.789182 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.789151 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33896813-d990-4343-8b09-55a064b81ba3-kserve-provision-location\") pod \"33896813-d990-4343-8b09-55a064b81ba3\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " Apr 20 20:47:13.789371 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.789211 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33896813-d990-4343-8b09-55a064b81ba3-proxy-tls\") pod \"33896813-d990-4343-8b09-55a064b81ba3\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " Apr 20 20:47:13.789371 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.789303 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx2kf\" (UniqueName: \"kubernetes.io/projected/33896813-d990-4343-8b09-55a064b81ba3-kube-api-access-rx2kf\") pod \"33896813-d990-4343-8b09-55a064b81ba3\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " Apr 20 20:47:13.789371 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.789358 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33896813-d990-4343-8b09-55a064b81ba3-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"33896813-d990-4343-8b09-55a064b81ba3\" (UID: \"33896813-d990-4343-8b09-55a064b81ba3\") " Apr 20 20:47:13.789557 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.789536 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33896813-d990-4343-8b09-55a064b81ba3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"33896813-d990-4343-8b09-55a064b81ba3" (UID: "33896813-d990-4343-8b09-55a064b81ba3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:47:13.789626 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.789599 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33896813-d990-4343-8b09-55a064b81ba3-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:47:13.789834 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.789802 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33896813-d990-4343-8b09-55a064b81ba3-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "33896813-d990-4343-8b09-55a064b81ba3" (UID: "33896813-d990-4343-8b09-55a064b81ba3"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:47:13.791450 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.791421 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33896813-d990-4343-8b09-55a064b81ba3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "33896813-d990-4343-8b09-55a064b81ba3" (UID: "33896813-d990-4343-8b09-55a064b81ba3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:47:13.791578 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.791499 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33896813-d990-4343-8b09-55a064b81ba3-kube-api-access-rx2kf" (OuterVolumeSpecName: "kube-api-access-rx2kf") pod "33896813-d990-4343-8b09-55a064b81ba3" (UID: "33896813-d990-4343-8b09-55a064b81ba3"). InnerVolumeSpecName "kube-api-access-rx2kf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:47:13.890464 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.890373 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33896813-d990-4343-8b09-55a064b81ba3-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:47:13.890464 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.890404 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rx2kf\" (UniqueName: \"kubernetes.io/projected/33896813-d990-4343-8b09-55a064b81ba3-kube-api-access-rx2kf\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:47:13.890464 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:13.890423 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33896813-d990-4343-8b09-55a064b81ba3-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:47:14.125668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.125630 2574 generic.go:358] "Generic (PLEG): container finished" podID="33896813-d990-4343-8b09-55a064b81ba3" containerID="57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f" exitCode=0 Apr 20 20:47:14.126186 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.125722 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" Apr 20 20:47:14.126186 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.125717 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" event={"ID":"33896813-d990-4343-8b09-55a064b81ba3","Type":"ContainerDied","Data":"57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f"} Apr 20 20:47:14.126186 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.125816 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz" event={"ID":"33896813-d990-4343-8b09-55a064b81ba3","Type":"ContainerDied","Data":"94e464b6f0029eb02ae5bdd2b17b2cabc987c56d064e68b4af43067bb606bdbd"} Apr 20 20:47:14.126186 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.125837 2574 scope.go:117] "RemoveContainer" containerID="34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa" Apr 20 20:47:14.127870 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.127849 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" event={"ID":"d7f7d20c-da73-41a9-9168-87119a84f4be","Type":"ContainerStarted","Data":"aab5d26cd24aa412ceb3e7f6548efa01c97cfc15834873163703f5c3a024a605"} Apr 20 20:47:14.127979 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.127875 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" event={"ID":"d7f7d20c-da73-41a9-9168-87119a84f4be","Type":"ContainerStarted","Data":"7e160d8d819e164eb6efdacae8aa3f528ea6c017088958bb36ef5415eb025894"} Apr 20 20:47:14.128099 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.128079 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" Apr 20 20:47:14.128164 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.128112 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" Apr 20 20:47:14.134317 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.134286 2574 scope.go:117] "RemoveContainer" containerID="57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f" Apr 20 20:47:14.141444 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.141429 2574 scope.go:117] "RemoveContainer" containerID="66ef5a634a7322c251b58e8d9e8d53db5dcbec4421261c891869ea1cf49e53cf" Apr 20 20:47:14.146583 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.146547 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" podStartSLOduration=6.146535823 podStartE2EDuration="6.146535823s" podCreationTimestamp="2026-04-20 20:47:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:47:14.144860928 +0000 UTC m=+2489.253916634" watchObservedRunningTime="2026-04-20 20:47:14.146535823 +0000 UTC m=+2489.255591585" Apr 20 20:47:14.149231 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.149214 2574 scope.go:117] "RemoveContainer" containerID="34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa" Apr 20 20:47:14.149501 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:47:14.149482 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa\": container with ID starting with 34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa not found: ID does not exist" 
containerID="34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa" Apr 20 20:47:14.149561 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.149508 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa"} err="failed to get container status \"34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa\": rpc error: code = NotFound desc = could not find container \"34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa\": container with ID starting with 34740d940eb9f908e45975de599e12e94809ed0c14eb254ce31307acce4761fa not found: ID does not exist" Apr 20 20:47:14.149561 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.149523 2574 scope.go:117] "RemoveContainer" containerID="57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f" Apr 20 20:47:14.149753 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:47:14.149734 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f\": container with ID starting with 57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f not found: ID does not exist" containerID="57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f" Apr 20 20:47:14.149806 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.149759 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f"} err="failed to get container status \"57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f\": rpc error: code = NotFound desc = could not find container \"57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f\": container with ID starting with 57d6ac66d11cded74e2ed67ecdfb2bc0150603d18a6fa8580ce080a0b4e96a6f not found: ID does not exist" Apr 20 
20:47:14.149806 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.149776 2574 scope.go:117] "RemoveContainer" containerID="66ef5a634a7322c251b58e8d9e8d53db5dcbec4421261c891869ea1cf49e53cf" Apr 20 20:47:14.149972 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:47:14.149953 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ef5a634a7322c251b58e8d9e8d53db5dcbec4421261c891869ea1cf49e53cf\": container with ID starting with 66ef5a634a7322c251b58e8d9e8d53db5dcbec4421261c891869ea1cf49e53cf not found: ID does not exist" containerID="66ef5a634a7322c251b58e8d9e8d53db5dcbec4421261c891869ea1cf49e53cf" Apr 20 20:47:14.150026 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.149981 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ef5a634a7322c251b58e8d9e8d53db5dcbec4421261c891869ea1cf49e53cf"} err="failed to get container status \"66ef5a634a7322c251b58e8d9e8d53db5dcbec4421261c891869ea1cf49e53cf\": rpc error: code = NotFound desc = could not find container \"66ef5a634a7322c251b58e8d9e8d53db5dcbec4421261c891869ea1cf49e53cf\": container with ID starting with 66ef5a634a7322c251b58e8d9e8d53db5dcbec4421261c891869ea1cf49e53cf not found: ID does not exist" Apr 20 20:47:14.157537 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.157517 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"] Apr 20 20:47:14.160451 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:14.160433 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-wbdxz"] Apr 20 20:47:15.483391 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:15.483350 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33896813-d990-4343-8b09-55a064b81ba3" path="/var/lib/kubelet/pods/33896813-d990-4343-8b09-55a064b81ba3/volumes" Apr 20 
20:47:20.142482 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:20.142454 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" Apr 20 20:47:50.143599 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:47:50.143557 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.36:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.36:8080: connect: connection refused" Apr 20 20:48:00.143432 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:00.143396 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.36:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.36:8080: connect: connection refused" Apr 20 20:48:10.143502 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:10.143460 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.36:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.36:8080: connect: connection refused" Apr 20 20:48:20.143958 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:20.143912 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.134.0.36:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.36:8080: connect: connection refused" Apr 20 20:48:30.146407 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:30.146377 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" Apr 20 20:48:38.793553 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:38.793513 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"] Apr 20 20:48:38.796288 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:38.794655 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kserve-container" containerID="cri-o://7e160d8d819e164eb6efdacae8aa3f528ea6c017088958bb36ef5415eb025894" gracePeriod=30 Apr 20 20:48:38.796288 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:38.794987 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kube-rbac-proxy" containerID="cri-o://aab5d26cd24aa412ceb3e7f6548efa01c97cfc15834873163703f5c3a024a605" gracePeriod=30 Apr 20 20:48:39.401639 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:39.401602 2574 generic.go:358] "Generic (PLEG): container finished" podID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerID="aab5d26cd24aa412ceb3e7f6548efa01c97cfc15834873163703f5c3a024a605" exitCode=2 Apr 20 20:48:39.401830 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:39.401678 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" 
event={"ID":"d7f7d20c-da73-41a9-9168-87119a84f4be","Type":"ContainerDied","Data":"aab5d26cd24aa412ceb3e7f6548efa01c97cfc15834873163703f5c3a024a605"} Apr 20 20:48:40.137440 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:40.137392 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.36:8643/healthz\": dial tcp 10.134.0.36:8643: connect: connection refused" Apr 20 20:48:40.143530 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:40.143492 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.36:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.36:8080: connect: connection refused" Apr 20 20:48:44.418332 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.418296 2574 generic.go:358] "Generic (PLEG): container finished" podID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerID="7e160d8d819e164eb6efdacae8aa3f528ea6c017088958bb36ef5415eb025894" exitCode=0 Apr 20 20:48:44.418719 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.418367 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" event={"ID":"d7f7d20c-da73-41a9-9168-87119a84f4be","Type":"ContainerDied","Data":"7e160d8d819e164eb6efdacae8aa3f528ea6c017088958bb36ef5415eb025894"} Apr 20 20:48:44.640645 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.640619 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" Apr 20 20:48:44.718522 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.718481 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7f7d20c-da73-41a9-9168-87119a84f4be-kserve-provision-location\") pod \"d7f7d20c-da73-41a9-9168-87119a84f4be\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " Apr 20 20:48:44.718522 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.718531 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7f7d20c-da73-41a9-9168-87119a84f4be-proxy-tls\") pod \"d7f7d20c-da73-41a9-9168-87119a84f4be\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " Apr 20 20:48:44.718739 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.718564 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d7f7d20c-da73-41a9-9168-87119a84f4be-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"d7f7d20c-da73-41a9-9168-87119a84f4be\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " Apr 20 20:48:44.718739 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.718584 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8tfj\" (UniqueName: \"kubernetes.io/projected/d7f7d20c-da73-41a9-9168-87119a84f4be-kube-api-access-j8tfj\") pod \"d7f7d20c-da73-41a9-9168-87119a84f4be\" (UID: \"d7f7d20c-da73-41a9-9168-87119a84f4be\") " Apr 20 20:48:44.718951 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.718924 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f7d20c-da73-41a9-9168-87119a84f4be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"d7f7d20c-da73-41a9-9168-87119a84f4be" (UID: "d7f7d20c-da73-41a9-9168-87119a84f4be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:48:44.718989 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.718953 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f7d20c-da73-41a9-9168-87119a84f4be-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "d7f7d20c-da73-41a9-9168-87119a84f4be" (UID: "d7f7d20c-da73-41a9-9168-87119a84f4be"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:48:44.720958 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.720934 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f7d20c-da73-41a9-9168-87119a84f4be-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d7f7d20c-da73-41a9-9168-87119a84f4be" (UID: "d7f7d20c-da73-41a9-9168-87119a84f4be"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:48:44.721024 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.720971 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7f7d20c-da73-41a9-9168-87119a84f4be-kube-api-access-j8tfj" (OuterVolumeSpecName: "kube-api-access-j8tfj") pod "d7f7d20c-da73-41a9-9168-87119a84f4be" (UID: "d7f7d20c-da73-41a9-9168-87119a84f4be"). InnerVolumeSpecName "kube-api-access-j8tfj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:48:44.820012 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.819928 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7f7d20c-da73-41a9-9168-87119a84f4be-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:48:44.820012 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.819959 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7f7d20c-da73-41a9-9168-87119a84f4be-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:48:44.820012 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.819972 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d7f7d20c-da73-41a9-9168-87119a84f4be-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:48:44.820012 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:44.819982 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8tfj\" (UniqueName: \"kubernetes.io/projected/d7f7d20c-da73-41a9-9168-87119a84f4be-kube-api-access-j8tfj\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:48:45.423252 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:45.423218 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" event={"ID":"d7f7d20c-da73-41a9-9168-87119a84f4be","Type":"ContainerDied","Data":"d4cf8a692e41f40daa5dcb78f884581f6ba583e6527488517f8e97d20972544d"} Apr 20 20:48:45.423252 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:45.423239 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv" Apr 20 20:48:45.423769 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:45.423288 2574 scope.go:117] "RemoveContainer" containerID="aab5d26cd24aa412ceb3e7f6548efa01c97cfc15834873163703f5c3a024a605" Apr 20 20:48:45.431508 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:45.431428 2574 scope.go:117] "RemoveContainer" containerID="7e160d8d819e164eb6efdacae8aa3f528ea6c017088958bb36ef5415eb025894" Apr 20 20:48:45.438832 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:45.438813 2574 scope.go:117] "RemoveContainer" containerID="ead22034a66a279c55935781da0fbc8e29ce4539067e00fc73494894f66d3486" Apr 20 20:48:45.446501 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:45.446476 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"] Apr 20 20:48:45.449022 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:45.448997 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hrdhv"] Apr 20 20:48:45.487754 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:48:45.487718 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" path="/var/lib/kubelet/pods/d7f7d20c-da73-41a9-9168-87119a84f4be/volumes" Apr 20 20:50:48.455507 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:50:48.455431 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:50:48.458988 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:50:48.458962 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:50:48.461835 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:50:48.461813 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:50:48.464931 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:50:48.464911 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:55:19.200507 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.200468 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44"] Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.200941 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="kserve-container" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.200957 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="kserve-container" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.200968 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="kube-rbac-proxy" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.200973 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="kube-rbac-proxy" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.200984 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kube-rbac-proxy" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.200990 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kube-rbac-proxy" Apr 20 
20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.200999 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="storage-initializer" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.201004 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="storage-initializer" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.201010 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kserve-container" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.201015 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kserve-container" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.201028 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="storage-initializer" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.201033 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="storage-initializer" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.201088 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="33896813-d990-4343-8b09-55a064b81ba3" containerName="kube-rbac-proxy" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.201096 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kserve-container" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.201103 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="33896813-d990-4343-8b09-55a064b81ba3" 
containerName="kserve-container" Apr 20 20:55:19.202909 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.201113 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7f7d20c-da73-41a9-9168-87119a84f4be" containerName="kube-rbac-proxy" Apr 20 20:55:19.204071 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.204050 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.206476 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.206452 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" Apr 20 20:55:19.206476 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.206473 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\"" Apr 20 20:55:19.206668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.206473 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 20:55:19.206668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.206455 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 20 20:55:19.206668 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.206474 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" Apr 20 20:55:19.214364 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.214345 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44"] Apr 20 20:55:19.285650 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.285627 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/467ea2f4-7e8b-46de-8a70-99185fa72d83-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-6wj44\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.285758 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.285659 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/467ea2f4-7e8b-46de-8a70-99185fa72d83-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-6wj44\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.285758 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.285721 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjjlx\" (UniqueName: \"kubernetes.io/projected/467ea2f4-7e8b-46de-8a70-99185fa72d83-kube-api-access-gjjlx\") pod \"isvc-tensorflow-predictor-6756f669d7-6wj44\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.285758 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.285751 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/467ea2f4-7e8b-46de-8a70-99185fa72d83-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-6wj44\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.386496 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.386469 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjjlx\" (UniqueName: 
\"kubernetes.io/projected/467ea2f4-7e8b-46de-8a70-99185fa72d83-kube-api-access-gjjlx\") pod \"isvc-tensorflow-predictor-6756f669d7-6wj44\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.386666 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.386504 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/467ea2f4-7e8b-46de-8a70-99185fa72d83-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-6wj44\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.386666 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.386571 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/467ea2f4-7e8b-46de-8a70-99185fa72d83-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-6wj44\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.386666 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.386589 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/467ea2f4-7e8b-46de-8a70-99185fa72d83-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-6wj44\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.386964 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.386943 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/467ea2f4-7e8b-46de-8a70-99185fa72d83-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-6wj44\" (UID: 
\"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.387201 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.387183 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/467ea2f4-7e8b-46de-8a70-99185fa72d83-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-6wj44\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.389083 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.389063 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/467ea2f4-7e8b-46de-8a70-99185fa72d83-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-6wj44\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.394075 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.394053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjjlx\" (UniqueName: \"kubernetes.io/projected/467ea2f4-7e8b-46de-8a70-99185fa72d83-kube-api-access-gjjlx\") pod \"isvc-tensorflow-predictor-6756f669d7-6wj44\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.515148 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.515082 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:19.639548 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.639525 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44"] Apr 20 20:55:19.642139 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:55:19.642112 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod467ea2f4_7e8b_46de_8a70_99185fa72d83.slice/crio-2450dfda460cb44e17044a5979cbaff051748e41bf898a50025bdbc372996df1 WatchSource:0}: Error finding container 2450dfda460cb44e17044a5979cbaff051748e41bf898a50025bdbc372996df1: Status 404 returned error can't find the container with id 2450dfda460cb44e17044a5979cbaff051748e41bf898a50025bdbc372996df1 Apr 20 20:55:19.643989 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:19.643974 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:55:20.614380 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:20.614345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" event={"ID":"467ea2f4-7e8b-46de-8a70-99185fa72d83","Type":"ContainerStarted","Data":"1677ae91b9423f3f0c2416d72088d14bd8464244ca189f645b8affccf9c537e3"} Apr 20 20:55:20.614380 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:20.614386 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" event={"ID":"467ea2f4-7e8b-46de-8a70-99185fa72d83","Type":"ContainerStarted","Data":"2450dfda460cb44e17044a5979cbaff051748e41bf898a50025bdbc372996df1"} Apr 20 20:55:24.627742 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:24.627708 2574 generic.go:358] "Generic (PLEG): container finished" podID="467ea2f4-7e8b-46de-8a70-99185fa72d83" 
containerID="1677ae91b9423f3f0c2416d72088d14bd8464244ca189f645b8affccf9c537e3" exitCode=0 Apr 20 20:55:24.628134 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:24.627755 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" event={"ID":"467ea2f4-7e8b-46de-8a70-99185fa72d83","Type":"ContainerDied","Data":"1677ae91b9423f3f0c2416d72088d14bd8464244ca189f645b8affccf9c537e3"} Apr 20 20:55:28.643864 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:28.643777 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" event={"ID":"467ea2f4-7e8b-46de-8a70-99185fa72d83","Type":"ContainerStarted","Data":"e12c9cde98df5f678b9120ed67dd095872d2e29249918edf242dc0dc36f288ec"} Apr 20 20:55:28.643864 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:28.643818 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" event={"ID":"467ea2f4-7e8b-46de-8a70-99185fa72d83","Type":"ContainerStarted","Data":"ce2a69bba9741776fc944083a697dd756c7a47aa868f6de9295511b8d38169af"} Apr 20 20:55:28.644341 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:28.644115 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:28.649700 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:28.644607 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:28.650169 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:28.650132 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 20 
20:55:28.667694 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:28.667639 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" podStartSLOduration=5.995768151 podStartE2EDuration="9.667625861s" podCreationTimestamp="2026-04-20 20:55:19 +0000 UTC" firstStartedPulling="2026-04-20 20:55:24.6289346 +0000 UTC m=+2979.737990286" lastFinishedPulling="2026-04-20 20:55:28.300792296 +0000 UTC m=+2983.409847996" observedRunningTime="2026-04-20 20:55:28.665201041 +0000 UTC m=+2983.774256760" watchObservedRunningTime="2026-04-20 20:55:28.667625861 +0000 UTC m=+2983.776681568" Apr 20 20:55:29.646941 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:29.646900 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 20 20:55:34.650893 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:34.650867 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:34.651352 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:34.651327 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 20 20:55:44.651543 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:44.651517 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:55:48.485829 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:48.485798 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:55:48.489235 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:48.489212 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 20:55:48.492131 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:48.492108 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:55:48.495440 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:55:48.495425 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 20:56:00.530053 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:00.530020 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44"] Apr 20 20:56:00.530700 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:00.530365 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kserve-container" containerID="cri-o://ce2a69bba9741776fc944083a697dd756c7a47aa868f6de9295511b8d38169af" gracePeriod=30 Apr 20 20:56:00.530700 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:00.530420 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kube-rbac-proxy" containerID="cri-o://e12c9cde98df5f678b9120ed67dd095872d2e29249918edf242dc0dc36f288ec" gracePeriod=30 Apr 20 20:56:00.743833 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:00.743803 
2574 generic.go:358] "Generic (PLEG): container finished" podID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerID="e12c9cde98df5f678b9120ed67dd095872d2e29249918edf242dc0dc36f288ec" exitCode=2 Apr 20 20:56:00.743990 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:00.743866 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" event={"ID":"467ea2f4-7e8b-46de-8a70-99185fa72d83","Type":"ContainerDied","Data":"e12c9cde98df5f678b9120ed67dd095872d2e29249918edf242dc0dc36f288ec"} Apr 20 20:56:04.647989 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:04.647928 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 20 20:56:09.647858 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:09.647818 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 20 20:56:14.648061 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:14.648018 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 20 20:56:14.648445 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:14.648139 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:56:19.648186 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:19.648147 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 20 20:56:24.647543 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:24.647500 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 20 20:56:29.647556 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:29.647515 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 20 20:56:30.834164 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:30.834075 2574 generic.go:358] "Generic (PLEG): container finished" podID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerID="ce2a69bba9741776fc944083a697dd756c7a47aa868f6de9295511b8d38169af" exitCode=137 Apr 20 20:56:30.834164 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:30.834154 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" event={"ID":"467ea2f4-7e8b-46de-8a70-99185fa72d83","Type":"ContainerDied","Data":"ce2a69bba9741776fc944083a697dd756c7a47aa868f6de9295511b8d38169af"} Apr 20 20:56:31.175814 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:56:31.175789 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:56:31.270953 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.270922 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjjlx\" (UniqueName: \"kubernetes.io/projected/467ea2f4-7e8b-46de-8a70-99185fa72d83-kube-api-access-gjjlx\") pod \"467ea2f4-7e8b-46de-8a70-99185fa72d83\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " Apr 20 20:56:31.271111 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.271011 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/467ea2f4-7e8b-46de-8a70-99185fa72d83-kserve-provision-location\") pod \"467ea2f4-7e8b-46de-8a70-99185fa72d83\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " Apr 20 20:56:31.271111 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.271030 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/467ea2f4-7e8b-46de-8a70-99185fa72d83-proxy-tls\") pod \"467ea2f4-7e8b-46de-8a70-99185fa72d83\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " Apr 20 20:56:31.271111 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.271061 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/467ea2f4-7e8b-46de-8a70-99185fa72d83-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"467ea2f4-7e8b-46de-8a70-99185fa72d83\" (UID: \"467ea2f4-7e8b-46de-8a70-99185fa72d83\") " Apr 20 20:56:31.271501 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.271465 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467ea2f4-7e8b-46de-8a70-99185fa72d83-isvc-tensorflow-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "467ea2f4-7e8b-46de-8a70-99185fa72d83" (UID: "467ea2f4-7e8b-46de-8a70-99185fa72d83"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:56:31.273149 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.273119 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/467ea2f4-7e8b-46de-8a70-99185fa72d83-kube-api-access-gjjlx" (OuterVolumeSpecName: "kube-api-access-gjjlx") pod "467ea2f4-7e8b-46de-8a70-99185fa72d83" (UID: "467ea2f4-7e8b-46de-8a70-99185fa72d83"). InnerVolumeSpecName "kube-api-access-gjjlx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:56:31.273248 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.273157 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467ea2f4-7e8b-46de-8a70-99185fa72d83-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "467ea2f4-7e8b-46de-8a70-99185fa72d83" (UID: "467ea2f4-7e8b-46de-8a70-99185fa72d83"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:56:31.281941 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.281912 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/467ea2f4-7e8b-46de-8a70-99185fa72d83-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "467ea2f4-7e8b-46de-8a70-99185fa72d83" (UID: "467ea2f4-7e8b-46de-8a70-99185fa72d83"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:56:31.372128 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.372058 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/467ea2f4-7e8b-46de-8a70-99185fa72d83-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:56:31.372128 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.372085 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/467ea2f4-7e8b-46de-8a70-99185fa72d83-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:56:31.372128 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.372097 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/467ea2f4-7e8b-46de-8a70-99185fa72d83-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:56:31.372128 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.372107 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjjlx\" (UniqueName: \"kubernetes.io/projected/467ea2f4-7e8b-46de-8a70-99185fa72d83-kube-api-access-gjjlx\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:56:31.844210 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.844171 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" event={"ID":"467ea2f4-7e8b-46de-8a70-99185fa72d83","Type":"ContainerDied","Data":"2450dfda460cb44e17044a5979cbaff051748e41bf898a50025bdbc372996df1"} Apr 20 20:56:31.844799 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.844223 2574 scope.go:117] "RemoveContainer" containerID="e12c9cde98df5f678b9120ed67dd095872d2e29249918edf242dc0dc36f288ec" Apr 20 20:56:31.844799 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:56:31.844188 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44" Apr 20 20:56:31.853020 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.853004 2574 scope.go:117] "RemoveContainer" containerID="ce2a69bba9741776fc944083a697dd756c7a47aa868f6de9295511b8d38169af" Apr 20 20:56:31.860056 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.860032 2574 scope.go:117] "RemoveContainer" containerID="1677ae91b9423f3f0c2416d72088d14bd8464244ca189f645b8affccf9c537e3" Apr 20 20:56:31.862948 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.862911 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44"] Apr 20 20:56:31.865525 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:31.865504 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-6wj44"] Apr 20 20:56:33.482779 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:33.482746 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" path="/var/lib/kubelet/pods/467ea2f4-7e8b-46de-8a70-99185fa72d83/volumes" Apr 20 20:56:40.657823 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.657793 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66"] Apr 20 20:56:40.658201 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.658126 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kube-rbac-proxy" Apr 20 20:56:40.658201 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.658140 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kube-rbac-proxy" Apr 20 20:56:40.658201 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.658158 2574 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="storage-initializer" Apr 20 20:56:40.658201 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.658165 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="storage-initializer" Apr 20 20:56:40.658201 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.658171 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kserve-container" Apr 20 20:56:40.658201 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.658177 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kserve-container" Apr 20 20:56:40.658475 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.658238 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kserve-container" Apr 20 20:56:40.658475 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.658250 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="467ea2f4-7e8b-46de-8a70-99185fa72d83" containerName="kube-rbac-proxy" Apr 20 20:56:40.661623 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.661607 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:40.664043 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.664020 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 20 20:56:40.664043 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.664044 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\"" Apr 20 20:56:40.664253 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.664045 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 20 20:56:40.664253 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.664165 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 20 20:56:40.664802 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.664786 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 20:56:40.672065 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.672035 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66"] Apr 20 20:56:40.749728 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.749697 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b78a9633-9237-4047-b26e-d20d0d14b619-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:40.749876 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.749750 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b78a9633-9237-4047-b26e-d20d0d14b619-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:40.749876 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.749778 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j22jx\" (UniqueName: \"kubernetes.io/projected/b78a9633-9237-4047-b26e-d20d0d14b619-kube-api-access-j22jx\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:40.749876 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.749829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b78a9633-9237-4047-b26e-d20d0d14b619-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:40.850720 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.850686 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b78a9633-9237-4047-b26e-d20d0d14b619-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:40.850855 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.850742 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b78a9633-9237-4047-b26e-d20d0d14b619-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:40.850855 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.850772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j22jx\" (UniqueName: \"kubernetes.io/projected/b78a9633-9237-4047-b26e-d20d0d14b619-kube-api-access-j22jx\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:40.850855 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.850795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b78a9633-9237-4047-b26e-d20d0d14b619-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:40.850999 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:56:40.850911 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-triton-predictor-serving-cert: secret "isvc-triton-predictor-serving-cert" not found Apr 20 20:56:40.850999 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:56:40.850974 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78a9633-9237-4047-b26e-d20d0d14b619-proxy-tls podName:b78a9633-9237-4047-b26e-d20d0d14b619 nodeName:}" failed. No retries permitted until 2026-04-20 20:56:41.350952928 +0000 UTC m=+3056.460008613 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b78a9633-9237-4047-b26e-d20d0d14b619-proxy-tls") pod "isvc-triton-predictor-84bb65d94b-c9v66" (UID: "b78a9633-9237-4047-b26e-d20d0d14b619") : secret "isvc-triton-predictor-serving-cert" not found Apr 20 20:56:40.851181 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.851162 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b78a9633-9237-4047-b26e-d20d0d14b619-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:40.851394 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.851374 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b78a9633-9237-4047-b26e-d20d0d14b619-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:40.859249 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:40.859227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j22jx\" (UniqueName: \"kubernetes.io/projected/b78a9633-9237-4047-b26e-d20d0d14b619-kube-api-access-j22jx\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:41.355318 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:41.355285 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b78a9633-9237-4047-b26e-d20d0d14b619-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: 
\"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:41.355480 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:56:41.355366 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-triton-predictor-serving-cert: secret "isvc-triton-predictor-serving-cert" not found Apr 20 20:56:41.355480 ip-10-0-139-59 kubenswrapper[2574]: E0420 20:56:41.355438 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78a9633-9237-4047-b26e-d20d0d14b619-proxy-tls podName:b78a9633-9237-4047-b26e-d20d0d14b619 nodeName:}" failed. No retries permitted until 2026-04-20 20:56:42.355420513 +0000 UTC m=+3057.464476198 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b78a9633-9237-4047-b26e-d20d0d14b619-proxy-tls") pod "isvc-triton-predictor-84bb65d94b-c9v66" (UID: "b78a9633-9237-4047-b26e-d20d0d14b619") : secret "isvc-triton-predictor-serving-cert" not found Apr 20 20:56:42.364251 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:42.364209 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b78a9633-9237-4047-b26e-d20d0d14b619-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:42.366564 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:42.366547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b78a9633-9237-4047-b26e-d20d0d14b619-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-c9v66\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:42.472770 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:42.472738 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:56:42.591630 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:42.591587 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66"] Apr 20 20:56:42.594551 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:56:42.594522 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb78a9633_9237_4047_b26e_d20d0d14b619.slice/crio-226af4bc640913deb5b1dc630889c368d85023cc4901f56306e3431e61699cd0 WatchSource:0}: Error finding container 226af4bc640913deb5b1dc630889c368d85023cc4901f56306e3431e61699cd0: Status 404 returned error can't find the container with id 226af4bc640913deb5b1dc630889c368d85023cc4901f56306e3431e61699cd0 Apr 20 20:56:42.877633 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:42.877592 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" event={"ID":"b78a9633-9237-4047-b26e-d20d0d14b619","Type":"ContainerStarted","Data":"19af1a7289a385a016a78e4ab62f86d4d4eb3d182089210705c0f26b35066afd"} Apr 20 20:56:42.877633 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:42.877633 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" event={"ID":"b78a9633-9237-4047-b26e-d20d0d14b619","Type":"ContainerStarted","Data":"226af4bc640913deb5b1dc630889c368d85023cc4901f56306e3431e61699cd0"} Apr 20 20:56:46.889924 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:46.889843 2574 generic.go:358] "Generic (PLEG): container finished" podID="b78a9633-9237-4047-b26e-d20d0d14b619" containerID="19af1a7289a385a016a78e4ab62f86d4d4eb3d182089210705c0f26b35066afd" exitCode=0 Apr 20 20:56:46.890256 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:56:46.889921 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" event={"ID":"b78a9633-9237-4047-b26e-d20d0d14b619","Type":"ContainerDied","Data":"19af1a7289a385a016a78e4ab62f86d4d4eb3d182089210705c0f26b35066afd"} Apr 20 20:58:42.306427 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:42.306393 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" event={"ID":"b78a9633-9237-4047-b26e-d20d0d14b619","Type":"ContainerStarted","Data":"688ad108f57eaee27fbce5d265e4fe9c364d24ecb058ced9a45f47ba4efce3e1"} Apr 20 20:58:42.306427 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:42.306429 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" event={"ID":"b78a9633-9237-4047-b26e-d20d0d14b619","Type":"ContainerStarted","Data":"dbcaad0658fafb7da4862e2222702e4570dd1dc3bc58d13eca9f89ab80128d33"} Apr 20 20:58:42.306972 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:42.306639 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:58:42.306972 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:42.306733 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:58:42.308050 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:42.308028 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 20 20:58:42.333743 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:42.333695 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" podStartSLOduration=7.778363549 
podStartE2EDuration="2m2.333684039s" podCreationTimestamp="2026-04-20 20:56:40 +0000 UTC" firstStartedPulling="2026-04-20 20:56:46.890980771 +0000 UTC m=+3062.000036460" lastFinishedPulling="2026-04-20 20:58:41.446301262 +0000 UTC m=+3176.555356950" observedRunningTime="2026-04-20 20:58:42.333046254 +0000 UTC m=+3177.442101960" watchObservedRunningTime="2026-04-20 20:58:42.333684039 +0000 UTC m=+3177.442739746" Apr 20 20:58:43.309187 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:43.309145 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 20 20:58:48.313143 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:48.313115 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:58:48.313950 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:48.313931 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:58:53.461864 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.461814 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66"] Apr 20 20:58:53.462869 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.462817 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" containerName="kserve-container" containerID="cri-o://dbcaad0658fafb7da4862e2222702e4570dd1dc3bc58d13eca9f89ab80128d33" gracePeriod=30 Apr 20 20:58:53.463551 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.462897 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" containerName="kube-rbac-proxy" containerID="cri-o://688ad108f57eaee27fbce5d265e4fe9c364d24ecb058ced9a45f47ba4efce3e1" gracePeriod=30 Apr 20 20:58:53.553169 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.553134 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr"] Apr 20 20:58:53.574615 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.574573 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr"] Apr 20 20:58:53.574764 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.574639 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.577218 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.577194 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\"" Apr 20 20:58:53.577218 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.577211 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\"" Apr 20 20:58:53.659430 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.659396 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5d42ea2-4c1a-4803-85c0-22c985ee505f-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-bqkjr\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.659430 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.659430 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5d42ea2-4c1a-4803-85c0-22c985ee505f-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-bqkjr\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.659631 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.659458 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5d42ea2-4c1a-4803-85c0-22c985ee505f-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-bqkjr\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.659631 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.659492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxsfm\" (UniqueName: \"kubernetes.io/projected/b5d42ea2-4c1a-4803-85c0-22c985ee505f-kube-api-access-cxsfm\") pod \"isvc-xgboost-predictor-8689c4cfcc-bqkjr\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.760570 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.760484 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5d42ea2-4c1a-4803-85c0-22c985ee505f-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-bqkjr\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.760570 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.760523 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b5d42ea2-4c1a-4803-85c0-22c985ee505f-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-bqkjr\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.760811 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.760643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5d42ea2-4c1a-4803-85c0-22c985ee505f-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-bqkjr\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.760811 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.760697 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxsfm\" (UniqueName: \"kubernetes.io/projected/b5d42ea2-4c1a-4803-85c0-22c985ee505f-kube-api-access-cxsfm\") pod \"isvc-xgboost-predictor-8689c4cfcc-bqkjr\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.760931 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.760865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5d42ea2-4c1a-4803-85c0-22c985ee505f-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-bqkjr\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.761199 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.761180 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5d42ea2-4c1a-4803-85c0-22c985ee505f-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-bqkjr\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.763150 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.763128 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5d42ea2-4c1a-4803-85c0-22c985ee505f-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-bqkjr\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.768189 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.768169 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxsfm\" (UniqueName: \"kubernetes.io/projected/b5d42ea2-4c1a-4803-85c0-22c985ee505f-kube-api-access-cxsfm\") pod \"isvc-xgboost-predictor-8689c4cfcc-bqkjr\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:53.885245 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:53.885203 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:58:54.143236 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:54.143192 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr"] Apr 20 20:58:54.146669 ip-10-0-139-59 kubenswrapper[2574]: W0420 20:58:54.146622 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5d42ea2_4c1a_4803_85c0_22c985ee505f.slice/crio-49c6edca96125035a6a0caff1f01d719433fc8ce6976028e5625704848343c98 WatchSource:0}: Error finding container 49c6edca96125035a6a0caff1f01d719433fc8ce6976028e5625704848343c98: Status 404 returned error can't find the container with id 49c6edca96125035a6a0caff1f01d719433fc8ce6976028e5625704848343c98 Apr 20 20:58:54.343772 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:54.343643 2574 generic.go:358] "Generic (PLEG): container finished" podID="b78a9633-9237-4047-b26e-d20d0d14b619" containerID="688ad108f57eaee27fbce5d265e4fe9c364d24ecb058ced9a45f47ba4efce3e1" exitCode=2 Apr 20 20:58:54.343772 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:54.343725 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" event={"ID":"b78a9633-9237-4047-b26e-d20d0d14b619","Type":"ContainerDied","Data":"688ad108f57eaee27fbce5d265e4fe9c364d24ecb058ced9a45f47ba4efce3e1"} Apr 20 20:58:54.345051 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:54.345030 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" event={"ID":"b5d42ea2-4c1a-4803-85c0-22c985ee505f","Type":"ContainerStarted","Data":"b238729a994011452873d8c21b137e8f0691be8de8c66d46af010901d24eb42d"} Apr 20 20:58:54.345162 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:54.345055 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" event={"ID":"b5d42ea2-4c1a-4803-85c0-22c985ee505f","Type":"ContainerStarted","Data":"49c6edca96125035a6a0caff1f01d719433fc8ce6976028e5625704848343c98"} Apr 20 20:58:56.353210 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.353164 2574 generic.go:358] "Generic (PLEG): container finished" podID="b78a9633-9237-4047-b26e-d20d0d14b619" containerID="dbcaad0658fafb7da4862e2222702e4570dd1dc3bc58d13eca9f89ab80128d33" exitCode=0 Apr 20 20:58:56.353594 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.353220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" event={"ID":"b78a9633-9237-4047-b26e-d20d0d14b619","Type":"ContainerDied","Data":"dbcaad0658fafb7da4862e2222702e4570dd1dc3bc58d13eca9f89ab80128d33"} Apr 20 20:58:56.438585 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.438513 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:58:56.586777 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.586742 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b78a9633-9237-4047-b26e-d20d0d14b619-kserve-provision-location\") pod \"b78a9633-9237-4047-b26e-d20d0d14b619\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " Apr 20 20:58:56.586949 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.586785 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b78a9633-9237-4047-b26e-d20d0d14b619-proxy-tls\") pod \"b78a9633-9237-4047-b26e-d20d0d14b619\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " Apr 20 20:58:56.586949 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.586826 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-j22jx\" (UniqueName: \"kubernetes.io/projected/b78a9633-9237-4047-b26e-d20d0d14b619-kube-api-access-j22jx\") pod \"b78a9633-9237-4047-b26e-d20d0d14b619\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " Apr 20 20:58:56.586949 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.586877 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b78a9633-9237-4047-b26e-d20d0d14b619-isvc-triton-kube-rbac-proxy-sar-config\") pod \"b78a9633-9237-4047-b26e-d20d0d14b619\" (UID: \"b78a9633-9237-4047-b26e-d20d0d14b619\") " Apr 20 20:58:56.587244 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.587128 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b78a9633-9237-4047-b26e-d20d0d14b619-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b78a9633-9237-4047-b26e-d20d0d14b619" (UID: "b78a9633-9237-4047-b26e-d20d0d14b619"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:58:56.587381 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.587311 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b78a9633-9237-4047-b26e-d20d0d14b619-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod "b78a9633-9237-4047-b26e-d20d0d14b619" (UID: "b78a9633-9237-4047-b26e-d20d0d14b619"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:58:56.589097 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.589075 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b78a9633-9237-4047-b26e-d20d0d14b619-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b78a9633-9237-4047-b26e-d20d0d14b619" (UID: "b78a9633-9237-4047-b26e-d20d0d14b619"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:58:56.589170 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.589122 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b78a9633-9237-4047-b26e-d20d0d14b619-kube-api-access-j22jx" (OuterVolumeSpecName: "kube-api-access-j22jx") pod "b78a9633-9237-4047-b26e-d20d0d14b619" (UID: "b78a9633-9237-4047-b26e-d20d0d14b619"). InnerVolumeSpecName "kube-api-access-j22jx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:58:56.687697 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.687658 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b78a9633-9237-4047-b26e-d20d0d14b619-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:58:56.687697 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.687693 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b78a9633-9237-4047-b26e-d20d0d14b619-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:58:56.687697 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.687704 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b78a9633-9237-4047-b26e-d20d0d14b619-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:58:56.687919 
ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:56.687715 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j22jx\" (UniqueName: \"kubernetes.io/projected/b78a9633-9237-4047-b26e-d20d0d14b619-kube-api-access-j22jx\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 20:58:57.358003 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:57.357968 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" event={"ID":"b78a9633-9237-4047-b26e-d20d0d14b619","Type":"ContainerDied","Data":"226af4bc640913deb5b1dc630889c368d85023cc4901f56306e3431e61699cd0"} Apr 20 20:58:57.358420 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:57.358016 2574 scope.go:117] "RemoveContainer" containerID="688ad108f57eaee27fbce5d265e4fe9c364d24ecb058ced9a45f47ba4efce3e1" Apr 20 20:58:57.358420 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:57.358102 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66" Apr 20 20:58:57.366719 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:57.366698 2574 scope.go:117] "RemoveContainer" containerID="dbcaad0658fafb7da4862e2222702e4570dd1dc3bc58d13eca9f89ab80128d33" Apr 20 20:58:57.373675 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:57.373660 2574 scope.go:117] "RemoveContainer" containerID="19af1a7289a385a016a78e4ab62f86d4d4eb3d182089210705c0f26b35066afd" Apr 20 20:58:57.380156 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:57.380134 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66"] Apr 20 20:58:57.383731 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:57.383709 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-c9v66"] Apr 20 20:58:57.482842 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:57.482814 2574 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" path="/var/lib/kubelet/pods/b78a9633-9237-4047-b26e-d20d0d14b619/volumes" Apr 20 20:58:58.362580 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:58.362550 2574 generic.go:358] "Generic (PLEG): container finished" podID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerID="b238729a994011452873d8c21b137e8f0691be8de8c66d46af010901d24eb42d" exitCode=0 Apr 20 20:58:58.362982 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:58:58.362627 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" event={"ID":"b5d42ea2-4c1a-4803-85c0-22c985ee505f","Type":"ContainerDied","Data":"b238729a994011452873d8c21b137e8f0691be8de8c66d46af010901d24eb42d"} Apr 20 20:59:19.440159 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:59:19.440123 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" event={"ID":"b5d42ea2-4c1a-4803-85c0-22c985ee505f","Type":"ContainerStarted","Data":"a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db"} Apr 20 20:59:19.440159 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:59:19.440167 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" event={"ID":"b5d42ea2-4c1a-4803-85c0-22c985ee505f","Type":"ContainerStarted","Data":"92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7"} Apr 20 20:59:19.440655 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:59:19.440382 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:59:19.461950 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:59:19.461897 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" podStartSLOduration=6.244729447 podStartE2EDuration="26.461879359s" 
podCreationTimestamp="2026-04-20 20:58:53 +0000 UTC" firstStartedPulling="2026-04-20 20:58:58.363944749 +0000 UTC m=+3193.473000437" lastFinishedPulling="2026-04-20 20:59:18.581094664 +0000 UTC m=+3213.690150349" observedRunningTime="2026-04-20 20:59:19.459664945 +0000 UTC m=+3214.568720652" watchObservedRunningTime="2026-04-20 20:59:19.461879359 +0000 UTC m=+3214.570935066" Apr 20 20:59:20.443279 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:59:20.443228 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:59:20.444392 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:59:20.444361 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 20 20:59:21.446070 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:59:21.446032 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 20 20:59:26.450870 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:59:26.450840 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 20:59:26.451460 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:59:26.451433 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 20 20:59:36.451737 ip-10-0-139-59 kubenswrapper[2574]: I0420 
20:59:36.451695 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 20 20:59:46.452225 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:59:46.452182 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 20 20:59:56.451571 ip-10-0-139-59 kubenswrapper[2574]: I0420 20:59:56.451534 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 20 21:00:06.452093 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:06.452055 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 20 21:00:16.452064 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:16.452032 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 21:00:23.665294 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:23.665064 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr"] Apr 20 21:00:23.666840 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:23.665506 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kserve-container" containerID="cri-o://92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7" gracePeriod=30 Apr 20 21:00:23.666840 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:23.665587 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kube-rbac-proxy" containerID="cri-o://a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db" gracePeriod=30 Apr 20 21:00:24.634118 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:24.634084 2574 generic.go:358] "Generic (PLEG): container finished" podID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerID="a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db" exitCode=2 Apr 20 21:00:24.634312 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:24.634164 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" event={"ID":"b5d42ea2-4c1a-4803-85c0-22c985ee505f","Type":"ContainerDied","Data":"a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db"} Apr 20 21:00:26.446665 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:26.446625 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": dial tcp 10.134.0.39:8643: connect: connection refused" Apr 20 21:00:26.451671 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:26.451648 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.134.0.39:8080: connect: connection refused" Apr 20 21:00:27.049670 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.049641 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 21:00:27.051872 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.051858 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5d42ea2-4c1a-4803-85c0-22c985ee505f-proxy-tls\") pod \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " Apr 20 21:00:27.051930 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.051883 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxsfm\" (UniqueName: \"kubernetes.io/projected/b5d42ea2-4c1a-4803-85c0-22c985ee505f-kube-api-access-cxsfm\") pod \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " Apr 20 21:00:27.051930 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.051920 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5d42ea2-4c1a-4803-85c0-22c985ee505f-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " Apr 20 21:00:27.051997 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.051938 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5d42ea2-4c1a-4803-85c0-22c985ee505f-kserve-provision-location\") pod \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\" (UID: \"b5d42ea2-4c1a-4803-85c0-22c985ee505f\") " Apr 20 21:00:27.052314 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.052280 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b5d42ea2-4c1a-4803-85c0-22c985ee505f-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "b5d42ea2-4c1a-4803-85c0-22c985ee505f" (UID: "b5d42ea2-4c1a-4803-85c0-22c985ee505f"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:00:27.052314 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.052304 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d42ea2-4c1a-4803-85c0-22c985ee505f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5d42ea2-4c1a-4803-85c0-22c985ee505f" (UID: "b5d42ea2-4c1a-4803-85c0-22c985ee505f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:00:27.054002 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.053977 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5d42ea2-4c1a-4803-85c0-22c985ee505f-kube-api-access-cxsfm" (OuterVolumeSpecName: "kube-api-access-cxsfm") pod "b5d42ea2-4c1a-4803-85c0-22c985ee505f" (UID: "b5d42ea2-4c1a-4803-85c0-22c985ee505f"). InnerVolumeSpecName "kube-api-access-cxsfm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:00:27.054180 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.054164 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d42ea2-4c1a-4803-85c0-22c985ee505f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b5d42ea2-4c1a-4803-85c0-22c985ee505f" (UID: "b5d42ea2-4c1a-4803-85c0-22c985ee505f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:00:27.153111 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.153079 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5d42ea2-4c1a-4803-85c0-22c985ee505f-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 21:00:27.153111 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.153106 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cxsfm\" (UniqueName: \"kubernetes.io/projected/b5d42ea2-4c1a-4803-85c0-22c985ee505f-kube-api-access-cxsfm\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 21:00:27.153111 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.153116 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5d42ea2-4c1a-4803-85c0-22c985ee505f-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 21:00:27.153447 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.153126 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5d42ea2-4c1a-4803-85c0-22c985ee505f-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 21:00:27.644949 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.644915 2574 generic.go:358] "Generic (PLEG): container finished" podID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerID="92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7" exitCode=0 Apr 20 21:00:27.645431 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.645005 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" Apr 20 21:00:27.645431 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.645003 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" event={"ID":"b5d42ea2-4c1a-4803-85c0-22c985ee505f","Type":"ContainerDied","Data":"92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7"} Apr 20 21:00:27.645431 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.645118 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr" event={"ID":"b5d42ea2-4c1a-4803-85c0-22c985ee505f","Type":"ContainerDied","Data":"49c6edca96125035a6a0caff1f01d719433fc8ce6976028e5625704848343c98"} Apr 20 21:00:27.645431 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.645133 2574 scope.go:117] "RemoveContainer" containerID="a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db" Apr 20 21:00:27.653211 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.653194 2574 scope.go:117] "RemoveContainer" containerID="92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7" Apr 20 21:00:27.663314 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.663291 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr"] Apr 20 21:00:27.663811 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.663796 2574 scope.go:117] "RemoveContainer" containerID="b238729a994011452873d8c21b137e8f0691be8de8c66d46af010901d24eb42d" Apr 20 21:00:27.666680 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.666660 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-bqkjr"] Apr 20 21:00:27.670938 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.670925 2574 scope.go:117] "RemoveContainer" containerID="a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db" Apr 20 
21:00:27.671174 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:00:27.671153 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db\": container with ID starting with a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db not found: ID does not exist" containerID="a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db" Apr 20 21:00:27.671241 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.671183 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db"} err="failed to get container status \"a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db\": rpc error: code = NotFound desc = could not find container \"a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db\": container with ID starting with a30ef052882ed5ccde512f559b105ef53259aa3e2a8d37e103ec72b082fb45db not found: ID does not exist" Apr 20 21:00:27.671241 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.671201 2574 scope.go:117] "RemoveContainer" containerID="92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7" Apr 20 21:00:27.671508 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:00:27.671491 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7\": container with ID starting with 92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7 not found: ID does not exist" containerID="92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7" Apr 20 21:00:27.671560 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.671514 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7"} err="failed to get container status \"92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7\": rpc error: code = NotFound desc = could not find container \"92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7\": container with ID starting with 92d827f6d887abe5b3e86061be3db61a9775179ec21aa0b8d52a7ba9471477d7 not found: ID does not exist" Apr 20 21:00:27.671560 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.671530 2574 scope.go:117] "RemoveContainer" containerID="b238729a994011452873d8c21b137e8f0691be8de8c66d46af010901d24eb42d" Apr 20 21:00:27.671749 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:00:27.671733 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b238729a994011452873d8c21b137e8f0691be8de8c66d46af010901d24eb42d\": container with ID starting with b238729a994011452873d8c21b137e8f0691be8de8c66d46af010901d24eb42d not found: ID does not exist" containerID="b238729a994011452873d8c21b137e8f0691be8de8c66d46af010901d24eb42d" Apr 20 21:00:27.671790 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:27.671754 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b238729a994011452873d8c21b137e8f0691be8de8c66d46af010901d24eb42d"} err="failed to get container status \"b238729a994011452873d8c21b137e8f0691be8de8c66d46af010901d24eb42d\": rpc error: code = NotFound desc = could not find container \"b238729a994011452873d8c21b137e8f0691be8de8c66d46af010901d24eb42d\": container with ID starting with b238729a994011452873d8c21b137e8f0691be8de8c66d46af010901d24eb42d not found: ID does not exist" Apr 20 21:00:29.483427 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:29.483394 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" 
path="/var/lib/kubelet/pods/b5d42ea2-4c1a-4803-85c0-22c985ee505f/volumes" Apr 20 21:00:48.508638 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:48.508610 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 21:00:48.513411 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:48.513389 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 21:00:48.515526 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:48.515505 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 21:00:48.519544 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:00:48.519526 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 21:02:04.085538 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085503 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"] Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085832 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" containerName="kserve-container" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085842 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" containerName="kserve-container" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085851 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" containerName="kube-rbac-proxy" Apr 
20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085856 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" containerName="kube-rbac-proxy" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085868 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="storage-initializer" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085874 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="storage-initializer" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085881 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kserve-container" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085886 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kserve-container" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085899 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" containerName="storage-initializer" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085905 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" containerName="storage-initializer" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085915 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kube-rbac-proxy" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085920 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" 
containerName="kube-rbac-proxy" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085966 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kube-rbac-proxy" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085977 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5d42ea2-4c1a-4803-85c0-22c985ee505f" containerName="kserve-container" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085984 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" containerName="kserve-container" Apr 20 21:02:04.086003 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.085990 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b78a9633-9237-4047-b26e-d20d0d14b619" containerName="kube-rbac-proxy" Apr 20 21:02:04.088253 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.088233 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" Apr 20 21:02:04.091165 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.091133 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\"" Apr 20 21:02:04.091339 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.091318 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 20 21:02:04.091536 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.091520 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\"" Apr 20 21:02:04.091593 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.091564 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 21:02:04.091685 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.091667 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\"" Apr 20 21:02:04.099295 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.099275 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"] Apr 20 21:02:04.127631 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.127600 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c06bbd04-b229-4706-9ecc-517a50b28c39-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" Apr 20 21:02:04.127774 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.127636 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvjdg\" (UniqueName: \"kubernetes.io/projected/c06bbd04-b229-4706-9ecc-517a50b28c39-kube-api-access-mvjdg\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" Apr 20 21:02:04.127774 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.127746 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c06bbd04-b229-4706-9ecc-517a50b28c39-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" Apr 20 21:02:04.127857 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.127798 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c06bbd04-b229-4706-9ecc-517a50b28c39-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" Apr 20 21:02:04.228592 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.228553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c06bbd04-b229-4706-9ecc-517a50b28c39-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" Apr 20 21:02:04.228592 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.228594 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-mvjdg\" (UniqueName: \"kubernetes.io/projected/c06bbd04-b229-4706-9ecc-517a50b28c39-kube-api-access-mvjdg\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" Apr 20 21:02:04.228859 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.228647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c06bbd04-b229-4706-9ecc-517a50b28c39-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" Apr 20 21:02:04.228859 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.228683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c06bbd04-b229-4706-9ecc-517a50b28c39-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" Apr 20 21:02:04.228859 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:02:04.228717 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-serving-cert: secret "isvc-xgboost-runtime-predictor-serving-cert" not found Apr 20 21:02:04.228859 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:02:04.228797 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c06bbd04-b229-4706-9ecc-517a50b28c39-proxy-tls podName:c06bbd04-b229-4706-9ecc-517a50b28c39 nodeName:}" failed. No retries permitted until 2026-04-20 21:02:04.728776066 +0000 UTC m=+3379.837831752 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c06bbd04-b229-4706-9ecc-517a50b28c39-proxy-tls") pod "isvc-xgboost-runtime-predictor-779db84d9-f7qk9" (UID: "c06bbd04-b229-4706-9ecc-517a50b28c39") : secret "isvc-xgboost-runtime-predictor-serving-cert" not found
Apr 20 21:02:04.229111 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.229089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c06bbd04-b229-4706-9ecc-517a50b28c39-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:02:04.229359 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.229337 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c06bbd04-b229-4706-9ecc-517a50b28c39-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:02:04.237222 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.237203 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvjdg\" (UniqueName: \"kubernetes.io/projected/c06bbd04-b229-4706-9ecc-517a50b28c39-kube-api-access-mvjdg\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:02:04.733081 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:04.733044 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c06bbd04-b229-4706-9ecc-517a50b28c39-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:02:04.733299 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:02:04.733190 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-serving-cert: secret "isvc-xgboost-runtime-predictor-serving-cert" not found
Apr 20 21:02:04.733299 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:02:04.733254 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c06bbd04-b229-4706-9ecc-517a50b28c39-proxy-tls podName:c06bbd04-b229-4706-9ecc-517a50b28c39 nodeName:}" failed. No retries permitted until 2026-04-20 21:02:05.733238792 +0000 UTC m=+3380.842294476 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c06bbd04-b229-4706-9ecc-517a50b28c39-proxy-tls") pod "isvc-xgboost-runtime-predictor-779db84d9-f7qk9" (UID: "c06bbd04-b229-4706-9ecc-517a50b28c39") : secret "isvc-xgboost-runtime-predictor-serving-cert" not found
Apr 20 21:02:05.742888 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:05.742851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c06bbd04-b229-4706-9ecc-517a50b28c39-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:02:05.745435 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:05.745412 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c06bbd04-b229-4706-9ecc-517a50b28c39-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-f7qk9\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:02:05.898891 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:05.898846 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:02:06.022628 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:06.022532 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"]
Apr 20 21:02:06.025695 ip-10-0-139-59 kubenswrapper[2574]: W0420 21:02:06.025654 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc06bbd04_b229_4706_9ecc_517a50b28c39.slice/crio-3f9667fb7f57fa4bf840d98a512f5e4b264a1e90e734e91c2351fb0fcc9987ca WatchSource:0}: Error finding container 3f9667fb7f57fa4bf840d98a512f5e4b264a1e90e734e91c2351fb0fcc9987ca: Status 404 returned error can't find the container with id 3f9667fb7f57fa4bf840d98a512f5e4b264a1e90e734e91c2351fb0fcc9987ca
Apr 20 21:02:06.027831 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:06.027814 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:02:06.955280 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:06.955224 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" event={"ID":"c06bbd04-b229-4706-9ecc-517a50b28c39","Type":"ContainerStarted","Data":"e4cd941dcc283359d16decd1b68b330fa0bc59fdf0264005b6c655b32eb58bc7"}
Apr 20 21:02:06.955280 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:06.955276 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" event={"ID":"c06bbd04-b229-4706-9ecc-517a50b28c39","Type":"ContainerStarted","Data":"3f9667fb7f57fa4bf840d98a512f5e4b264a1e90e734e91c2351fb0fcc9987ca"}
Apr 20 21:02:10.968164 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:10.968125 2574 generic.go:358] "Generic (PLEG): container finished" podID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerID="e4cd941dcc283359d16decd1b68b330fa0bc59fdf0264005b6c655b32eb58bc7" exitCode=0
Apr 20 21:02:10.968601 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:10.968198 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" event={"ID":"c06bbd04-b229-4706-9ecc-517a50b28c39","Type":"ContainerDied","Data":"e4cd941dcc283359d16decd1b68b330fa0bc59fdf0264005b6c655b32eb58bc7"}
Apr 20 21:02:11.973592 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:11.973558 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" event={"ID":"c06bbd04-b229-4706-9ecc-517a50b28c39","Type":"ContainerStarted","Data":"22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651"}
Apr 20 21:02:11.973592 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:11.973593 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" event={"ID":"c06bbd04-b229-4706-9ecc-517a50b28c39","Type":"ContainerStarted","Data":"9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e"}
Apr 20 21:02:11.974049 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:11.973879 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:02:11.974049 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:11.974026 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:02:11.975428 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:11.975402 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 20 21:02:11.992560 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:11.992504 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podStartSLOduration=7.992489053 podStartE2EDuration="7.992489053s" podCreationTimestamp="2026-04-20 21:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:02:11.990505428 +0000 UTC m=+3387.099561132" watchObservedRunningTime="2026-04-20 21:02:11.992489053 +0000 UTC m=+3387.101544760"
Apr 20 21:02:12.976838 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:12.976792 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 20 21:02:17.981739 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:17.981706 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:02:17.982333 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:17.982305 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 20 21:02:27.982910 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:27.982867 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 20 21:02:37.982952 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:37.982904 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 20 21:02:47.983237 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:47.983190 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 20 21:02:57.982927 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:02:57.982882 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 20 21:03:07.983077 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:07.983029 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 20 21:03:17.983299 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:17.983251 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:03:24.171125 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:24.171092 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"]
Apr 20 21:03:24.171633 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:24.171460 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container" containerID="cri-o://9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e" gracePeriod=30
Apr 20 21:03:24.171633 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:24.171521 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kube-rbac-proxy" containerID="cri-o://22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651" gracePeriod=30
Apr 20 21:03:25.198277 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:25.198235 2574 generic.go:358] "Generic (PLEG): container finished" podID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerID="22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651" exitCode=2
Apr 20 21:03:25.198665 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:25.198316 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" event={"ID":"c06bbd04-b229-4706-9ecc-517a50b28c39","Type":"ContainerDied","Data":"22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651"}
Apr 20 21:03:28.019721 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.019697 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:03:28.163788 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.163744 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvjdg\" (UniqueName: \"kubernetes.io/projected/c06bbd04-b229-4706-9ecc-517a50b28c39-kube-api-access-mvjdg\") pod \"c06bbd04-b229-4706-9ecc-517a50b28c39\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") "
Apr 20 21:03:28.163970 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.163858 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c06bbd04-b229-4706-9ecc-517a50b28c39-kserve-provision-location\") pod \"c06bbd04-b229-4706-9ecc-517a50b28c39\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") "
Apr 20 21:03:28.163970 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.163890 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c06bbd04-b229-4706-9ecc-517a50b28c39-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"c06bbd04-b229-4706-9ecc-517a50b28c39\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") "
Apr 20 21:03:28.163970 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.163927 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c06bbd04-b229-4706-9ecc-517a50b28c39-proxy-tls\") pod \"c06bbd04-b229-4706-9ecc-517a50b28c39\" (UID: \"c06bbd04-b229-4706-9ecc-517a50b28c39\") "
Apr 20 21:03:28.164199 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.164173 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c06bbd04-b229-4706-9ecc-517a50b28c39-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "c06bbd04-b229-4706-9ecc-517a50b28c39" (UID: "c06bbd04-b229-4706-9ecc-517a50b28c39"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:03:28.164350 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.164205 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c06bbd04-b229-4706-9ecc-517a50b28c39-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c06bbd04-b229-4706-9ecc-517a50b28c39" (UID: "c06bbd04-b229-4706-9ecc-517a50b28c39"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:03:28.166086 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.166064 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c06bbd04-b229-4706-9ecc-517a50b28c39-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c06bbd04-b229-4706-9ecc-517a50b28c39" (UID: "c06bbd04-b229-4706-9ecc-517a50b28c39"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:03:28.166086 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.166066 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c06bbd04-b229-4706-9ecc-517a50b28c39-kube-api-access-mvjdg" (OuterVolumeSpecName: "kube-api-access-mvjdg") pod "c06bbd04-b229-4706-9ecc-517a50b28c39" (UID: "c06bbd04-b229-4706-9ecc-517a50b28c39"). InnerVolumeSpecName "kube-api-access-mvjdg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:03:28.208499 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.208463 2574 generic.go:358] "Generic (PLEG): container finished" podID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerID="9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e" exitCode=0
Apr 20 21:03:28.208659 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.208553 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"
Apr 20 21:03:28.208659 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.208548 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" event={"ID":"c06bbd04-b229-4706-9ecc-517a50b28c39","Type":"ContainerDied","Data":"9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e"}
Apr 20 21:03:28.208733 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.208667 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" event={"ID":"c06bbd04-b229-4706-9ecc-517a50b28c39","Type":"ContainerDied","Data":"3f9667fb7f57fa4bf840d98a512f5e4b264a1e90e734e91c2351fb0fcc9987ca"}
Apr 20 21:03:28.208733 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.208684 2574 scope.go:117] "RemoveContainer" containerID="22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651"
Apr 20 21:03:28.221156 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.218971 2574 scope.go:117] "RemoveContainer" containerID="9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e"
Apr 20 21:03:28.226968 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.226944 2574 scope.go:117] "RemoveContainer" containerID="e4cd941dcc283359d16decd1b68b330fa0bc59fdf0264005b6c655b32eb58bc7"
Apr 20 21:03:28.232254 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.232230 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"]
Apr 20 21:03:28.235062 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.235039 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9"]
Apr 20 21:03:28.235613 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.235595 2574 scope.go:117] "RemoveContainer" containerID="22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651"
Apr 20 21:03:28.235872 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:03:28.235853 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651\": container with ID starting with 22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651 not found: ID does not exist" containerID="22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651"
Apr 20 21:03:28.235921 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.235881 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651"} err="failed to get container status \"22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651\": rpc error: code = NotFound desc = could not find container \"22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651\": container with ID starting with 22df9d53262852a6382a0ccfdf2d2bfe56761a4b8463b4f5a0ff9e651d61b651 not found: ID does not exist"
Apr 20 21:03:28.235921 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.235900 2574 scope.go:117] "RemoveContainer" containerID="9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e"
Apr 20 21:03:28.236151 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:03:28.236137 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e\": container with ID starting with 9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e not found: ID does not exist" containerID="9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e"
Apr 20 21:03:28.236212 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.236153 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e"} err="failed to get container status \"9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e\": rpc error: code = NotFound desc = could not find container \"9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e\": container with ID starting with 9a31813c7fa849f7fc9cbef8b41637b519c31715ac45924fac0e37a51d89706e not found: ID does not exist"
Apr 20 21:03:28.236212 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.236165 2574 scope.go:117] "RemoveContainer" containerID="e4cd941dcc283359d16decd1b68b330fa0bc59fdf0264005b6c655b32eb58bc7"
Apr 20 21:03:28.236401 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:03:28.236383 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4cd941dcc283359d16decd1b68b330fa0bc59fdf0264005b6c655b32eb58bc7\": container with ID starting with e4cd941dcc283359d16decd1b68b330fa0bc59fdf0264005b6c655b32eb58bc7 not found: ID does not exist" containerID="e4cd941dcc283359d16decd1b68b330fa0bc59fdf0264005b6c655b32eb58bc7"
Apr 20 21:03:28.236450 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.236410 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4cd941dcc283359d16decd1b68b330fa0bc59fdf0264005b6c655b32eb58bc7"} err="failed to get container status \"e4cd941dcc283359d16decd1b68b330fa0bc59fdf0264005b6c655b32eb58bc7\": rpc error: code = NotFound desc = could not find container \"e4cd941dcc283359d16decd1b68b330fa0bc59fdf0264005b6c655b32eb58bc7\": container with ID starting with e4cd941dcc283359d16decd1b68b330fa0bc59fdf0264005b6c655b32eb58bc7 not found: ID does not exist"
Apr 20 21:03:28.265552 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.265502 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c06bbd04-b229-4706-9ecc-517a50b28c39-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 21:03:28.265552 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.265544 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c06bbd04-b229-4706-9ecc-517a50b28c39-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 21:03:28.265552 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.265557 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c06bbd04-b229-4706-9ecc-517a50b28c39-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 21:03:28.265552 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.265568 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mvjdg\" (UniqueName: \"kubernetes.io/projected/c06bbd04-b229-4706-9ecc-517a50b28c39-kube-api-access-mvjdg\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\""
Apr 20 21:03:28.976961 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.976909 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Apr 20 21:03:28.983159 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:28.983129 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-f7qk9" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: i/o timeout"
Apr 20 21:03:29.483867 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:03:29.483835 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" path="/var/lib/kubelet/pods/c06bbd04-b229-4706-9ecc-517a50b28c39/volumes"
Apr 20 21:04:24.463734 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.463693 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"]
Apr 20 21:04:24.464300 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.464087 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="storage-initializer"
Apr 20 21:04:24.464300 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.464101 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="storage-initializer"
Apr 20 21:04:24.464300 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.464118 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kube-rbac-proxy"
Apr 20 21:04:24.464300 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.464125 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kube-rbac-proxy"
Apr 20 21:04:24.464300 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.464134 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container"
Apr 20 21:04:24.464300 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.464141 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container"
Apr 20 21:04:24.464300 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.464208 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kube-rbac-proxy"
Apr 20 21:04:24.464300 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.464218 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c06bbd04-b229-4706-9ecc-517a50b28c39" containerName="kserve-container"
Apr 20 21:04:24.467679 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.467661 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.470404 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.470371 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 20 21:04:24.470566 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.470437 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\""
Apr 20 21:04:24.470566 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.470541 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 20 21:04:24.471142 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.471128 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\""
Apr 20 21:04:24.471193 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.471154 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\""
Apr 20 21:04:24.475968 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.475945 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"]
Apr 20 21:04:24.547620 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.547587 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/396e1151-9dc2-4962-89dd-d0484302d6af-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-bt76w\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.547821 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.547631 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5tj9\" (UniqueName: \"kubernetes.io/projected/396e1151-9dc2-4962-89dd-d0484302d6af-kube-api-access-j5tj9\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-bt76w\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.547821 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.547698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/396e1151-9dc2-4962-89dd-d0484302d6af-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-bt76w\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.547821 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.547756 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/396e1151-9dc2-4962-89dd-d0484302d6af-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-bt76w\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.648460 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.648422 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/396e1151-9dc2-4962-89dd-d0484302d6af-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-bt76w\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.648460 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.648462 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5tj9\" (UniqueName: \"kubernetes.io/projected/396e1151-9dc2-4962-89dd-d0484302d6af-kube-api-access-j5tj9\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-bt76w\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.648729 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.648494 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/396e1151-9dc2-4962-89dd-d0484302d6af-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-bt76w\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.648729 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.648529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/396e1151-9dc2-4962-89dd-d0484302d6af-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-bt76w\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.649001 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.648971 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/396e1151-9dc2-4962-89dd-d0484302d6af-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-bt76w\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.649188 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.649168 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/396e1151-9dc2-4962-89dd-d0484302d6af-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-bt76w\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.650986 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.650968 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/396e1151-9dc2-4962-89dd-d0484302d6af-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-bt76w\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.656949 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.656922 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5tj9\" (UniqueName: \"kubernetes.io/projected/396e1151-9dc2-4962-89dd-d0484302d6af-kube-api-access-j5tj9\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-bt76w\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.779199 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.779097 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:24.908857 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:24.908794 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"]
Apr 20 21:04:24.912563 ip-10-0-139-59 kubenswrapper[2574]: W0420 21:04:24.912533 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod396e1151_9dc2_4962_89dd_d0484302d6af.slice/crio-afa5c891399525843b5e65bb33c27b4b35fd2c2e1bbb30232f72f11ab301421f WatchSource:0}: Error finding container afa5c891399525843b5e65bb33c27b4b35fd2c2e1bbb30232f72f11ab301421f: Status 404 returned error can't find the container with id afa5c891399525843b5e65bb33c27b4b35fd2c2e1bbb30232f72f11ab301421f
Apr 20 21:04:25.384374 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:25.384337 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" event={"ID":"396e1151-9dc2-4962-89dd-d0484302d6af","Type":"ContainerStarted","Data":"b4ee99383c288d3af3af6ec190b59eed3ab403f6314553ff4232c07cd4b21af6"}
Apr 20 21:04:25.384374 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:25.384378 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" event={"ID":"396e1151-9dc2-4962-89dd-d0484302d6af","Type":"ContainerStarted","Data":"afa5c891399525843b5e65bb33c27b4b35fd2c2e1bbb30232f72f11ab301421f"}
Apr 20 21:04:29.398477 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:29.398447 2574 generic.go:358] "Generic (PLEG): container finished" podID="396e1151-9dc2-4962-89dd-d0484302d6af" containerID="b4ee99383c288d3af3af6ec190b59eed3ab403f6314553ff4232c07cd4b21af6" exitCode=0
Apr 20 21:04:29.398848 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:29.398517 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" event={"ID":"396e1151-9dc2-4962-89dd-d0484302d6af","Type":"ContainerDied","Data":"b4ee99383c288d3af3af6ec190b59eed3ab403f6314553ff4232c07cd4b21af6"}
Apr 20 21:04:30.403067 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:30.403031 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" event={"ID":"396e1151-9dc2-4962-89dd-d0484302d6af","Type":"ContainerStarted","Data":"5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772"}
Apr 20 21:04:30.403572 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:30.403074 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" event={"ID":"396e1151-9dc2-4962-89dd-d0484302d6af","Type":"ContainerStarted","Data":"febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc"}
Apr 20 21:04:30.403572 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:30.403299 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:30.420958 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:30.420912 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podStartSLOduration=6.420897122 podStartE2EDuration="6.420897122s" podCreationTimestamp="2026-04-20 21:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:04:30.419980938 +0000 UTC m=+3525.529036646" watchObservedRunningTime="2026-04-20 21:04:30.420897122 +0000 UTC m=+3525.529952831"
Apr 20 21:04:31.406683 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:31.406656 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:31.407900 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:31.407869 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 20 21:04:32.410360 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:32.410323 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 20 21:04:37.415195 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:37.415158 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"
Apr 20 21:04:37.415730 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:37.415704 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 20 21:04:47.415837 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:47.415798 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 20 21:04:57.416681 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:04:57.416635 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container"
probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 20 21:05:07.415966 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:07.415928 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 20 21:05:17.415728 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:17.415683 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 20 21:05:27.415804 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:27.415763 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 20 21:05:37.416553 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:37.416520 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" Apr 20 21:05:44.532198 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:44.532160 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"] Apr 20 21:05:44.532692 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:44.532600 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container" 
containerID="cri-o://febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc" gracePeriod=30 Apr 20 21:05:44.532766 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:44.532661 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kube-rbac-proxy" containerID="cri-o://5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772" gracePeriod=30 Apr 20 21:05:45.633039 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:45.633006 2574 generic.go:358] "Generic (PLEG): container finished" podID="396e1151-9dc2-4962-89dd-d0484302d6af" containerID="5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772" exitCode=2 Apr 20 21:05:45.633432 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:45.633084 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" event={"ID":"396e1151-9dc2-4962-89dd-d0484302d6af","Type":"ContainerDied","Data":"5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772"} Apr 20 21:05:47.411487 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:47.411441 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.41:8643/healthz\": dial tcp 10.134.0.41:8643: connect: connection refused" Apr 20 21:05:47.415820 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:47.415788 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 20 21:05:48.279858 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.279835 
2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" Apr 20 21:05:48.377336 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.377221 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/396e1151-9dc2-4962-89dd-d0484302d6af-proxy-tls\") pod \"396e1151-9dc2-4962-89dd-d0484302d6af\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " Apr 20 21:05:48.377336 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.377323 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/396e1151-9dc2-4962-89dd-d0484302d6af-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"396e1151-9dc2-4962-89dd-d0484302d6af\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " Apr 20 21:05:48.377612 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.377419 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/396e1151-9dc2-4962-89dd-d0484302d6af-kserve-provision-location\") pod \"396e1151-9dc2-4962-89dd-d0484302d6af\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " Apr 20 21:05:48.377612 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.377468 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5tj9\" (UniqueName: \"kubernetes.io/projected/396e1151-9dc2-4962-89dd-d0484302d6af-kube-api-access-j5tj9\") pod \"396e1151-9dc2-4962-89dd-d0484302d6af\" (UID: \"396e1151-9dc2-4962-89dd-d0484302d6af\") " Apr 20 21:05:48.377728 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.377671 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/396e1151-9dc2-4962-89dd-d0484302d6af-isvc-xgboost-v2-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "396e1151-9dc2-4962-89dd-d0484302d6af" (UID: "396e1151-9dc2-4962-89dd-d0484302d6af"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:05:48.377728 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.377703 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396e1151-9dc2-4962-89dd-d0484302d6af-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "396e1151-9dc2-4962-89dd-d0484302d6af" (UID: "396e1151-9dc2-4962-89dd-d0484302d6af"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:05:48.379680 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.379651 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/396e1151-9dc2-4962-89dd-d0484302d6af-kube-api-access-j5tj9" (OuterVolumeSpecName: "kube-api-access-j5tj9") pod "396e1151-9dc2-4962-89dd-d0484302d6af" (UID: "396e1151-9dc2-4962-89dd-d0484302d6af"). InnerVolumeSpecName "kube-api-access-j5tj9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:05:48.379767 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.379658 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396e1151-9dc2-4962-89dd-d0484302d6af-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "396e1151-9dc2-4962-89dd-d0484302d6af" (UID: "396e1151-9dc2-4962-89dd-d0484302d6af"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:05:48.478654 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.478617 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/396e1151-9dc2-4962-89dd-d0484302d6af-proxy-tls\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 21:05:48.478654 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.478650 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/396e1151-9dc2-4962-89dd-d0484302d6af-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 21:05:48.478654 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.478661 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/396e1151-9dc2-4962-89dd-d0484302d6af-kserve-provision-location\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 21:05:48.479111 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.478671 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j5tj9\" (UniqueName: \"kubernetes.io/projected/396e1151-9dc2-4962-89dd-d0484302d6af-kube-api-access-j5tj9\") on node \"ip-10-0-139-59.ec2.internal\" DevicePath \"\"" Apr 20 21:05:48.533390 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.533351 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 21:05:48.538191 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.538167 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 21:05:48.539937 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.539916 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 21:05:48.544412 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.544395 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 21:05:48.644108 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.644024 2574 generic.go:358] "Generic (PLEG): container finished" podID="396e1151-9dc2-4962-89dd-d0484302d6af" containerID="febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc" exitCode=0 Apr 20 21:05:48.644332 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.644097 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" event={"ID":"396e1151-9dc2-4962-89dd-d0484302d6af","Type":"ContainerDied","Data":"febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc"} Apr 20 21:05:48.644332 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.644143 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" event={"ID":"396e1151-9dc2-4962-89dd-d0484302d6af","Type":"ContainerDied","Data":"afa5c891399525843b5e65bb33c27b4b35fd2c2e1bbb30232f72f11ab301421f"} Apr 20 21:05:48.644332 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.644160 2574 scope.go:117] "RemoveContainer" containerID="5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772" Apr 20 21:05:48.644332 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.644113 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w" Apr 20 21:05:48.653427 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.653407 2574 scope.go:117] "RemoveContainer" containerID="febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc" Apr 20 21:05:48.663378 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.663349 2574 scope.go:117] "RemoveContainer" containerID="b4ee99383c288d3af3af6ec190b59eed3ab403f6314553ff4232c07cd4b21af6" Apr 20 21:05:48.665799 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.665776 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"] Apr 20 21:05:48.669586 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.669563 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-bt76w"] Apr 20 21:05:48.672331 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.672315 2574 scope.go:117] "RemoveContainer" containerID="5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772" Apr 20 21:05:48.672600 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:05:48.672582 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772\": container with ID starting with 5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772 not found: ID does not exist" containerID="5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772" Apr 20 21:05:48.672649 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.672608 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772"} err="failed to get container status \"5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772\": rpc error: code = NotFound desc = could not find container 
\"5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772\": container with ID starting with 5ad3b1343acbba95c8436a32ac0687dcf1b6c6fcb5fdcf3075cbd3a786331772 not found: ID does not exist" Apr 20 21:05:48.672649 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.672627 2574 scope.go:117] "RemoveContainer" containerID="febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc" Apr 20 21:05:48.672840 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:05:48.672825 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc\": container with ID starting with febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc not found: ID does not exist" containerID="febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc" Apr 20 21:05:48.672879 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.672844 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc"} err="failed to get container status \"febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc\": rpc error: code = NotFound desc = could not find container \"febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc\": container with ID starting with febcfb5cd9c9a6f8d6af3414d67973deecd41b03329ab0c3c308375117c1e1bc not found: ID does not exist" Apr 20 21:05:48.672879 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.672857 2574 scope.go:117] "RemoveContainer" containerID="b4ee99383c288d3af3af6ec190b59eed3ab403f6314553ff4232c07cd4b21af6" Apr 20 21:05:48.673074 ip-10-0-139-59 kubenswrapper[2574]: E0420 21:05:48.673058 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4ee99383c288d3af3af6ec190b59eed3ab403f6314553ff4232c07cd4b21af6\": container with ID starting with 
b4ee99383c288d3af3af6ec190b59eed3ab403f6314553ff4232c07cd4b21af6 not found: ID does not exist" containerID="b4ee99383c288d3af3af6ec190b59eed3ab403f6314553ff4232c07cd4b21af6" Apr 20 21:05:48.673120 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:48.673075 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ee99383c288d3af3af6ec190b59eed3ab403f6314553ff4232c07cd4b21af6"} err="failed to get container status \"b4ee99383c288d3af3af6ec190b59eed3ab403f6314553ff4232c07cd4b21af6\": rpc error: code = NotFound desc = could not find container \"b4ee99383c288d3af3af6ec190b59eed3ab403f6314553ff4232c07cd4b21af6\": container with ID starting with b4ee99383c288d3af3af6ec190b59eed3ab403f6314553ff4232c07cd4b21af6 not found: ID does not exist" Apr 20 21:05:49.483026 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:05:49.482986 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" path="/var/lib/kubelet/pods/396e1151-9dc2-4962-89dd-d0484302d6af/volumes" Apr 20 21:10:48.556116 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:10:48.556087 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 21:10:48.562518 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:10:48.562494 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 21:10:48.563019 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:10:48.562998 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 21:10:48.568291 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:10:48.568255 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log" Apr 20 21:12:02.874275 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:02.874221 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-kfprs_82247d4d-96be-48e7-adfa-641c9ae39250/global-pull-secret-syncer/0.log" Apr 20 21:12:02.932499 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:02.932474 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2chhh_3e016fd4-9b61-4ff9-bcfa-b119516354d0/konnectivity-agent/0.log" Apr 20 21:12:03.025960 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:03.025924 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-59.ec2.internal_ba01067d8d4b51ceee17bbe157f40932/haproxy/0.log" Apr 20 21:12:06.248774 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.248745 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_efc7d93f-75ee-4556-990b-07aebff23d49/alertmanager/0.log" Apr 20 21:12:06.270926 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.270905 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_efc7d93f-75ee-4556-990b-07aebff23d49/config-reloader/0.log" Apr 20 21:12:06.292788 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.292762 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_efc7d93f-75ee-4556-990b-07aebff23d49/kube-rbac-proxy-web/0.log" Apr 20 21:12:06.316664 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.316635 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_efc7d93f-75ee-4556-990b-07aebff23d49/kube-rbac-proxy/0.log" Apr 20 21:12:06.341868 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.341841 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_efc7d93f-75ee-4556-990b-07aebff23d49/kube-rbac-proxy-metric/0.log" Apr 20 21:12:06.364943 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.364918 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_efc7d93f-75ee-4556-990b-07aebff23d49/prom-label-proxy/0.log" Apr 20 21:12:06.386215 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.386189 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_efc7d93f-75ee-4556-990b-07aebff23d49/init-config-reloader/0.log" Apr 20 21:12:06.428467 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.428437 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-w5vnd_05d00a46-2c87-449e-b9c2-6274c763b555/cluster-monitoring-operator/0.log" Apr 20 21:12:06.458749 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.458685 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tjkrs_78330f86-de48-4277-bc19-bbb785bcaddc/kube-state-metrics/0.log" Apr 20 21:12:06.484146 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.484119 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tjkrs_78330f86-de48-4277-bc19-bbb785bcaddc/kube-rbac-proxy-main/0.log" Apr 20 21:12:06.505399 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.505376 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tjkrs_78330f86-de48-4277-bc19-bbb785bcaddc/kube-rbac-proxy-self/0.log" Apr 20 21:12:06.750768 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.750686 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-stbdv_6d7f61cc-34e8-4462-a4cb-828621ebb984/node-exporter/0.log" Apr 20 21:12:06.773144 ip-10-0-139-59 
kubenswrapper[2574]: I0420 21:12:06.773113 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-stbdv_6d7f61cc-34e8-4462-a4cb-828621ebb984/kube-rbac-proxy/0.log" Apr 20 21:12:06.798119 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.798090 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-stbdv_6d7f61cc-34e8-4462-a4cb-828621ebb984/init-textfile/0.log" Apr 20 21:12:06.826155 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.826132 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vn6pg_b936bdf7-2a5b-40cc-94f4-ef8998a14ad0/kube-rbac-proxy-main/0.log" Apr 20 21:12:06.849708 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.849684 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vn6pg_b936bdf7-2a5b-40cc-94f4-ef8998a14ad0/kube-rbac-proxy-self/0.log" Apr 20 21:12:06.876893 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:06.876859 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vn6pg_b936bdf7-2a5b-40cc-94f4-ef8998a14ad0/openshift-state-metrics/0.log" Apr 20 21:12:07.169015 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:07.168986 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5897f45db7-csbkg_8214ca68-6496-4e90-ad00-c38797eedbf1/telemeter-client/0.log" Apr 20 21:12:07.195144 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:07.195116 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5897f45db7-csbkg_8214ca68-6496-4e90-ad00-c38797eedbf1/reload/0.log" Apr 20 21:12:07.221973 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:07.221950 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-5897f45db7-csbkg_8214ca68-6496-4e90-ad00-c38797eedbf1/kube-rbac-proxy/0.log" Apr 20 21:12:09.092438 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.092367 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/1.log" Apr 20 21:12:09.097143 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.097125 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-blpbt_5bf2bb25-bfba-4f17-b4fc-7607da4bb789/console-operator/2.log" Apr 20 21:12:09.522718 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.522688 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-q8x64_bba62270-eaf0-456e-8beb-0c7e16c16c44/download-server/0.log" Apr 20 21:12:09.715616 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.715584 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"] Apr 20 21:12:09.715913 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.715901 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container" Apr 20 21:12:09.715962 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.715916 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container" Apr 20 21:12:09.715962 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.715927 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kube-rbac-proxy" Apr 20 21:12:09.715962 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.715933 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" 
containerName="kube-rbac-proxy"
Apr 20 21:12:09.715962 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.715951 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="storage-initializer"
Apr 20 21:12:09.715962 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.715957 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="storage-initializer"
Apr 20 21:12:09.716128 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.716004 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kube-rbac-proxy"
Apr 20 21:12:09.716128 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.716021 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="396e1151-9dc2-4962-89dd-d0484302d6af" containerName="kserve-container"
Apr 20 21:12:09.719046 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.719026 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.721455 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.721434 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qsjk5\"/\"default-dockercfg-nn688\""
Apr 20 21:12:09.722217 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.722199 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qsjk5\"/\"openshift-service-ca.crt\""
Apr 20 21:12:09.722334 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.722217 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qsjk5\"/\"kube-root-ca.crt\""
Apr 20 21:12:09.731621 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.731600 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"]
Apr 20 21:12:09.776242 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.776152 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8228a8a9-2cfb-44df-a45a-a4f014a892a6-sys\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.776242 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.776188 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8228a8a9-2cfb-44df-a45a-a4f014a892a6-lib-modules\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.776242 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.776209 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b8r2\" (UniqueName: \"kubernetes.io/projected/8228a8a9-2cfb-44df-a45a-a4f014a892a6-kube-api-access-8b8r2\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.776470 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.776281 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8228a8a9-2cfb-44df-a45a-a4f014a892a6-proc\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.776470 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.776298 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8228a8a9-2cfb-44df-a45a-a4f014a892a6-podres\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.877247 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.877203 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8228a8a9-2cfb-44df-a45a-a4f014a892a6-sys\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.877247 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.877250 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8228a8a9-2cfb-44df-a45a-a4f014a892a6-lib-modules\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.877507 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.877291 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b8r2\" (UniqueName: \"kubernetes.io/projected/8228a8a9-2cfb-44df-a45a-a4f014a892a6-kube-api-access-8b8r2\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.877507 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.877328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8228a8a9-2cfb-44df-a45a-a4f014a892a6-proc\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.877507 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.877336 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8228a8a9-2cfb-44df-a45a-a4f014a892a6-sys\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.877507 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.877349 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8228a8a9-2cfb-44df-a45a-a4f014a892a6-podres\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.877507 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.877419 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8228a8a9-2cfb-44df-a45a-a4f014a892a6-proc\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.877507 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.877438 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8228a8a9-2cfb-44df-a45a-a4f014a892a6-podres\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.877507 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.877451 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8228a8a9-2cfb-44df-a45a-a4f014a892a6-lib-modules\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.887240 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.887211 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b8r2\" (UniqueName: \"kubernetes.io/projected/8228a8a9-2cfb-44df-a45a-a4f014a892a6-kube-api-access-8b8r2\") pod \"perf-node-gather-daemonset-h9v8m\" (UID: \"8228a8a9-2cfb-44df-a45a-a4f014a892a6\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:09.964248 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:09.964215 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-gv5n7_5e691ef5-adc1-46b2-bcdb-be0d299cad21/volume-data-source-validator/0.log"
Apr 20 21:12:10.028806 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:10.028728 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:10.148006 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:10.147978 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"]
Apr 20 21:12:10.150253 ip-10-0-139-59 kubenswrapper[2574]: W0420 21:12:10.150226 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8228a8a9_2cfb_44df_a45a_a4f014a892a6.slice/crio-bdd1e4ec4e356d27b9cc5ddfbf8f7f0006065a81d409ab33d1bcef2777cfeea5 WatchSource:0}: Error finding container bdd1e4ec4e356d27b9cc5ddfbf8f7f0006065a81d409ab33d1bcef2777cfeea5: Status 404 returned error can't find the container with id bdd1e4ec4e356d27b9cc5ddfbf8f7f0006065a81d409ab33d1bcef2777cfeea5
Apr 20 21:12:10.151808 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:10.151790 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:12:10.648597 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:10.648548 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jxlts_a53459d8-2c1c-4399-801a-d69f56977702/dns/0.log"
Apr 20 21:12:10.677141 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:10.677117 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jxlts_a53459d8-2c1c-4399-801a-d69f56977702/kube-rbac-proxy/0.log"
Apr 20 21:12:10.767089 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:10.767054 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m" event={"ID":"8228a8a9-2cfb-44df-a45a-a4f014a892a6","Type":"ContainerStarted","Data":"0acfb3cbf2e738b7d4ddea186321e28375a6a424dd68baf782c28392194dfb5e"}
Apr 20 21:12:10.767089 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:10.767091 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m" event={"ID":"8228a8a9-2cfb-44df-a45a-a4f014a892a6","Type":"ContainerStarted","Data":"bdd1e4ec4e356d27b9cc5ddfbf8f7f0006065a81d409ab33d1bcef2777cfeea5"}
Apr 20 21:12:10.767322 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:10.767124 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:10.788172 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:10.788001 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m" podStartSLOduration=1.7879866930000001 podStartE2EDuration="1.787986693s" podCreationTimestamp="2026-04-20 21:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:12:10.787462015 +0000 UTC m=+3985.896517722" watchObservedRunningTime="2026-04-20 21:12:10.787986693 +0000 UTC m=+3985.897042400"
Apr 20 21:12:10.861033 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:10.861002 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mt4p7_0d4eff69-9408-40ee-8a0b-0bbf888c3b7d/dns-node-resolver/0.log"
Apr 20 21:12:11.325962 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:11.325929 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6fb58fbcf4-lg4j5_994894c1-acec-4952-8ffb-c3f8e2741554/registry/0.log"
Apr 20 21:12:11.382583 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:11.382558 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lwzcq_801646f7-e8a7-4096-8154-5d74d8e4778d/node-ca/0.log"
Apr 20 21:12:12.088534 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:12.088508 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-57f8d89fbb-xxxkj_e7689c3e-44d1-4487-8fd4-e0ab1160ca1d/router/0.log"
Apr 20 21:12:12.398426 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:12.398395 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-q92l2_d2fc277b-80e6-4be4-a366-c742f661aa43/serve-healthcheck-canary/0.log"
Apr 20 21:12:12.779584 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:12.779503 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-7pjcj_48d8dc00-2133-4e50-9e06-45cb14a568c8/insights-operator/1.log"
Apr 20 21:12:12.779761 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:12.779702 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-7pjcj_48d8dc00-2133-4e50-9e06-45cb14a568c8/insights-operator/0.log"
Apr 20 21:12:12.878343 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:12.878314 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9v4hs_0307fec1-4aab-49c3-8677-c695c3f01c6a/kube-rbac-proxy/0.log"
Apr 20 21:12:12.915188 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:12.915162 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9v4hs_0307fec1-4aab-49c3-8677-c695c3f01c6a/exporter/0.log"
Apr 20 21:12:12.942230 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:12.942202 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9v4hs_0307fec1-4aab-49c3-8677-c695c3f01c6a/extractor/0.log"
Apr 20 21:12:14.961608 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:14.961578 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6f655776dd-d9cwf_fcacfdd2-50a1-4d88-b72f-c1de2da9cad6/manager/0.log"
Apr 20 21:12:14.987895 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:14.987871 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-8f9fc_6fef7ec2-36fc-4184-b7f1-0b2c5cbbc91f/manager/0.log"
Apr 20 21:12:15.296496 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:15.296418 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-659fp_01fd8c5f-b99c-42a6-b588-dd208b4c22de/s3-tls-init-serving/0.log"
Apr 20 21:12:16.780803 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:16.780775 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-h9v8m"
Apr 20 21:12:20.637484 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:20.637452 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqc62_0389bcb6-aeb1-4436-a30f-8c26b9b175a1/kube-multus-additional-cni-plugins/0.log"
Apr 20 21:12:20.658731 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:20.658706 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqc62_0389bcb6-aeb1-4436-a30f-8c26b9b175a1/egress-router-binary-copy/0.log"
Apr 20 21:12:20.683277 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:20.683240 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqc62_0389bcb6-aeb1-4436-a30f-8c26b9b175a1/cni-plugins/0.log"
Apr 20 21:12:20.722603 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:20.722576 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqc62_0389bcb6-aeb1-4436-a30f-8c26b9b175a1/bond-cni-plugin/0.log"
Apr 20 21:12:20.742962 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:20.742946 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqc62_0389bcb6-aeb1-4436-a30f-8c26b9b175a1/routeoverride-cni/0.log"
Apr 20 21:12:20.763613 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:20.763588 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqc62_0389bcb6-aeb1-4436-a30f-8c26b9b175a1/whereabouts-cni-bincopy/0.log"
Apr 20 21:12:20.783159 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:20.783139 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqc62_0389bcb6-aeb1-4436-a30f-8c26b9b175a1/whereabouts-cni/0.log"
Apr 20 21:12:20.977922 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:20.977898 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kgzdk_7e32072c-26b6-4466-b63a-3602df3f45f5/kube-multus/0.log"
Apr 20 21:12:21.083477 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:21.083445 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kzfxf_89492d29-88c3-44e3-adc2-eda0304a1081/network-metrics-daemon/0.log"
Apr 20 21:12:21.107767 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:21.107729 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kzfxf_89492d29-88c3-44e3-adc2-eda0304a1081/kube-rbac-proxy/0.log"
Apr 20 21:12:22.122855 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:22.122822 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-controller/0.log"
Apr 20 21:12:22.140618 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:22.140596 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/0.log"
Apr 20 21:12:22.157538 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:22.157519 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovn-acl-logging/1.log"
Apr 20 21:12:22.182457 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:22.182435 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/kube-rbac-proxy-node/0.log"
Apr 20 21:12:22.204447 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:22.204431 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 21:12:22.221590 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:22.221573 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/northd/0.log"
Apr 20 21:12:22.242044 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:22.242025 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/nbdb/0.log"
Apr 20 21:12:22.265528 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:22.265506 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/sbdb/0.log"
Apr 20 21:12:22.367570 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:22.367543 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fzv6_5beb43cd-4e2d-436b-baba-c0a6aac03ff2/ovnkube-controller/0.log"
Apr 20 21:12:23.699068 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:23.699038 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-j74qz_cfcf938f-bded-4945-9620-473df1d1b448/network-check-target-container/0.log"
Apr 20 21:12:24.659209 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:24.659181 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-t2pv6_a364d155-96d2-457b-8944-52e7438d87e8/iptables-alerter/0.log"
Apr 20 21:12:25.276375 ip-10-0-139-59 kubenswrapper[2574]: I0420 21:12:25.276346 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-9dtpd_cf736813-f1cd-49b9-8595-27549c6fa9cb/tuned/0.log"