Apr 24 21:26:54.579235 ip-10-0-136-201 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.087457    2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090647    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090659    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090662    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090666    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090669    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090672    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090675    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:55.275733 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090678    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:55.204873 ip-10-0-136-201 systemd[1]: Started Kubernetes Kubelet.
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090681    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090684    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090687    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090689    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090693    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090696    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090699    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090702    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090704    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090707    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090710    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090712    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090715    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090717    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090720    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090723    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090726    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090729    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090731    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:55.290838 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090735    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090737    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090740    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090743    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090746    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090749    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090752    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090755    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090758    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090760    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090763    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090766    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090768    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090771    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090773    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090776    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090778    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090781    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090784    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:55.291695 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090786    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090789    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090791    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090794    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090796    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090801    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090804    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090807    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090810    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090813    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090816    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090818    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090821    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090825    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090829    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090831    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090834    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090837    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090840    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:55.292372 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090842    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090845    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090848    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090853    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090856    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090859    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090862    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090865    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090867    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090870    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090872    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090875    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090877    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090880    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090882    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090885    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090888    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090890    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090893    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090896    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:55.293085 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.090898    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092446    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092458    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092462    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092466    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092469    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092477    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092480    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092483    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092486    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092489    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092492    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092495    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092498    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092502    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092505    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092508    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092511    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092517    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092520    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:55.293692 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092522    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092525    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092528    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092531    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092534    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092537    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092540    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092543    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092546    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092549    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092554    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092557    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092560    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092563    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092566    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092568    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092571    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092574    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092577    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:55.294577 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092580    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092582    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092585    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092588    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092593    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092596    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092598    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092601    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092604    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092608    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092611    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092614    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092616    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092619    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092622    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092626    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092631    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092634    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092637    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092640    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:55.295424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092642    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092645    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092648    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092650    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092653    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092656    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092658    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092663    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092666    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092672    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092678    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092681    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092684    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092687    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092691    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092695    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092698    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092701    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092704    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:55.296353 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092706    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092712    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092715    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092718    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092721    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092724    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092727    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092730    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.092732    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095134    2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095151    2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095160    2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095165    2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095170    2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095173    2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095178    2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095183    2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095186    2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095189    2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095193    2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095196    2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095199    2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:26:55.297034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095202    2573 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095205    2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095208    2573 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095211    2573 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095214    2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095217    2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095226    2573 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095229    2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095232    2573 flags.go:64] FLAG: --config-dir=""
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095235    2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095239    2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095243    2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095247    2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095251    2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095255    2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095258    2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095261    2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095265    2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095268    2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095271    2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095276    2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095279    2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095282    2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095284    2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095287    2573 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:26:55.297860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095290    2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095297    2573 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095300    2573 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095304    2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095307    2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095310    2573 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095314    2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095317    2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095320    2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095323    2573 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095326    2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095328    2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095332    2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095335    2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095337    2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095340    2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095343    2573 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095347    2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095350    2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 21:26:55.298613 ip-10-0-136-201
kubenswrapper[2573]: I0424 21:26:55.095353 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095358 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095361 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095364 2573 flags.go:64] FLAG: --help="false" Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095367 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095371 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:26:55.298613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095374 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095377 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095381 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095385 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095388 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095391 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095394 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095397 2573 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095400 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095404 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095407 2573 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095410 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095412 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095416 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095418 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095421 2573 flags.go:64] FLAG: --lock-file="" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095424 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095427 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095430 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095435 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095438 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095441 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 21:26:55.095444 2573 flags.go:64] FLAG: --logging-format="text" Apr 24 21:26:55.300761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095447 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095450 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095453 2573 flags.go:64] FLAG: --manifest-url="" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095456 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095462 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095466 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095470 2573 flags.go:64] FLAG: --max-pods="110" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095473 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095476 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095479 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095482 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095485 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095488 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095491 2573 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095500 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095503 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095507 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095510 2573 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095513 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095518 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095521 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095524 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095527 2573 flags.go:64] FLAG: --port="10250" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095533 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:26:55.301785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095536 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e281e5012c9b9327" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095539 2573 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095542 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095545 
2573 flags.go:64] FLAG: --register-node="true" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095549 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095552 2573 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095556 2573 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095559 2573 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095562 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095565 2573 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095569 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095572 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095575 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095578 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095581 2573 flags.go:64] FLAG: --runonce="false" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095584 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095587 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095590 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 
21:26:55.095593 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095596 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095599 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095602 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095605 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095608 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095611 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095613 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:26:55.503174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095616 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095620 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095623 2573 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095626 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095631 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095635 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095638 2573 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095645 2573 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095648 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095651 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095654 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095657 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095660 2573 flags.go:64] FLAG: --v="2" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095664 2573 flags.go:64] FLAG: --version="false" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095668 2573 flags.go:64] FLAG: --vmodule="" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095673 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.095676 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095771 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095775 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095778 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095781 2573 feature_gate.go:328] unrecognized 
feature gate: ImageModeStatusReporting Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095792 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095795 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:26:55.504410 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095798 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095801 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095803 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095806 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095809 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095812 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095818 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095821 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095823 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095826 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 
21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095829 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095832 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095834 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095837 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095841 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095844 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095846 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095849 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095851 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095854 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:26:55.506573 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095857 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095859 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095862 2573 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095864 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095867 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095869 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095872 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095875 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095877 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095880 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095882 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095891 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095894 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095897 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095899 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:26:55.507255 ip-10-0-136-201 
kubenswrapper[2573]: W0424 21:26:55.095902 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095904 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095907 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095911 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095913 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:26:55.507255 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095934 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095962 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095974 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095979 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095983 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095986 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095989 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095992 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095995 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.095998 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096001 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096004 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096006 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096009 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096012 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096015 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096018 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096025 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096029 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096032 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:55.507863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096035 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096038 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096040 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096043 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096056 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096059 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096061 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096064 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096067 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096070 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096072 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096075 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096077 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096080 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096084 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096089 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096092 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096095 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096097 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:55.508737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.096101 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.096722 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.103141 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.103156 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103202 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103206 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103210 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103216 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103219 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103222 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103225 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103229 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103233 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103237 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103240 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:55.509310 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103243 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103246 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103250 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103252 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103255 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103258 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103261 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103264 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103266 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103269 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103272 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103275 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103277 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103280 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103282 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103285 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103289 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103292 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103294 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103297 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:55.509757 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103300 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103303 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103306 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103308 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103312 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103315 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103318 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103320 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103323 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103325 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103328 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103331 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103334 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103337 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103339 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103342 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103345 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103348 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103350 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103353 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:55.510406 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103355 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103373 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103377 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103380 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103383 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103385 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103388 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103391 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103395 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103397 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103400 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103402 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103406 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103411 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103414 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103416 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103420 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103423 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103426 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103428 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:55.510983 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103431 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103433 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103436 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103438 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103441 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103443 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103446 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103449 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103452 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103454 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103457 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103459 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103462 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103464 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103467 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.103472 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:55.511724 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103566 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103570 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103573 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103576 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103579 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103583 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103586 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103589 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103592 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103594 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103598 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103600 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103606 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103609 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103612 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103614 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103617 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103619 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103622 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103624 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:55.736943 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103627 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103629 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103632 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103635 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103637 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103639 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103642 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103645 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103647 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103650 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103653 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103655 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103658 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103660 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103663 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103666 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103668 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103672 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103674 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103677 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:55.739717 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103680 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103683 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103685 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103689 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103693 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103695 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103698 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103701 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103704 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103706 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103709 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103711 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103714 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103717 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103719 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103722 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103725 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103728 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103730 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103733 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:55.740578 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103735 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103738 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103741 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103743 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103746 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103749 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103751 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103753 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103756 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103760 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103762 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103765 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103767 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103770 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103772 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103775 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103780 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103783 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103787 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:55.974495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103789 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103792 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103795 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103797 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103801 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103805 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:55.103807 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.103812 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.103953 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.108706 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.109769 2573 server.go:1019] "Starting client certificate rotation"
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.109889 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.109936 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:55.975756 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.137727 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:55.976149 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.141539 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:55.976149 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.158775 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:26:55.976149 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.164961 2573 log.go:25] "Validated CRI v1 image API"
Apr 24 21:26:55.976149 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.166368 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:26:55.976149 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.173133 2573 fs.go:135] Filesystem UUIDs: map[1b01a4f5-b87d-4e93-91db-2613c047f55d:/dev/nvme0n1p3 225db677-dece-479b-a6f8-87dcc317a6e7:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 24 21:26:55.976149 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.173152 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:26:55.976149 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.174346 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.179953 2573 manager.go:217] Machine: {Timestamp:2026-04-24 21:26:55.178028085 +0000 UTC m=+0.457875855 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098407 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2587b9505c78dd530b7e7df629c6e4 SystemUUID:ec2587b9-505c-78dd-530b-7e7df629c6e4 BootID:a963a3cc-345d-4870-ad86-d98bba9d94c3 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7a:af:f2:c6:6f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7a:af:f2:c6:6f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:02:be:bd:b9:de:26 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.180708 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.180794 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.182029 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.182054 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-201.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.182201 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.182209 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.182222
2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.183142 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.185748 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.186033 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.188647 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.188660 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.188673 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.188682 2573 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.188694 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.189889 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:26:55.976500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.189909 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.193282 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:26:55.977080 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:26:55.195483 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.197247 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.197269 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.197279 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.197290 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.197297 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.197305 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.197313 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.197320 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.197328 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.197337 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.197347 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 
21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.197358 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.198318 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.198325 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.204013 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.204046 2573 server.go:1295] "Started kubelet" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.204144 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.204186 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.204249 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.210329 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.210568 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-201.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.210933 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-201.ec2.internal\" is forbidden: User 
\"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.210983 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.212732 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.215549 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-201.ec2.internal.18a9682022a79701 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-201.ec2.internal,UID:ip-10-0-136-201.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-201.ec2.internal,},FirstTimestamp:2026-04-24 21:26:55.204022017 +0000 UTC m=+0.483869788,LastTimestamp:2026-04-24 21:26:55.204022017 +0000 UTC m=+0.483869788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-201.ec2.internal,}" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.217864 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.218474 2573 
fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.219088 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219208 2573 factory.go:153] Registering CRI-O factory Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219230 2573 factory.go:223] Registration of the crio container factory successfully Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219272 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219273 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219300 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219286 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219348 2573 factory.go:55] Registering systemd factory Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219357 2573 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219393 2573 factory.go:103] Registering Raw factory Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219403 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:26:55.977080 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219411 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:26:55.977080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219412 2573 manager.go:1196] Started watching for new ooms in manager Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.219707 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.219764 2573 manager.go:319] Starting recovery of all containers Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.220854 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-201.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.232365 2573 manager.go:324] Recovery completed Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.235139 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.236019 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xpzss" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.237066 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:55.978197 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:26:55.240266 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.240303 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.240318 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.240822 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.240832 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.240850 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.243482 2573 policy_none.go:49] "None policy: Start" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.243495 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.243505 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.250503 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-201.ec2.internal.18a9682024d0d781 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-201.ec2.internal,UID:ip-10-0-136-201.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-201.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-201.ec2.internal,},FirstTimestamp:2026-04-24 21:26:55.240279937 +0000 UTC m=+0.520127716,LastTimestamp:2026-04-24 21:26:55.240279937 +0000 UTC m=+0.520127716,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-201.ec2.internal,}" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.255293 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xpzss" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.294620 2573 manager.go:341] "Starting Device Plugin manager" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.294651 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.294661 2573 server.go:85] "Starting device plugin registration server" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.294883 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.294894 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.295069 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.295139 2573 plugin_manager.go:116] "The 
desired_state_of_world populator (plugin watcher) starts" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.295147 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.295622 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.295662 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.361766 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.362984 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.363012 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.363029 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.363036 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.363068 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.366962 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.395358 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.396580 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.396607 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:55.978197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.396617 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.396640 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.408092 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.408111 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-201.ec2.internal\": node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 
21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.451932 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.463955 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal"] Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.464039 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.466030 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.466054 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.466064 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.467439 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.467564 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.467594 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.468292 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.468320 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.468341 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.468395 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.468426 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.468439 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.469684 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.469705 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.471015 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.471041 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.471050 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.498460 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-201.ec2.internal\" not found" node="ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.510237 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-201.ec2.internal\" not found" node="ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.552573 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.622082 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bcf950b6b9eef658d8ab20236281bc71-config\") pod 
\"kube-apiserver-proxy-ip-10-0-136-201.ec2.internal\" (UID: \"bcf950b6b9eef658d8ab20236281bc71\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.622118 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0eeb12378a06f0692d8b8888ada8c1bc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal\" (UID: \"0eeb12378a06f0692d8b8888ada8c1bc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.622149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0eeb12378a06f0692d8b8888ada8c1bc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal\" (UID: \"0eeb12378a06f0692d8b8888ada8c1bc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.652680 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:26:55.979227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.723246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bcf950b6b9eef658d8ab20236281bc71-config\") pod \"kube-apiserver-proxy-ip-10-0-136-201.ec2.internal\" (UID: \"bcf950b6b9eef658d8ab20236281bc71\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.980210 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.723274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/0eeb12378a06f0692d8b8888ada8c1bc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal\" (UID: \"0eeb12378a06f0692d8b8888ada8c1bc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.980210 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.723291 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0eeb12378a06f0692d8b8888ada8c1bc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal\" (UID: \"0eeb12378a06f0692d8b8888ada8c1bc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.980210 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.723339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0eeb12378a06f0692d8b8888ada8c1bc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal\" (UID: \"0eeb12378a06f0692d8b8888ada8c1bc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.980210 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.723349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bcf950b6b9eef658d8ab20236281bc71-config\") pod \"kube-apiserver-proxy-ip-10-0-136-201.ec2.internal\" (UID: \"bcf950b6b9eef658d8ab20236281bc71\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.980210 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.723361 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0eeb12378a06f0692d8b8888ada8c1bc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal\" (UID: \"0eeb12378a06f0692d8b8888ada8c1bc\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.980210 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.753351 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:26:55.980210 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.801539 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.980210 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:55.813320 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:26:55.980210 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.853640 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:26:55.980210 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:55.954154 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:26:56.054626 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.054595 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:26:56.068153 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.068136 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:56.109729 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.109704 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:26:56.109848 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.109829 2573 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:26:56.109909 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.109862 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:26:56.118854 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.118837 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal"
Apr 24 21:26:56.143834 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.143808 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:26:56.145480 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.145463 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal"
Apr 24 21:26:56.159356 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.159338 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:26:56.189508 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.189483 2573 apiserver.go:52] "Watching apiserver"
Apr 24 21:26:56.195855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.195837 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:26:56.197694 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.197309 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4","openshift-multus/multus-9zk7d","openshift-multus/network-metrics-daemon-jrhlr","openshift-network-diagnostics/network-check-target-lkml2","openshift-network-operator/iptables-alerter-w7rcc","kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal","openshift-cluster-node-tuning-operator/tuned-djtbw","openshift-dns/node-resolver-zvvmj","openshift-image-registry/node-ca-q9bbt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal","openshift-multus/multus-additional-cni-plugins-cjtjf","openshift-ovn-kubernetes/ovnkube-node-vz6wr","kube-system/konnectivity-agent-589nt"]
Apr 24 21:26:56.199815 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.199796 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-djtbw"
Apr 24 21:26:56.201098 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.201080 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.203032 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.203016 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2qtt6\""
Apr 24 21:26:56.203138 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.203102 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:56.203198 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.203136 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:26:56.203249 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.203211 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14"
Apr 24 21:26:56.203750 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.203730 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:26:56.203826 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.203792 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:26:56.203886 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.203798 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7gtpj\""
Apr 24 21:26:56.203952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.203891 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:26:56.203952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.203895 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:56.203952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.203937 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:26:56.204344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.204307 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:26:56.204412 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.204389 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494"
Apr 24 21:26:56.204472 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.204407 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-w7rcc"
Apr 24 21:26:56.205512 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.205495 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4"
Apr 24 21:26:56.206507 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.206489 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:56.206653 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.206634 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-752gp\""
Apr 24 21:26:56.206726 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.206662 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:56.206726 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.206693 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:26:56.206809 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.206758 2573 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zvvmj"
Apr 24 21:26:56.207600 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.207586 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 21:26:56.207691 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.207630 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2jn2n\""
Apr 24 21:26:56.207691 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.207636 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 21:26:56.207691 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.207684 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 21:26:56.207942 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.207928 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q9bbt"
Apr 24 21:26:56.209113 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.209097 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cjtjf"
Apr 24 21:26:56.209309 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.209188 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:26:56.209468 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.209443 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:26:56.209534 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.209475 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-ntglr\""
Apr 24 21:26:56.209940 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.209905 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 21:26:56.210033 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.209997 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 21:26:56.210177 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.210165 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 21:26:56.210484 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.210467 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr"
Apr 24 21:26:56.210791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.210739 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mmf58\""
Apr 24 21:26:56.211535 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.211506 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jztmk\""
Apr 24 21:26:56.211535 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.211511 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 21:26:56.211949 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.211933 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 21:26:56.212001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.211989 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-589nt"
Apr 24 21:26:56.214615 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.214595 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:26:56.214615 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.214612 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:26:56.214859 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.214834 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:26:56.215001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.214985 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:26:56.215074 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.215055 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:26:56.215137 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.215069 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:26:56.215500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.215479 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-gv74c\""
Apr 24 21:26:56.215615 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.215520 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 21:26:56.215615 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.215546 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"default-dockercfg-9cvct\""
Apr 24 21:26:56.215728 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.215700 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 21:26:56.217975 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.217961 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:26:56.220277 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.220262 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.225563 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9msfx\" (UniqueName: \"kubernetes.io/projected/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-kube-api-access-9msfx\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.225629 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-registration-dir\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.225667 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-run-openvswitch\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.225718 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-run-ovn-kubernetes\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.225752 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-ovnkube-script-lib\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.225786 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cc88058c-aa01-4dec-b649-e759dc3c5b91-konnectivity-ca\") pod \"konnectivity-agent-589nt\" (UID: \"cc88058c-aa01-4dec-b649-e759dc3c5b91\") " pod="kube-system/konnectivity-agent-589nt"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.225812 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-systemd\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.225841 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-multus-cni-dir\") pod \"multus-9zk7d\" (UID: 
\"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.225874 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-multus-conf-dir\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.225905 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.225972 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-socket-dir\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.225999 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-slash\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226025 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-log-socket\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226056 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-cni-bin\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226091 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59sgm\" (UniqueName: \"kubernetes.io/projected/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-kube-api-access-59sgm\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw"
Apr 24 21:26:56.226413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226123 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-cnibin\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226158 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2bs9\" (UniqueName: \"kubernetes.io/projected/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-kube-api-access-n2bs9\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226204 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-etc-selinux\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226279 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-etc-openvswitch\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc418569-4514-4b49-bd55-839ecdb097d5-tmp-dir\") pod \"node-resolver-zvvmj\" (UID: \"fc418569-4514-4b49-bd55-839ecdb097d5\") " pod="openshift-dns/node-resolver-zvvmj"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-cni-binary-copy\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226399 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-os-release\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" 
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226429 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-hostroot\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvnh4\" (UniqueName: \"kubernetes.io/projected/932901de-5edd-4054-b5df-89077b36dd14-kube-api-access-wvnh4\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgrcw\" (UniqueName: \"kubernetes.io/projected/313a846d-a4f1-459e-b416-a695b875548d-kube-api-access-zgrcw\") pod \"node-ca-q9bbt\" (UID: \"313a846d-a4f1-459e-b416-a695b875548d\") " pod="openshift-image-registry/node-ca-q9bbt"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qfrw\" (UniqueName: \"kubernetes.io/projected/ea6dd734-e6e3-4992-83c2-882f191366c0-kube-api-access-7qfrw\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-node-log\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-sys\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226663 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8gwr\" (UniqueName: \"kubernetes.io/projected/2d353477-699a-4613-82c7-27ddf9ec3b73-kube-api-access-s8gwr\") pod \"iptables-alerter-w7rcc\" (UID: \"2d353477-699a-4613-82c7-27ddf9ec3b73\") " pod="openshift-network-operator/iptables-alerter-w7rcc"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226693 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/313a846d-a4f1-459e-b416-a695b875548d-host\") pod \"node-ca-q9bbt\" (UID: \"313a846d-a4f1-459e-b416-a695b875548d\") " pod="openshift-image-registry/node-ca-q9bbt"
Apr 24 21:26:56.227287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226723 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-multus-socket-dir-parent\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226752 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-run-k8s-cni-cncf-io\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226775 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-kubernetes\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226808 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-run\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226866 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " 
pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226899 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-cni-binary-copy\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226940 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-var-lib-cni-multus\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.226970 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-device-dir\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227000 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cc88058c-aa01-4dec-b649-e759dc3c5b91-agent-certs\") pod \"konnectivity-agent-589nt\" (UID: \"cc88058c-aa01-4dec-b649-e759dc3c5b91\") " pod="kube-system/konnectivity-agent-589nt"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227029 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-sysctl-d\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227080 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-host\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-var-lib-cni-bin\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227216 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-multus-daemon-config\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227262 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-tuned\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227352 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/313a846d-a4f1-459e-b416-a695b875548d-serviceca\") pod \"node-ca-q9bbt\" (UID: \"313a846d-a4f1-459e-b416-a695b875548d\") " pod="openshift-image-registry/node-ca-q9bbt"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-var-lib-kubelet\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d"
Apr 24 21:26:56.228111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227435 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-var-lib-openvswitch\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr"
Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zxck\" (UniqueName: \"kubernetes.io/projected/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-kube-api-access-5zxck\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227546 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-tmp\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227598 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpdtv\" (UniqueName: \"kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv\") pod \"network-check-target-lkml2\" (UID: \"8b12b6f6-d28f-4cda-9380-4efcca507494\") " pod="openshift-network-diagnostics/network-check-target-lkml2" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227639 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d6f7\" (UniqueName: \"kubernetes.io/projected/fc418569-4514-4b49-bd55-839ecdb097d5-kube-api-access-4d6f7\") pod \"node-resolver-zvvmj\" (UID: \"fc418569-4514-4b49-bd55-839ecdb097d5\") " pod="openshift-dns/node-resolver-zvvmj" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.227700 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-cni-netd\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228290 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-cnibin\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228327 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-etc-kubernetes\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228360 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-sys-fs\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228438 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-kubelet\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228487 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-run-ovn\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228538 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-sysconfig\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fc418569-4514-4b49-bd55-839ecdb097d5-hosts-file\") pod \"node-resolver-zvvmj\" (UID: \"fc418569-4514-4b49-bd55-839ecdb097d5\") " pod="openshift-dns/node-resolver-zvvmj" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228640 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-system-cni-dir\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228659 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228676 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-modprobe-d\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228707 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2d353477-699a-4613-82c7-27ddf9ec3b73-iptables-alerter-script\") pod \"iptables-alerter-w7rcc\" (UID: \"2d353477-699a-4613-82c7-27ddf9ec3b73\") " pod="openshift-network-operator/iptables-alerter-w7rcc" Apr 24 21:26:56.228855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228768 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-run-netns\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228799 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-run-multus-certs\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.228831 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-run-systemd\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229252 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-lib-modules\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229287 
2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-system-cni-dir\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229304 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-systemd-units\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229318 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-ovnkube-config\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229343 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-env-overrides\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229361 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-ovn-node-metrics-cert\") pod \"ovnkube-node-vz6wr\" (UID: 
\"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229394 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d353477-699a-4613-82c7-27ddf9ec3b73-host-slash\") pod \"iptables-alerter-w7rcc\" (UID: \"2d353477-699a-4613-82c7-27ddf9ec3b73\") " pod="openshift-network-operator/iptables-alerter-w7rcc" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229413 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-os-release\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229436 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229461 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-run-netns\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229494 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-sysctl-conf\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229513 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.229576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.229534 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-var-lib-kubelet\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.257511 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.257476 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:21:55 +0000 UTC" deadline="2027-11-04 16:49:55.031234637 +0000 UTC" Apr 24 21:26:56.257511 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.257508 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13411h22m58.773729034s" Apr 24 21:26:56.257961 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.257945 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-96vfz" Apr 24 21:26:56.269433 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:26:56.269414 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-96vfz" Apr 24 21:26:56.300296 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:56.300261 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcf950b6b9eef658d8ab20236281bc71.slice/crio-fe51990d420f9590e93f3a92e35bbbe9e61e89dd8b96e1aaf72776fbb7cb9427 WatchSource:0}: Error finding container fe51990d420f9590e93f3a92e35bbbe9e61e89dd8b96e1aaf72776fbb7cb9427: Status 404 returned error can't find the container with id fe51990d420f9590e93f3a92e35bbbe9e61e89dd8b96e1aaf72776fbb7cb9427 Apr 24 21:26:56.300948 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:56.300581 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eeb12378a06f0692d8b8888ada8c1bc.slice/crio-bdf0659e440e09d96eaa4bc50e6d2b3a2ecbd2b137f9055e975c01a325f9f359 WatchSource:0}: Error finding container bdf0659e440e09d96eaa4bc50e6d2b3a2ecbd2b137f9055e975c01a325f9f359: Status 404 returned error can't find the container with id bdf0659e440e09d96eaa4bc50e6d2b3a2ecbd2b137f9055e975c01a325f9f359 Apr 24 21:26:56.306127 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.306103 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:26:56.327112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.327075 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:56.330189 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-cni-binary-copy\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " 
pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.330286 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-var-lib-cni-multus\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.330286 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330226 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-device-dir\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.330286 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cc88058c-aa01-4dec-b649-e759dc3c5b91-agent-certs\") pod \"konnectivity-agent-589nt\" (UID: \"cc88058c-aa01-4dec-b649-e759dc3c5b91\") " pod="kube-system/konnectivity-agent-589nt" Apr 24 21:26:56.330415 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-device-dir\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.330415 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330306 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-sysctl-d\") pod \"tuned-djtbw\" (UID: 
\"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.330415 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-host\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.330415 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.330415 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330380 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-var-lib-cni-bin\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.330415 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330407 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-multus-daemon-config\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330425 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-host\") pod \"tuned-djtbw\" (UID: 
\"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-tuned\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330457 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-var-lib-cni-bin\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/313a846d-a4f1-459e-b416-a695b875548d-serviceca\") pod \"node-ca-q9bbt\" (UID: \"313a846d-a4f1-459e-b416-a695b875548d\") " pod="openshift-image-registry/node-ca-q9bbt" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330504 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-var-lib-kubelet\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330515 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-sysctl-d\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " 
pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-var-lib-openvswitch\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zxck\" (UniqueName: \"kubernetes.io/projected/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-kube-api-access-5zxck\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-tmp\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330604 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpdtv\" (UniqueName: \"kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv\") pod \"network-check-target-lkml2\" (UID: \"8b12b6f6-d28f-4cda-9380-4efcca507494\") " pod="openshift-network-diagnostics/network-check-target-lkml2" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-var-lib-openvswitch\") pod 
\"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d6f7\" (UniqueName: \"kubernetes.io/projected/fc418569-4514-4b49-bd55-839ecdb097d5-kube-api-access-4d6f7\") pod \"node-resolver-zvvmj\" (UID: \"fc418569-4514-4b49-bd55-839ecdb097d5\") " pod="openshift-dns/node-resolver-zvvmj" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330660 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-var-lib-kubelet\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.330733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330676 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.330845 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-var-lib-cni-multus\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331009 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/313a846d-a4f1-459e-b416-a695b875548d-serviceca\") pod \"node-ca-q9bbt\" (UID: \"313a846d-a4f1-459e-b416-a695b875548d\") " pod="openshift-image-registry/node-ca-q9bbt" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331033 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331102 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-multus-daemon-config\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-cni-netd\") pod 
\"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-cnibin\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331228 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-etc-kubernetes\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-sys-fs\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-cni-netd\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331269 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-kubelet\") 
pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331281 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-cnibin\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331300 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-cni-binary-copy\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-kubelet\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331302 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-run-ovn\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331329 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-etc-kubernetes\") pod \"multus-9zk7d\" (UID: 
\"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331330 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-run-ovn\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331341 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-sysconfig\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.331344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331356 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-sys-fs\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331378 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fc418569-4514-4b49-bd55-839ecdb097d5-hosts-file\") pod \"node-resolver-zvvmj\" (UID: \"fc418569-4514-4b49-bd55-839ecdb097d5\") " pod="openshift-dns/node-resolver-zvvmj" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331395 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-system-cni-dir\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") 
" pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331378 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-sysconfig\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-modprobe-d\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331446 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2d353477-699a-4613-82c7-27ddf9ec3b73-iptables-alerter-script\") pod \"iptables-alerter-w7rcc\" (UID: \"2d353477-699a-4613-82c7-27ddf9ec3b73\") " pod="openshift-network-operator/iptables-alerter-w7rcc" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331463 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fc418569-4514-4b49-bd55-839ecdb097d5-hosts-file\") pod \"node-resolver-zvvmj\" (UID: \"fc418569-4514-4b49-bd55-839ecdb097d5\") " pod="openshift-dns/node-resolver-zvvmj" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331470 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-run-netns\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " 
pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331511 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-system-cni-dir\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331514 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-run-multus-certs\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331543 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-modprobe-d\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331547 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-run-systemd\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331561 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-run-netns\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.332164 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-run-systemd\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-lib-modules\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331600 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-run-multus-certs\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331613 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-system-cni-dir\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331649 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-systemd-units\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 
21:26:56.332164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331673 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-ovnkube-config\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-lib-modules\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-env-overrides\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331710 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-system-cni-dir\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-ovn-node-metrics-cert\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331754 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-systemd-units\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d353477-699a-4613-82c7-27ddf9ec3b73-host-slash\") pod \"iptables-alerter-w7rcc\" (UID: \"2d353477-699a-4613-82c7-27ddf9ec3b73\") " pod="openshift-network-operator/iptables-alerter-w7rcc" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-os-release\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-run-netns\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331872 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-sysctl-conf\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331895 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331941 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-var-lib-kubelet\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331968 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9msfx\" (UniqueName: \"kubernetes.io/projected/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-kube-api-access-9msfx\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331984 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2d353477-699a-4613-82c7-27ddf9ec3b73-iptables-alerter-script\") pod \"iptables-alerter-w7rcc\" (UID: \"2d353477-699a-4613-82c7-27ddf9ec3b73\") " pod="openshift-network-operator/iptables-alerter-w7rcc" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.331996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-registration-dir\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332020 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-run-openvswitch\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.332971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332080 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-run-ovn-kubernetes\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332104 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-ovnkube-script-lib\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cc88058c-aa01-4dec-b649-e759dc3c5b91-konnectivity-ca\") pod \"konnectivity-agent-589nt\" (UID: \"cc88058c-aa01-4dec-b649-e759dc3c5b91\") " pod="kube-system/konnectivity-agent-589nt" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332155 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-systemd\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332161 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-env-overrides\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-multus-cni-dir\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-multus-conf-dir\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " 
pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332238 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332236 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d353477-699a-4613-82c7-27ddf9ec3b73-host-slash\") pod \"iptables-alerter-w7rcc\" (UID: \"2d353477-699a-4613-82c7-27ddf9ec3b73\") " pod="openshift-network-operator/iptables-alerter-w7rcc" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-socket-dir\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-slash\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332326 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-os-release\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: 
\"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332334 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-log-socket\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332364 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-cni-bin\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332392 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59sgm\" (UniqueName: \"kubernetes.io/projected/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-kube-api-access-59sgm\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-cnibin\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332434 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-ovnkube-config\") pod \"ovnkube-node-vz6wr\" (UID: 
\"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.333721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2bs9\" (UniqueName: \"kubernetes.io/projected/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-kube-api-access-n2bs9\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-etc-selinux\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332504 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-etc-openvswitch\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332527 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc418569-4514-4b49-bd55-839ecdb097d5-tmp-dir\") pod \"node-resolver-zvvmj\" (UID: \"fc418569-4514-4b49-bd55-839ecdb097d5\") " pod="openshift-dns/node-resolver-zvvmj" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-os-release\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-hostroot\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332633 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvnh4\" (UniqueName: \"kubernetes.io/projected/932901de-5edd-4054-b5df-89077b36dd14-kube-api-access-wvnh4\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332671 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgrcw\" (UniqueName: \"kubernetes.io/projected/313a846d-a4f1-459e-b416-a695b875548d-kube-api-access-zgrcw\") pod \"node-ca-q9bbt\" (UID: \"313a846d-a4f1-459e-b416-a695b875548d\") " pod="openshift-image-registry/node-ca-q9bbt" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-os-release\") pod 
\"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332774 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-etc-openvswitch\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332834 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-etc-selinux\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332869 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-hostroot\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332948 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-var-lib-kubelet\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.332952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc418569-4514-4b49-bd55-839ecdb097d5-tmp-dir\") pod \"node-resolver-zvvmj\" (UID: 
\"fc418569-4514-4b49-bd55-839ecdb097d5\") " pod="openshift-dns/node-resolver-zvvmj" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333001 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-registration-dir\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333006 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qfrw\" (UniqueName: \"kubernetes.io/projected/ea6dd734-e6e3-4992-83c2-882f191366c0-kube-api-access-7qfrw\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333034 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-node-log\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.334427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-run-openvswitch\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333064 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-slash\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-sys\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-sys\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333178 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8gwr\" (UniqueName: \"kubernetes.io/projected/2d353477-699a-4613-82c7-27ddf9ec3b73-kube-api-access-s8gwr\") pod \"iptables-alerter-w7rcc\" (UID: \"2d353477-699a-4613-82c7-27ddf9ec3b73\") " pod="openshift-network-operator/iptables-alerter-w7rcc" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333207 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/313a846d-a4f1-459e-b416-a695b875548d-host\") pod \"node-ca-q9bbt\" (UID: \"313a846d-a4f1-459e-b416-a695b875548d\") " pod="openshift-image-registry/node-ca-q9bbt" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333243 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-multus-socket-dir-parent\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333269 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-run-k8s-cni-cncf-io\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-kubernetes\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333323 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333370 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-multus-conf-dir\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333378 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-run\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333419 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-run-netns\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333433 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-sysctl-conf\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333540 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/ea6dd734-e6e3-4992-83c2-882f191366c0-socket-dir\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.334956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-run-ovn-kubernetes\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333617 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333327 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-run\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333641 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-cnibin\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333660 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333679 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-cni-bin\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.333761 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.334216 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-ovn-node-metrics-cert\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.334223 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-tuned\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333815 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/313a846d-a4f1-459e-b416-a695b875548d-host\") pod \"node-ca-q9bbt\" (UID: \"313a846d-a4f1-459e-b416-a695b875548d\") " 
pod="openshift-image-registry/node-ca-q9bbt" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333894 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-host-run-k8s-cni-cncf-io\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.334260 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs podName:932901de-5edd-4054-b5df-89077b36dd14 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:56.834222726 +0000 UTC m=+2.114070492 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs") pod "network-metrics-daemon-jrhlr" (UID: "932901de-5edd-4054-b5df-89077b36dd14") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-log-socket\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.334004 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.335419 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.334062 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-systemd\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.334087 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-multus-cni-dir\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.334109 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-cni-binary-copy\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.335419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-multus-socket-dir-parent\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.335862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.333825 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-node-log\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.335862 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:26:56.333951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-etc-kubernetes\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.335862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.334467 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cc88058c-aa01-4dec-b649-e759dc3c5b91-konnectivity-ca\") pod \"konnectivity-agent-589nt\" (UID: \"cc88058c-aa01-4dec-b649-e759dc3c5b91\") " pod="kube-system/konnectivity-agent-589nt" Apr 24 21:26:56.335862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.334470 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cc88058c-aa01-4dec-b649-e759dc3c5b91-agent-certs\") pod \"konnectivity-agent-589nt\" (UID: \"cc88058c-aa01-4dec-b649-e759dc3c5b91\") " pod="kube-system/konnectivity-agent-589nt" Apr 24 21:26:56.335862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.334466 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-tmp\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.335862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.334488 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-ovnkube-script-lib\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.340751 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.340735 2573 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:56.340814 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.340754 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:56.340814 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.340767 2573 projected.go:194] Error preparing data for projected volume kube-api-access-zpdtv for pod openshift-network-diagnostics/network-check-target-lkml2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:56.340900 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.340864 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv podName:8b12b6f6-d28f-4cda-9380-4efcca507494 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:56.840851363 +0000 UTC m=+2.120699143 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zpdtv" (UniqueName: "kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv") pod "network-check-target-lkml2" (UID: "8b12b6f6-d28f-4cda-9380-4efcca507494") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:56.342238 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.342217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgrcw\" (UniqueName: \"kubernetes.io/projected/313a846d-a4f1-459e-b416-a695b875548d-kube-api-access-zgrcw\") pod \"node-ca-q9bbt\" (UID: \"313a846d-a4f1-459e-b416-a695b875548d\") " pod="openshift-image-registry/node-ca-q9bbt" Apr 24 21:26:56.342686 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.342663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zxck\" (UniqueName: \"kubernetes.io/projected/58e1e5f8-ee7a-4e0f-87fc-b18349e28725-kube-api-access-5zxck\") pod \"ovnkube-node-vz6wr\" (UID: \"58e1e5f8-ee7a-4e0f-87fc-b18349e28725\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.359158 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.359128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qfrw\" (UniqueName: \"kubernetes.io/projected/ea6dd734-e6e3-4992-83c2-882f191366c0-kube-api-access-7qfrw\") pod \"aws-ebs-csi-driver-node-5cxx4\" (UID: \"ea6dd734-e6e3-4992-83c2-882f191366c0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.361971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.359811 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59sgm\" (UniqueName: \"kubernetes.io/projected/57fbfaf0-c0df-4a08-9387-11b04cf5ba29-kube-api-access-59sgm\") pod \"tuned-djtbw\" (UID: \"57fbfaf0-c0df-4a08-9387-11b04cf5ba29\") " 
pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.361971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.360397 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvnh4\" (UniqueName: \"kubernetes.io/projected/932901de-5edd-4054-b5df-89077b36dd14-kube-api-access-wvnh4\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr" Apr 24 21:26:56.362839 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.362819 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2bs9\" (UniqueName: \"kubernetes.io/projected/fe0039f5-a4d2-42e2-86f7-be764dcf37fd-kube-api-access-n2bs9\") pod \"multus-9zk7d\" (UID: \"fe0039f5-a4d2-42e2-86f7-be764dcf37fd\") " pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.363060 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.363035 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9msfx\" (UniqueName: \"kubernetes.io/projected/c8bc6b6e-12b3-4cd5-83f8-09faef4eb787-kube-api-access-9msfx\") pod \"multus-additional-cni-plugins-cjtjf\" (UID: \"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787\") " pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.363123 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.363088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d6f7\" (UniqueName: \"kubernetes.io/projected/fc418569-4514-4b49-bd55-839ecdb097d5-kube-api-access-4d6f7\") pod \"node-resolver-zvvmj\" (UID: \"fc418569-4514-4b49-bd55-839ecdb097d5\") " pod="openshift-dns/node-resolver-zvvmj" Apr 24 21:26:56.363261 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.363244 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8gwr\" (UniqueName: \"kubernetes.io/projected/2d353477-699a-4613-82c7-27ddf9ec3b73-kube-api-access-s8gwr\") pod 
\"iptables-alerter-w7rcc\" (UID: \"2d353477-699a-4613-82c7-27ddf9ec3b73\") " pod="openshift-network-operator/iptables-alerter-w7rcc" Apr 24 21:26:56.365650 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.365613 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" event={"ID":"bcf950b6b9eef658d8ab20236281bc71","Type":"ContainerStarted","Data":"fe51990d420f9590e93f3a92e35bbbe9e61e89dd8b96e1aaf72776fbb7cb9427"} Apr 24 21:26:56.366534 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.366506 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" event={"ID":"0eeb12378a06f0692d8b8888ada8c1bc","Type":"ContainerStarted","Data":"bdf0659e440e09d96eaa4bc50e6d2b3a2ecbd2b137f9055e975c01a325f9f359"} Apr 24 21:26:56.516653 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.516578 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:56.520749 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.520728 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-djtbw" Apr 24 21:26:56.527265 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:56.527242 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57fbfaf0_c0df_4a08_9387_11b04cf5ba29.slice/crio-c9aecd9e14f367a5dbdd462f49b4e82172e5f12256bd6faec3e9e4a85921390a WatchSource:0}: Error finding container c9aecd9e14f367a5dbdd462f49b4e82172e5f12256bd6faec3e9e4a85921390a: Status 404 returned error can't find the container with id c9aecd9e14f367a5dbdd462f49b4e82172e5f12256bd6faec3e9e4a85921390a Apr 24 21:26:56.528973 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.528959 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9zk7d" Apr 24 21:26:56.536225 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:56.536206 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe0039f5_a4d2_42e2_86f7_be764dcf37fd.slice/crio-a85cbacdf5f7ab1a76de164ebac59b21833b4be05d2dd405dc6f3b06841db7b7 WatchSource:0}: Error finding container a85cbacdf5f7ab1a76de164ebac59b21833b4be05d2dd405dc6f3b06841db7b7: Status 404 returned error can't find the container with id a85cbacdf5f7ab1a76de164ebac59b21833b4be05d2dd405dc6f3b06841db7b7 Apr 24 21:26:56.542015 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.541988 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-w7rcc" Apr 24 21:26:56.549122 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:56.549096 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d353477_699a_4613_82c7_27ddf9ec3b73.slice/crio-030a986fce5df7beb591335dc296b7bcf93e3ee8b83d24146361bc6d5ca4325c WatchSource:0}: Error finding container 030a986fce5df7beb591335dc296b7bcf93e3ee8b83d24146361bc6d5ca4325c: Status 404 returned error can't find the container with id 030a986fce5df7beb591335dc296b7bcf93e3ee8b83d24146361bc6d5ca4325c Apr 24 21:26:56.556983 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.556966 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" Apr 24 21:26:56.563737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:56.563713 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6dd734_e6e3_4992_83c2_882f191366c0.slice/crio-25d52ac0fa8623722b798635b2b659eeeda013a73832f4b9e427ccfdf594871e WatchSource:0}: Error finding container 25d52ac0fa8623722b798635b2b659eeeda013a73832f4b9e427ccfdf594871e: Status 404 returned error can't find the container with id 25d52ac0fa8623722b798635b2b659eeeda013a73832f4b9e427ccfdf594871e Apr 24 21:26:56.567614 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.567596 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zvvmj" Apr 24 21:26:56.573089 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:56.573065 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc418569_4514_4b49_bd55_839ecdb097d5.slice/crio-edeaed156e11f41128124e83e4d7a3021a764a1aaf4a3f3c549e70b5e9ddebfe WatchSource:0}: Error finding container edeaed156e11f41128124e83e4d7a3021a764a1aaf4a3f3c549e70b5e9ddebfe: Status 404 returned error can't find the container with id edeaed156e11f41128124e83e4d7a3021a764a1aaf4a3f3c549e70b5e9ddebfe Apr 24 21:26:56.597297 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.597271 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-q9bbt" Apr 24 21:26:56.603225 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:56.603203 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod313a846d_a4f1_459e_b416_a695b875548d.slice/crio-ad73e63062dae29d6d97529fa4aeb6cb60588ffe272a3ffeba935083361f2305 WatchSource:0}: Error finding container ad73e63062dae29d6d97529fa4aeb6cb60588ffe272a3ffeba935083361f2305: Status 404 returned error can't find the container with id ad73e63062dae29d6d97529fa4aeb6cb60588ffe272a3ffeba935083361f2305 Apr 24 21:26:56.604898 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.604876 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cjtjf" Apr 24 21:26:56.610376 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:56.610352 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8bc6b6e_12b3_4cd5_83f8_09faef4eb787.slice/crio-43a992f93a83e24ba916a2b4555a782720df22f8abcc9e5639f4951aeeea8dab WatchSource:0}: Error finding container 43a992f93a83e24ba916a2b4555a782720df22f8abcc9e5639f4951aeeea8dab: Status 404 returned error can't find the container with id 43a992f93a83e24ba916a2b4555a782720df22f8abcc9e5639f4951aeeea8dab Apr 24 21:26:56.611880 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.611860 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:26:56.616267 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.616248 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-589nt" Apr 24 21:26:56.618081 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:56.618061 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58e1e5f8_ee7a_4e0f_87fc_b18349e28725.slice/crio-b0a50dec9c5758d99b1d2222a761bb30bf5026eb6abf412afa9432868ffb3830 WatchSource:0}: Error finding container b0a50dec9c5758d99b1d2222a761bb30bf5026eb6abf412afa9432868ffb3830: Status 404 returned error can't find the container with id b0a50dec9c5758d99b1d2222a761bb30bf5026eb6abf412afa9432868ffb3830 Apr 24 21:26:56.622465 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:26:56.622443 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc88058c_aa01_4dec_b649_e759dc3c5b91.slice/crio-40b246baf8d8449d14880fea9ff4e8a84052f174ff91336d5fed0f66d23f2722 WatchSource:0}: Error finding container 40b246baf8d8449d14880fea9ff4e8a84052f174ff91336d5fed0f66d23f2722: Status 404 returned error can't find the container with id 40b246baf8d8449d14880fea9ff4e8a84052f174ff91336d5fed0f66d23f2722 Apr 24 21:26:56.837775 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.837567 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr" Apr 24 21:26:56.837775 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.837730 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:56.837998 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.837801 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs podName:932901de-5edd-4054-b5df-89077b36dd14 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:57.83778292 +0000 UTC m=+3.117630678 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs") pod "network-metrics-daemon-jrhlr" (UID: "932901de-5edd-4054-b5df-89077b36dd14") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:56.938988 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:56.938894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpdtv\" (UniqueName: \"kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv\") pod \"network-check-target-lkml2\" (UID: \"8b12b6f6-d28f-4cda-9380-4efcca507494\") " pod="openshift-network-diagnostics/network-check-target-lkml2" Apr 24 21:26:56.939163 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.939114 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:56.939163 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.939133 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:56.939163 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.939146 2573 projected.go:194] Error preparing data for projected volume kube-api-access-zpdtv for pod openshift-network-diagnostics/network-check-target-lkml2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:56.939314 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:56.939207 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv podName:8b12b6f6-d28f-4cda-9380-4efcca507494 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:57.939188126 +0000 UTC m=+3.219035903 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zpdtv" (UniqueName: "kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv") pod "network-check-target-lkml2" (UID: "8b12b6f6-d28f-4cda-9380-4efcca507494") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:57.080289 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.080258 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:57.271043 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.270885 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:56 +0000 UTC" deadline="2027-11-21 19:26:05.665322468 +0000 UTC" Apr 24 21:26:57.271043 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.270945 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13821h59m8.39438319s" Apr 24 21:26:57.365991 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.365950 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr" Apr 24 21:26:57.366439 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:57.366086 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14"
Apr 24 21:26:57.386975 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.386927 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" event={"ID":"58e1e5f8-ee7a-4e0f-87fc-b18349e28725","Type":"ContainerStarted","Data":"b0a50dec9c5758d99b1d2222a761bb30bf5026eb6abf412afa9432868ffb3830"}
Apr 24 21:26:57.396997 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.396967 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q9bbt" event={"ID":"313a846d-a4f1-459e-b416-a695b875548d","Type":"ContainerStarted","Data":"ad73e63062dae29d6d97529fa4aeb6cb60588ffe272a3ffeba935083361f2305"}
Apr 24 21:26:57.406094 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.405957 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zvvmj" event={"ID":"fc418569-4514-4b49-bd55-839ecdb097d5","Type":"ContainerStarted","Data":"edeaed156e11f41128124e83e4d7a3021a764a1aaf4a3f3c549e70b5e9ddebfe"}
Apr 24 21:26:57.412367 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.412342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-djtbw" event={"ID":"57fbfaf0-c0df-4a08-9387-11b04cf5ba29","Type":"ContainerStarted","Data":"c9aecd9e14f367a5dbdd462f49b4e82172e5f12256bd6faec3e9e4a85921390a"}
Apr 24 21:26:57.421265 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.421237 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-589nt" event={"ID":"cc88058c-aa01-4dec-b649-e759dc3c5b91","Type":"ContainerStarted","Data":"40b246baf8d8449d14880fea9ff4e8a84052f174ff91336d5fed0f66d23f2722"}
Apr 24 21:26:57.429952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.429903 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjtjf" event={"ID":"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787","Type":"ContainerStarted","Data":"43a992f93a83e24ba916a2b4555a782720df22f8abcc9e5639f4951aeeea8dab"}
Apr 24 21:26:57.470366 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.470283 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" event={"ID":"ea6dd734-e6e3-4992-83c2-882f191366c0","Type":"ContainerStarted","Data":"25d52ac0fa8623722b798635b2b659eeeda013a73832f4b9e427ccfdf594871e"}
Apr 24 21:26:57.487765 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.487718 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-w7rcc" event={"ID":"2d353477-699a-4613-82c7-27ddf9ec3b73","Type":"ContainerStarted","Data":"030a986fce5df7beb591335dc296b7bcf93e3ee8b83d24146361bc6d5ca4325c"}
Apr 24 21:26:57.500834 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.500775 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9zk7d" event={"ID":"fe0039f5-a4d2-42e2-86f7-be764dcf37fd","Type":"ContainerStarted","Data":"a85cbacdf5f7ab1a76de164ebac59b21833b4be05d2dd405dc6f3b06841db7b7"}
Apr 24 21:26:57.844946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.844890 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:26:57.845133 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:57.845057 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:57.845191 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:57.845138 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs podName:932901de-5edd-4054-b5df-89077b36dd14 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:59.845115326 +0000 UTC m=+5.124963101 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs") pod "network-metrics-daemon-jrhlr" (UID: "932901de-5edd-4054-b5df-89077b36dd14") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:57.946378 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:57.946346 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpdtv\" (UniqueName: \"kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv\") pod \"network-check-target-lkml2\" (UID: \"8b12b6f6-d28f-4cda-9380-4efcca507494\") " pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:26:57.946563 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:57.946503 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:57.946563 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:57.946524 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:57.946563 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:57.946535 2573 projected.go:194] Error preparing data for projected volume kube-api-access-zpdtv for pod openshift-network-diagnostics/network-check-target-lkml2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:57.946712 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:57.946587 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv podName:8b12b6f6-d28f-4cda-9380-4efcca507494 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:59.946570826 +0000 UTC m=+5.226418601 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zpdtv" (UniqueName: "kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv") pod "network-check-target-lkml2" (UID: "8b12b6f6-d28f-4cda-9380-4efcca507494") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:58.256889 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:58.256797 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:58.272373 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:58.271773 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:56 +0000 UTC" deadline="2027-10-15 18:14:37.974761284 +0000 UTC"
Apr 24 21:26:58.272373 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:58.271807 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12932h47m39.702958943s"
Apr 24 21:26:58.363304 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:58.363270 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:26:58.363480 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:58.363403 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494"
Apr 24 21:26:59.373347 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:59.370455 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:26:59.373347 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:59.370630 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14"
Apr 24 21:26:59.864828 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:59.864746 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:26:59.865033 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:59.864913 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:59.865033 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:59.864999 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs podName:932901de-5edd-4054-b5df-89077b36dd14 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.864978521 +0000 UTC m=+9.144826282 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs") pod "network-metrics-daemon-jrhlr" (UID: "932901de-5edd-4054-b5df-89077b36dd14") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:59.966138 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:26:59.965548 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpdtv\" (UniqueName: \"kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv\") pod \"network-check-target-lkml2\" (UID: \"8b12b6f6-d28f-4cda-9380-4efcca507494\") " pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:26:59.966138 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:59.965700 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:59.966138 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:59.965716 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:59.966138 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:59.965729 2573 projected.go:194] Error preparing data for projected volume kube-api-access-zpdtv for pod openshift-network-diagnostics/network-check-target-lkml2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:59.966138 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:26:59.965785 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv podName:8b12b6f6-d28f-4cda-9380-4efcca507494 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.965765713 +0000 UTC m=+9.245613490 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zpdtv" (UniqueName: "kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv") pod "network-check-target-lkml2" (UID: "8b12b6f6-d28f-4cda-9380-4efcca507494") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:00.364194 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:00.363878 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:27:00.364381 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:00.364251 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494"
Apr 24 21:27:01.364140 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:01.364070 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:27:01.364627 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:01.364282 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14"
Apr 24 21:27:02.364361 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:02.364325 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:27:02.364839 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:02.364469 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494"
Apr 24 21:27:03.363842 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.363806 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:27:03.364025 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.363955 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14"
Apr 24 21:27:03.900707 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.900667 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:27:03.901148 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.900859 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:03.901148 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.900951 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs podName:932901de-5edd-4054-b5df-89077b36dd14 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:11.900931073 +0000 UTC m=+17.180778845 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs") pod "network-metrics-daemon-jrhlr" (UID: "932901de-5edd-4054-b5df-89077b36dd14") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:04.001877 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.001833 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpdtv\" (UniqueName: \"kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv\") pod \"network-check-target-lkml2\" (UID: \"8b12b6f6-d28f-4cda-9380-4efcca507494\") " pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:27:04.002068 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:04.002023 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:04.002068 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:04.002043 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:04.002068 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:04.002055 2573 projected.go:194] Error preparing data for projected volume kube-api-access-zpdtv for pod openshift-network-diagnostics/network-check-target-lkml2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:04.002230 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:04.002114 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv podName:8b12b6f6-d28f-4cda-9380-4efcca507494 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:12.002096436 +0000 UTC m=+17.281944199 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zpdtv" (UniqueName: "kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv") pod "network-check-target-lkml2" (UID: "8b12b6f6-d28f-4cda-9380-4efcca507494") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:04.364102 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.364019 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:27:04.364245 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:04.364176 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494"
Apr 24 21:27:05.367330 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.366470 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:27:05.367330 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:05.367113 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14"
Apr 24 21:27:06.363874 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.363836 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:27:06.364078 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:06.363955 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494"
Apr 24 21:27:07.363825 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:07.363792 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:27:07.364281 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:07.363929 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14"
Apr 24 21:27:08.363556 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:08.363517 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:27:08.363755 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:08.363659 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494"
Apr 24 21:27:09.364209 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:09.364173 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:27:09.364642 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:09.364307 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14"
Apr 24 21:27:10.363301 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:10.363264 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:27:10.363468 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:10.363382 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494"
Apr 24 21:27:11.363736 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:11.363705 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:27:11.364193 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:11.363834 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14"
Apr 24 21:27:11.956761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:11.956722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:27:11.957000 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:11.956905 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:11.957049 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:11.957028 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs podName:932901de-5edd-4054-b5df-89077b36dd14 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:27.95700545 +0000 UTC m=+33.236853223 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs") pod "network-metrics-daemon-jrhlr" (UID: "932901de-5edd-4054-b5df-89077b36dd14") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:12.057954 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:12.057895 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpdtv\" (UniqueName: \"kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv\") pod \"network-check-target-lkml2\" (UID: \"8b12b6f6-d28f-4cda-9380-4efcca507494\") " pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:27:12.058136 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:12.058065 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:12.058136 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:12.058089 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:12.058136 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:12.058102 2573 projected.go:194] Error preparing data for projected volume kube-api-access-zpdtv for pod openshift-network-diagnostics/network-check-target-lkml2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:12.058275 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:12.058164 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv podName:8b12b6f6-d28f-4cda-9380-4efcca507494 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:28.058146201 +0000 UTC m=+33.337993963 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zpdtv" (UniqueName: "kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv") pod "network-check-target-lkml2" (UID: "8b12b6f6-d28f-4cda-9380-4efcca507494") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:12.363715 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:12.363624 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:27:12.363881 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:12.363751 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494"
Apr 24 21:27:13.363898 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:13.363854 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:27:13.364278 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:13.364019 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14"
Apr 24 21:27:14.363433 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:14.363394 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:27:14.363613 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:14.363528 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494"
Apr 24 21:27:15.364622 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.364387 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:27:15.365305 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:15.364727 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14"
Apr 24 21:27:15.537562 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.537519 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9zk7d" event={"ID":"fe0039f5-a4d2-42e2-86f7-be764dcf37fd","Type":"ContainerStarted","Data":"4806b4b489457d35e8d03faefe3f1bf19f67abf34cc43b367bc9886d46ab9a93"}
Apr 24 21:27:15.539524 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.539499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" event={"ID":"bcf950b6b9eef658d8ab20236281bc71","Type":"ContainerStarted","Data":"0ddffc5535be3ee96bfb602c26b05422d0099e921e7f8dbca8bfba6464caa7d1"}
Apr 24 21:27:15.542587 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.542509 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 21:27:15.542873 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.542849 2573 generic.go:358] "Generic (PLEG): container finished" podID="58e1e5f8-ee7a-4e0f-87fc-b18349e28725" containerID="6003a1a008b4c2d6a79591a9ccf5cb42ca75d65e3e248089def07f59f73c20eb" exitCode=1
Apr 24 21:27:15.542947 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.542881 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" event={"ID":"58e1e5f8-ee7a-4e0f-87fc-b18349e28725","Type":"ContainerStarted","Data":"7852468a60382b14b875de525627e29abb10cb3f1a00979b4b99286245d2c2f9"}
Apr 24 21:27:15.543004 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.542959 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" event={"ID":"58e1e5f8-ee7a-4e0f-87fc-b18349e28725","Type":"ContainerStarted","Data":"d6bbbc89b845d192ec2ac860609261d6c96e3ae414cb3292cbf85c628c0a44bd"}
Apr 24 21:27:15.543004 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.542972 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" event={"ID":"58e1e5f8-ee7a-4e0f-87fc-b18349e28725","Type":"ContainerStarted","Data":"3831b6ea1e621ac3dc2208279b268b9f41cc923bd6136e9fe3d946fd006b9200"}
Apr 24 21:27:15.543004 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.542981 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" event={"ID":"58e1e5f8-ee7a-4e0f-87fc-b18349e28725","Type":"ContainerStarted","Data":"3bb46c7cd82a35ccac2356b58e2b4df2b5a5366bbcbb8dc146490e280d672728"}
Apr 24 21:27:15.543004 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.542989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" event={"ID":"58e1e5f8-ee7a-4e0f-87fc-b18349e28725","Type":"ContainerDied","Data":"6003a1a008b4c2d6a79591a9ccf5cb42ca75d65e3e248089def07f59f73c20eb"}
Apr 24 21:27:15.543004 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.542999 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" event={"ID":"58e1e5f8-ee7a-4e0f-87fc-b18349e28725","Type":"ContainerStarted","Data":"e808362a94160d88e30dbb51ed7b321408a36857845674b5da7386618c62f733"}
Apr 24 21:27:15.544260 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.544240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-djtbw" event={"ID":"57fbfaf0-c0df-4a08-9387-11b04cf5ba29","Type":"ContainerStarted","Data":"4bd91ff0699378cef3570d6e8253b845f956cdaeda7c990f0d1008aea221e317"}
Apr 24 21:27:15.560803 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.560744 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9zk7d" podStartSLOduration=2.154402904 podStartE2EDuration="20.560727277s" podCreationTimestamp="2026-04-24 21:26:55 +0000 UTC" firstStartedPulling="2026-04-24 21:26:56.537584862 +0000 UTC m=+1.817432620" lastFinishedPulling="2026-04-24 21:27:14.943909231 +0000 UTC m=+20.223756993" observedRunningTime="2026-04-24 21:27:15.560036091 +0000 UTC m=+20.839883881" watchObservedRunningTime="2026-04-24 21:27:15.560727277 +0000 UTC m=+20.840575058"
Apr 24 21:27:15.585162 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.585125 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-djtbw" podStartSLOduration=2.503575858 podStartE2EDuration="20.585112092s" podCreationTimestamp="2026-04-24 21:26:55 +0000 UTC" firstStartedPulling="2026-04-24 21:26:56.529059986 +0000 UTC m=+1.808907744" lastFinishedPulling="2026-04-24 21:27:14.610596213 +0000 UTC m=+19.890443978" observedRunningTime="2026-04-24 21:27:15.584964134 +0000 UTC m=+20.864811914" watchObservedRunningTime="2026-04-24 21:27:15.585112092 +0000 UTC m=+20.864959872"
Apr 24 21:27:15.622352 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.621650 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" podStartSLOduration=19.621627571 podStartE2EDuration="19.621627571s" podCreationTimestamp="2026-04-24 21:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:15.62067348 +0000 UTC m=+20.900521261" watchObservedRunningTime="2026-04-24 21:27:15.621627571 +0000 UTC m=+20.901475352"
Apr 24 21:27:16.363634 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.363567 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:27:16.363769 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:16.363667 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494"
Apr 24 21:27:16.547065 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.547026 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" event={"ID":"ea6dd734-e6e3-4992-83c2-882f191366c0","Type":"ContainerStarted","Data":"c932ff9d46f88843282383db693a458114b0cdccf51c847fffa875cb371030a0"}
Apr 24 21:27:16.548317 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.548292 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-w7rcc" event={"ID":"2d353477-699a-4613-82c7-27ddf9ec3b73","Type":"ContainerStarted","Data":"57f1b43ac91ad7d305cf61b0946da7f6dd91515a21241307faa2d576220b5b67"}
Apr 24 21:27:16.549545 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.549527 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q9bbt" event={"ID":"313a846d-a4f1-459e-b416-a695b875548d","Type":"ContainerStarted","Data":"5a288403c8b601f111e0f116bc1e6c0f1683c984e18f4ddeee6375d96116dc4d"}
Apr 24 21:27:16.550716 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.550694 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zvvmj" event={"ID":"fc418569-4514-4b49-bd55-839ecdb097d5","Type":"ContainerStarted","Data":"05781c3e420c84a781dbc780db8aeca4f177cf30708afe29c79c2d2c38999982"}
Apr 24 21:27:16.552066 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.552043 2573 generic.go:358] "Generic (PLEG): container finished" podID="0eeb12378a06f0692d8b8888ada8c1bc" containerID="101dfd8e4265a7fd1105fa7aa0eb29e25115fed2221458b7c79722883fe0697c" exitCode=0
Apr 24 21:27:16.552137 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.552119 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" event={"ID":"0eeb12378a06f0692d8b8888ada8c1bc","Type":"ContainerDied","Data":"101dfd8e4265a7fd1105fa7aa0eb29e25115fed2221458b7c79722883fe0697c"}
Apr 24 21:27:16.553298 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.553270 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-589nt" event={"ID":"cc88058c-aa01-4dec-b649-e759dc3c5b91","Type":"ContainerStarted","Data":"880da197453176ac18bc41ae787396b74b3da578acb7bb98f81c5a920614bb1c"}
Apr 24 21:27:16.554610 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.554588 2573 generic.go:358] "Generic (PLEG): container finished" podID="c8bc6b6e-12b3-4cd5-83f8-09faef4eb787" containerID="6db57d854dd0a26f80bfb59f78250bc1a25ba527e4dacd0914b34cc705155b7c" exitCode=0
Apr 24 21:27:16.554692 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.554629 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjtjf" event={"ID":"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787","Type":"ContainerDied","Data":"6db57d854dd0a26f80bfb59f78250bc1a25ba527e4dacd0914b34cc705155b7c"}
Apr 24 21:27:16.566979 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.566938 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-w7rcc" podStartSLOduration=3.5091849379999998 podStartE2EDuration="21.566911505s" podCreationTimestamp="2026-04-24 21:26:55 +0000 UTC" firstStartedPulling="2026-04-24 21:26:56.550740053 +0000 UTC m=+1.830587811"
lastFinishedPulling="2026-04-24 21:27:14.608466613 +0000 UTC m=+19.888314378" observedRunningTime="2026-04-24 21:27:16.566538318 +0000 UTC m=+21.846386100" watchObservedRunningTime="2026-04-24 21:27:16.566911505 +0000 UTC m=+21.846759284" Apr 24 21:27:16.586787 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.586729 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zvvmj" podStartSLOduration=3.551112689 podStartE2EDuration="21.586708577s" podCreationTimestamp="2026-04-24 21:26:55 +0000 UTC" firstStartedPulling="2026-04-24 21:26:56.574430564 +0000 UTC m=+1.854278327" lastFinishedPulling="2026-04-24 21:27:14.610026454 +0000 UTC m=+19.889874215" observedRunningTime="2026-04-24 21:27:16.582378322 +0000 UTC m=+21.862226103" watchObservedRunningTime="2026-04-24 21:27:16.586708577 +0000 UTC m=+21.866556358" Apr 24 21:27:16.625963 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.625838 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-q9bbt" podStartSLOduration=3.621435851 podStartE2EDuration="21.625821215s" podCreationTimestamp="2026-04-24 21:26:55 +0000 UTC" firstStartedPulling="2026-04-24 21:26:56.604639701 +0000 UTC m=+1.884487460" lastFinishedPulling="2026-04-24 21:27:14.609025059 +0000 UTC m=+19.888872824" observedRunningTime="2026-04-24 21:27:16.602623486 +0000 UTC m=+21.882471267" watchObservedRunningTime="2026-04-24 21:27:16.625821215 +0000 UTC m=+21.905668996" Apr 24 21:27:16.663656 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.663603 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-589nt" podStartSLOduration=3.678947782 podStartE2EDuration="21.663584724s" podCreationTimestamp="2026-04-24 21:26:55 +0000 UTC" firstStartedPulling="2026-04-24 21:26:56.624067505 +0000 UTC m=+1.903915275" lastFinishedPulling="2026-04-24 21:27:14.608704458 +0000 UTC m=+19.888552217" 
observedRunningTime="2026-04-24 21:27:16.645686256 +0000 UTC m=+21.925534036" watchObservedRunningTime="2026-04-24 21:27:16.663584724 +0000 UTC m=+21.943432508" Apr 24 21:27:16.693757 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.693723 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:27:17.306662 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:17.306544 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:27:16.693742449Z","UUID":"dbf24225-dc43-4eb0-814b-7d4100c5d1a3","Handler":null,"Name":"","Endpoint":""} Apr 24 21:27:17.308620 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:17.308592 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:27:17.308620 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:17.308621 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:27:17.364311 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:17.364278 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr" Apr 24 21:27:17.364449 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:17.364425 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14" Apr 24 21:27:17.558456 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:17.558357 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" event={"ID":"0eeb12378a06f0692d8b8888ada8c1bc","Type":"ContainerStarted","Data":"23411466d46682e72b3e1420470f3efde674d56cef9b78274add325c23670a15"} Apr 24 21:27:17.560295 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:17.560269 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" event={"ID":"ea6dd734-e6e3-4992-83c2-882f191366c0","Type":"ContainerStarted","Data":"295184e0ac1d0c4a7a43b834b2c8914257e21e530d59055c6c31a91491189d06"} Apr 24 21:27:17.563779 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:17.563710 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log" Apr 24 21:27:17.564898 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:17.564497 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" event={"ID":"58e1e5f8-ee7a-4e0f-87fc-b18349e28725","Type":"ContainerStarted","Data":"62ce31b88cf26c706444da5e5abc77f271493581381f85a00b2a94848aade4c4"} Apr 24 21:27:17.575353 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:17.575296 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" podStartSLOduration=21.575279691 podStartE2EDuration="21.575279691s" podCreationTimestamp="2026-04-24 21:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:17.574732845 +0000 UTC m=+22.854580625" 
watchObservedRunningTime="2026-04-24 21:27:17.575279691 +0000 UTC m=+22.855127472" Apr 24 21:27:18.363933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:18.363889 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2" Apr 24 21:27:18.364119 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:18.364037 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494" Apr 24 21:27:18.568892 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:18.568851 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" event={"ID":"ea6dd734-e6e3-4992-83c2-882f191366c0","Type":"ContainerStarted","Data":"39cc8b404ff132c1d6ef5136d0c0bb6d122abd014bbbb32955ddf0f7ef958394"} Apr 24 21:27:18.590844 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:18.590786 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cxx4" podStartSLOduration=2.701262817 podStartE2EDuration="23.590765825s" podCreationTimestamp="2026-04-24 21:26:55 +0000 UTC" firstStartedPulling="2026-04-24 21:26:56.565230788 +0000 UTC m=+1.845078545" lastFinishedPulling="2026-04-24 21:27:17.454733782 +0000 UTC m=+22.734581553" observedRunningTime="2026-04-24 21:27:18.589859402 +0000 UTC m=+23.869707181" watchObservedRunningTime="2026-04-24 21:27:18.590765825 +0000 UTC m=+23.870613607" Apr 24 21:27:19.363597 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:19.363373 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr" Apr 24 21:27:19.363774 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:19.363721 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14" Apr 24 21:27:20.364005 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:20.363908 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2" Apr 24 21:27:20.364454 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:20.364039 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494" Apr 24 21:27:20.576105 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:20.575955 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log" Apr 24 21:27:20.576456 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:20.576431 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" event={"ID":"58e1e5f8-ee7a-4e0f-87fc-b18349e28725","Type":"ContainerStarted","Data":"6027b92cff2d0cb6c67c0a6628fc5474d2169f5597a0c5503c85548c47c1c21d"} Apr 24 21:27:20.576776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:20.576746 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:27:20.576896 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:20.576839 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:27:20.576896 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:20.576861 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:27:20.577028 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:20.577012 2573 scope.go:117] "RemoveContainer" containerID="6003a1a008b4c2d6a79591a9ccf5cb42ca75d65e3e248089def07f59f73c20eb" Apr 24 21:27:20.595958 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:20.595936 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:27:20.596258 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:20.596238 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" Apr 24 21:27:20.758016 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 21:27:20.757979 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-589nt" Apr 24 21:27:20.758636 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:20.758615 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-589nt" Apr 24 21:27:21.363645 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:21.363611 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr" Apr 24 21:27:21.363827 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:21.363732 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14" Apr 24 21:27:21.580724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:21.580699 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log" Apr 24 21:27:21.581353 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:21.581034 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" event={"ID":"58e1e5f8-ee7a-4e0f-87fc-b18349e28725","Type":"ContainerStarted","Data":"4ec86d2dc10444290f48c4678686b5dd68e35c6e51f2f4efc671ecfa56d71601"} Apr 24 21:27:21.582525 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:21.582503 2573 generic.go:358] "Generic (PLEG): container finished" podID="c8bc6b6e-12b3-4cd5-83f8-09faef4eb787" containerID="870b17b64052709109f7854e552a163dd5715eb3522f0875bba722d288f2b572" exitCode=0 Apr 24 21:27:21.582651 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:21.582572 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjtjf" event={"ID":"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787","Type":"ContainerDied","Data":"870b17b64052709109f7854e552a163dd5715eb3522f0875bba722d288f2b572"} Apr 24 21:27:21.582799 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:21.582787 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-589nt" Apr 24 21:27:21.583468 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:21.583262 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-589nt" Apr 24 21:27:21.619241 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:21.619161 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr" podStartSLOduration=8.355798796 podStartE2EDuration="26.619148069s" podCreationTimestamp="2026-04-24 21:26:55 +0000 UTC" firstStartedPulling="2026-04-24 21:26:56.619545678 +0000 UTC m=+1.899393436" lastFinishedPulling="2026-04-24 21:27:14.882894952 +0000 UTC m=+20.162742709" observedRunningTime="2026-04-24 21:27:21.617408473 +0000 UTC m=+26.897256274" watchObservedRunningTime="2026-04-24 21:27:21.619148069 +0000 UTC m=+26.898995882" Apr 24 21:27:22.363513 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.363490 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2" Apr 24 21:27:22.363636 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:22.363591 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494" Apr 24 21:27:22.585936 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.585881 2573 generic.go:358] "Generic (PLEG): container finished" podID="c8bc6b6e-12b3-4cd5-83f8-09faef4eb787" containerID="6a06a927c3aaf5786706b1696d28084162acd397ec61dc655c3422adc9e88b51" exitCode=0 Apr 24 21:27:22.586341 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.585963 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjtjf" event={"ID":"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787","Type":"ContainerDied","Data":"6a06a927c3aaf5786706b1696d28084162acd397ec61dc655c3422adc9e88b51"} Apr 24 21:27:22.641962 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.641904 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lkml2"] Apr 24 21:27:22.642117 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.642032 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2" Apr 24 21:27:22.642117 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:22.642109 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494" Apr 24 21:27:22.645768 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.645744 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jrhlr"] Apr 24 21:27:22.645889 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.645838 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr" Apr 24 21:27:22.645965 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:22.645908 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14" Apr 24 21:27:23.593725 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.593695 2573 generic.go:358] "Generic (PLEG): container finished" podID="c8bc6b6e-12b3-4cd5-83f8-09faef4eb787" containerID="ce8c6e682ca2499e02b3a0cba0fdb38a025e47267cca27d789f340a75cffd984" exitCode=0 Apr 24 21:27:23.594090 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.593824 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjtjf" event={"ID":"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787","Type":"ContainerDied","Data":"ce8c6e682ca2499e02b3a0cba0fdb38a025e47267cca27d789f340a75cffd984"} Apr 24 21:27:24.363584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:24.363554 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2" Apr 24 21:27:24.363584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:24.363578 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr" Apr 24 21:27:24.363845 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:24.363669 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494" Apr 24 21:27:24.363845 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:24.363748 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14" Apr 24 21:27:26.363607 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:26.363573 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2" Apr 24 21:27:26.364245 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:26.363588 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr" Apr 24 21:27:26.364245 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:26.363700 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lkml2" podUID="8b12b6f6-d28f-4cda-9380-4efcca507494" Apr 24 21:27:26.364245 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:26.363817 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14" Apr 24 21:27:27.977102 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:27.977006 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr" Apr 24 21:27:27.977657 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:27.977176 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:27.977657 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:27.977254 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs podName:932901de-5edd-4054-b5df-89077b36dd14 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:59.977233098 +0000 UTC m=+65.257080861 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs") pod "network-metrics-daemon-jrhlr" (UID: "932901de-5edd-4054-b5df-89077b36dd14") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:28.016104 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.016074 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeReady" Apr 24 21:27:28.016262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.016235 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:27:28.074441 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.074411 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-n6xn7"] Apr 24 21:27:28.078239 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.078213 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpdtv\" (UniqueName: \"kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv\") pod \"network-check-target-lkml2\" (UID: \"8b12b6f6-d28f-4cda-9380-4efcca507494\") " pod="openshift-network-diagnostics/network-check-target-lkml2" Apr 24 21:27:28.078389 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.078341 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:28.078389 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.078355 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:28.078389 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.078365 2573 projected.go:194] Error preparing data for projected volume kube-api-access-zpdtv for pod 
openshift-network-diagnostics/network-check-target-lkml2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:28.078542 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.078406 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv podName:8b12b6f6-d28f-4cda-9380-4efcca507494 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:00.07839343 +0000 UTC m=+65.358241188 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zpdtv" (UniqueName: "kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv") pod "network-check-target-lkml2" (UID: "8b12b6f6-d28f-4cda-9380-4efcca507494") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:28.108836 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.108805 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6l98d"] Apr 24 21:27:28.109031 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.109010 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.113861 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.113749 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 21:27:28.113861 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.113757 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 21:27:28.114066 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.114009 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mzt4d\""
Apr 24 21:27:28.123946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.123907 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n6xn7"]
Apr 24 21:27:28.123946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.123951 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6l98d"]
Apr 24 21:27:28.124120 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.124089 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:27:28.126952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.126908 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-42hwh\""
Apr 24 21:27:28.127272 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.127252 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 21:27:28.127360 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.127302 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 21:27:28.127424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.127360 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 21:27:28.179299 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.179264 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.179449 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.179322 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-config-volume\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.179449 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.179396 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-tmp-dir\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.179568 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.179453 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zbvx\" (UniqueName: \"kubernetes.io/projected/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-kube-api-access-6zbvx\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.280777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.280738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-config-volume\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.281001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.280793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-tmp-dir\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.281001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.280822 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zbvx\" (UniqueName: \"kubernetes.io/projected/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-kube-api-access-6zbvx\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.281001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.280903 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:27:28.281001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.280941 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s24gg\" (UniqueName: \"kubernetes.io/projected/ff600116-8b92-45dd-8c1f-07b5c9151008-kube-api-access-s24gg\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:27:28.281001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.280962 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.281270 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.281094 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:28.281270 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.281172 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls podName:a8f3bdc0-c9cc-4161-9c81-77828c331c3b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:28.781141897 +0000 UTC m=+34.060989669 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls") pod "dns-default-n6xn7" (UID: "a8f3bdc0-c9cc-4161-9c81-77828c331c3b") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:28.281270 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.281231 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-tmp-dir\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.281462 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.281429 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-config-volume\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.297642 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.297616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zbvx\" (UniqueName: \"kubernetes.io/projected/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-kube-api-access-6zbvx\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.363759 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.363723 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:27:28.363958 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.363772 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:27:28.366557 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.366499 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:27:28.366824 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.366799 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5kts7\""
Apr 24 21:27:28.367133 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.367085 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:27:28.367298 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.367279 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hvhzf\""
Apr 24 21:27:28.367298 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.367291 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:27:28.381609 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.381589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:27:28.381701 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.381616 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s24gg\" (UniqueName: \"kubernetes.io/projected/ff600116-8b92-45dd-8c1f-07b5c9151008-kube-api-access-s24gg\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:27:28.381753 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.381739 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:28.381827 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.381808 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert podName:ff600116-8b92-45dd-8c1f-07b5c9151008 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:28.881789409 +0000 UTC m=+34.161637170 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert") pod "ingress-canary-6l98d" (UID: "ff600116-8b92-45dd-8c1f-07b5c9151008") : secret "canary-serving-cert" not found
Apr 24 21:27:28.390324 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.390303 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s24gg\" (UniqueName: \"kubernetes.io/projected/ff600116-8b92-45dd-8c1f-07b5c9151008-kube-api-access-s24gg\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:27:28.785206 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.785149 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:28.785399 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.785316 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:28.785464 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.785414 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls podName:a8f3bdc0-c9cc-4161-9c81-77828c331c3b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:29.785391806 +0000 UTC m=+35.065239574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls") pod "dns-default-n6xn7" (UID: "a8f3bdc0-c9cc-4161-9c81-77828c331c3b") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:28.885523 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.885489 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:27:28.885696 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.885659 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:28.885753 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.885737 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert podName:ff600116-8b92-45dd-8c1f-07b5c9151008 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:29.885717837 +0000 UTC m=+35.165565601 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert") pod "ingress-canary-6l98d" (UID: "ff600116-8b92-45dd-8c1f-07b5c9151008") : secret "canary-serving-cert" not found
Apr 24 21:27:29.792305 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.792228 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:29.792742 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:29.792374 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:29.792742 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:29.792438 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls podName:a8f3bdc0-c9cc-4161-9c81-77828c331c3b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.792422848 +0000 UTC m=+37.072270613 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls") pod "dns-default-n6xn7" (UID: "a8f3bdc0-c9cc-4161-9c81-77828c331c3b") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:29.892968 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.892933 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:27:29.893099 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:29.893067 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:29.893149 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:29.893138 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert podName:ff600116-8b92-45dd-8c1f-07b5c9151008 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.893120981 +0000 UTC m=+37.172968739 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert") pod "ingress-canary-6l98d" (UID: "ff600116-8b92-45dd-8c1f-07b5c9151008") : secret "canary-serving-cert" not found
Apr 24 21:27:30.610176 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:30.610146 2573 generic.go:358] "Generic (PLEG): container finished" podID="c8bc6b6e-12b3-4cd5-83f8-09faef4eb787" containerID="119e948df9c818cc2e46f7f821d941208263e781844a37996a99fadcb57a4ef2" exitCode=0
Apr 24 21:27:30.610337 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:30.610188 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjtjf" event={"ID":"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787","Type":"ContainerDied","Data":"119e948df9c818cc2e46f7f821d941208263e781844a37996a99fadcb57a4ef2"}
Apr 24 21:27:31.614697 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:31.614662 2573 generic.go:358] "Generic (PLEG): container finished" podID="c8bc6b6e-12b3-4cd5-83f8-09faef4eb787" containerID="6cc45993ce6f2679037289a464a9ab9712e4c5a0540c47fc8652fc3af77b6d02" exitCode=0
Apr 24 21:27:31.615087 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:31.614724 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjtjf" event={"ID":"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787","Type":"ContainerDied","Data":"6cc45993ce6f2679037289a464a9ab9712e4c5a0540c47fc8652fc3af77b6d02"}
Apr 24 21:27:31.807857 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:31.807825 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:31.808038 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:31.808020 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:31.808118 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:31.808103 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls podName:a8f3bdc0-c9cc-4161-9c81-77828c331c3b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:35.808080084 +0000 UTC m=+41.087927862 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls") pod "dns-default-n6xn7" (UID: "a8f3bdc0-c9cc-4161-9c81-77828c331c3b") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:31.908291 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:31.908170 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:27:31.908431 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:31.908318 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:31.908431 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:31.908385 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert podName:ff600116-8b92-45dd-8c1f-07b5c9151008 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:35.908370891 +0000 UTC m=+41.188218649 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert") pod "ingress-canary-6l98d" (UID: "ff600116-8b92-45dd-8c1f-07b5c9151008") : secret "canary-serving-cert" not found
Apr 24 21:27:32.619217 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:32.619173 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjtjf" event={"ID":"c8bc6b6e-12b3-4cd5-83f8-09faef4eb787","Type":"ContainerStarted","Data":"2fe584ae511756e9229de25a6acc94fb37a1627b38510a17ee0a063d59f23081"}
Apr 24 21:27:32.643131 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:32.643069 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cjtjf" podStartSLOduration=4.715792019 podStartE2EDuration="37.643054252s" podCreationTimestamp="2026-04-24 21:26:55 +0000 UTC" firstStartedPulling="2026-04-24 21:26:56.611714133 +0000 UTC m=+1.891561891" lastFinishedPulling="2026-04-24 21:27:29.538976349 +0000 UTC m=+34.818824124" observedRunningTime="2026-04-24 21:27:32.641737814 +0000 UTC m=+37.921585596" watchObservedRunningTime="2026-04-24 21:27:32.643054252 +0000 UTC m=+37.922902047"
Apr 24 21:27:35.833579 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:35.833534 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:35.834100 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:35.833681 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:35.834100 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:35.833742 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls podName:a8f3bdc0-c9cc-4161-9c81-77828c331c3b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:43.833726015 +0000 UTC m=+49.113573778 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls") pod "dns-default-n6xn7" (UID: "a8f3bdc0-c9cc-4161-9c81-77828c331c3b") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:35.933865 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:35.933831 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:27:35.934020 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:35.933996 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:35.934085 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:35.934066 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert podName:ff600116-8b92-45dd-8c1f-07b5c9151008 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:43.934050882 +0000 UTC m=+49.213898639 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert") pod "ingress-canary-6l98d" (UID: "ff600116-8b92-45dd-8c1f-07b5c9151008") : secret "canary-serving-cert" not found
Apr 24 21:27:43.892714 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:43.892671 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:43.893188 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:43.892796 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:43.893188 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:43.892847 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls podName:a8f3bdc0-c9cc-4161-9c81-77828c331c3b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:59.892833769 +0000 UTC m=+65.172681526 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls") pod "dns-default-n6xn7" (UID: "a8f3bdc0-c9cc-4161-9c81-77828c331c3b") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:43.993367 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:43.993327 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:27:43.993504 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:43.993476 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:43.993550 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:43.993539 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert podName:ff600116-8b92-45dd-8c1f-07b5c9151008 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:59.993524439 +0000 UTC m=+65.273372202 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert") pod "ingress-canary-6l98d" (UID: "ff600116-8b92-45dd-8c1f-07b5c9151008") : secret "canary-serving-cert" not found
Apr 24 21:27:52.603896 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.603859 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vz6wr"
Apr 24 21:27:59.904582 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:59.904534 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:27:59.905084 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:59.904682 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:59.905084 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:59.904760 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls podName:a8f3bdc0-c9cc-4161-9c81-77828c331c3b nodeName:}" failed. No retries permitted until 2026-04-24 21:28:31.904743567 +0000 UTC m=+97.184591325 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls") pod "dns-default-n6xn7" (UID: "a8f3bdc0-c9cc-4161-9c81-77828c331c3b") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:00.005739 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:00.005699 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:28:00.005739 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:00.005738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:28:00.005991 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:00.005864 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:00.005991 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:00.005976 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert podName:ff600116-8b92-45dd-8c1f-07b5c9151008 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:32.005954409 +0000 UTC m=+97.285802167 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert") pod "ingress-canary-6l98d" (UID: "ff600116-8b92-45dd-8c1f-07b5c9151008") : secret "canary-serving-cert" not found
Apr 24 21:28:00.008333 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:00.008316 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:28:00.016041 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:00.016020 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:28:00.016127 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:00.016081 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs podName:932901de-5edd-4054-b5df-89077b36dd14 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:04.016062744 +0000 UTC m=+129.295910521 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs") pod "network-metrics-daemon-jrhlr" (UID: "932901de-5edd-4054-b5df-89077b36dd14") : secret "metrics-daemon-secret" not found
Apr 24 21:28:00.106387 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:00.106344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpdtv\" (UniqueName: \"kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv\") pod \"network-check-target-lkml2\" (UID: \"8b12b6f6-d28f-4cda-9380-4efcca507494\") " pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:28:00.108814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:00.108794 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:28:00.119535 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:00.119512 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:28:00.144203 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:00.144174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpdtv\" (UniqueName: \"kubernetes.io/projected/8b12b6f6-d28f-4cda-9380-4efcca507494-kube-api-access-zpdtv\") pod \"network-check-target-lkml2\" (UID: \"8b12b6f6-d28f-4cda-9380-4efcca507494\") " pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:28:00.182257 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:00.182200 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hvhzf\""
Apr 24 21:28:00.189807 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:00.189789 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:28:00.374961 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:00.374911 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lkml2"]
Apr 24 21:28:00.379481 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:28:00.379455 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b12b6f6_d28f_4cda_9380_4efcca507494.slice/crio-04a05a057109a2cdf96622325dac9440c4ef47373e7f17be75eea99a214f76b7 WatchSource:0}: Error finding container 04a05a057109a2cdf96622325dac9440c4ef47373e7f17be75eea99a214f76b7: Status 404 returned error can't find the container with id 04a05a057109a2cdf96622325dac9440c4ef47373e7f17be75eea99a214f76b7
Apr 24 21:28:00.669757 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:00.669717 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lkml2" event={"ID":"8b12b6f6-d28f-4cda-9380-4efcca507494","Type":"ContainerStarted","Data":"04a05a057109a2cdf96622325dac9440c4ef47373e7f17be75eea99a214f76b7"}
Apr 24 21:28:03.676928 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:03.676882 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lkml2" event={"ID":"8b12b6f6-d28f-4cda-9380-4efcca507494","Type":"ContainerStarted","Data":"443e35c7675d427e8136d7141a9dde2c844087f63ef39a1937be5a71360b7255"}
Apr 24 21:28:03.677309 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:03.677028 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:28:03.695606 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:03.695554 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-lkml2" podStartSLOduration=65.956137911 podStartE2EDuration="1m8.695538985s" podCreationTimestamp="2026-04-24 21:26:55 +0000 UTC" firstStartedPulling="2026-04-24 21:28:00.381312979 +0000 UTC m=+65.661160738" lastFinishedPulling="2026-04-24 21:28:03.120714039 +0000 UTC m=+68.400561812" observedRunningTime="2026-04-24 21:28:03.695274905 +0000 UTC m=+68.975122685" watchObservedRunningTime="2026-04-24 21:28:03.695538985 +0000 UTC m=+68.975386764"
Apr 24 21:28:31.906046 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:31.906007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:28:31.906521 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:31.906153 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:31.906521 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:31.906228 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls podName:a8f3bdc0-c9cc-4161-9c81-77828c331c3b nodeName:}" failed. No retries permitted until 2026-04-24 21:29:35.906211494 +0000 UTC m=+161.186059258 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls") pod "dns-default-n6xn7" (UID: "a8f3bdc0-c9cc-4161-9c81-77828c331c3b") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:32.006999 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:32.006906 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:28:32.007137 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:32.007058 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:32.007137 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:32.007132 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert podName:ff600116-8b92-45dd-8c1f-07b5c9151008 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:36.007114336 +0000 UTC m=+161.286962095 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert") pod "ingress-canary-6l98d" (UID: "ff600116-8b92-45dd-8c1f-07b5c9151008") : secret "canary-serving-cert" not found
Apr 24 21:28:34.681032 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:34.681003 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-lkml2"
Apr 24 21:29:04.024129 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.024071 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:29:04.024611 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:04.024227 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:29:04.024611 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:04.024292 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs podName:932901de-5edd-4054-b5df-89077b36dd14 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:06.024277413 +0000 UTC m=+251.304125175 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs") pod "network-metrics-daemon-jrhlr" (UID: "932901de-5edd-4054-b5df-89077b36dd14") : secret "metrics-daemon-secret" not found Apr 24 21:29:11.745229 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.745195 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-69dd467946-567hg"] Apr 24 21:29:11.748012 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.747995 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:11.751137 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.751115 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 21:29:11.752030 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.752003 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 21:29:11.752145 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.752115 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 21:29:11.752215 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.752161 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-8zhpq\"" Apr 24 21:29:11.752215 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.752169 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 21:29:11.752317 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.752221 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 21:29:11.752317 ip-10-0-136-201 kubenswrapper[2573]: I0424 
21:29:11.752246 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 21:29:11.760808 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.760789 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-69dd467946-567hg"] Apr 24 21:29:11.878819 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.878791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-default-certificate\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:11.878819 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.878828 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:11.879051 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.878847 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:11.879051 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.878970 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-stats-auth\") pod \"router-default-69dd467946-567hg\" 
(UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:11.879051 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.879005 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndqc9\" (UniqueName: \"kubernetes.io/projected/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-kube-api-access-ndqc9\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:11.914376 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.914349 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd"] Apr 24 21:29:11.917187 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.917163 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb"] Apr 24 21:29:11.917345 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.917326 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd" Apr 24 21:29:11.919879 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.919860 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 21:29:11.920140 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.920126 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n"] Apr 24 21:29:11.920287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.920273 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" Apr 24 21:29:11.920657 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.920490 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 21:29:11.920657 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.920494 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 21:29:11.920657 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.920550 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-qrrhv\"" Apr 24 21:29:11.922625 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.922607 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:29:11.922843 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.922829 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n" Apr 24 21:29:11.924273 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.924254 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 21:29:11.924365 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.924275 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:29:11.924365 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.924344 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-5qdvk\"" Apr 24 21:29:11.924695 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.924682 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 21:29:11.927253 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.927233 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 21:29:11.927253 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.927251 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 21:29:11.927405 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.927296 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ks4ct\"" Apr 24 21:29:11.927405 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.927341 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 21:29:11.927495 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.927407 
2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:29:11.930971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.930943 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd"] Apr 24 21:29:11.932742 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.932717 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb"] Apr 24 21:29:11.945039 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.945017 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n"] Apr 24 21:29:11.979709 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.979676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-stats-auth\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:11.979862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.979716 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wn6zb\" (UID: \"ac626c37-716a-43c4-bec8-97bf6f88f4c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" Apr 24 21:29:11.979862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.979824 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p8gq\" (UniqueName: 
\"kubernetes.io/projected/ac626c37-716a-43c4-bec8-97bf6f88f4c7-kube-api-access-5p8gq\") pod \"cluster-samples-operator-6dc5bdb6b4-wn6zb\" (UID: \"ac626c37-716a-43c4-bec8-97bf6f88f4c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" Apr 24 21:29:11.979862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.979848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndqc9\" (UniqueName: \"kubernetes.io/projected/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-kube-api-access-ndqc9\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:11.980018 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.979867 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fae90115-d8d4-4eb9-be87-cee9f9fded69-config\") pod \"service-ca-operator-d6fc45fc5-h7k6n\" (UID: \"fae90115-d8d4-4eb9-be87-cee9f9fded69\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n" Apr 24 21:29:11.980018 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.979883 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbghm\" (UniqueName: \"kubernetes.io/projected/fae90115-d8d4-4eb9-be87-cee9f9fded69-kube-api-access-gbghm\") pod \"service-ca-operator-d6fc45fc5-h7k6n\" (UID: \"fae90115-d8d4-4eb9-be87-cee9f9fded69\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n" Apr 24 21:29:11.980018 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.979903 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a11ab57b-145c-4043-bbce-507e3d1017ec-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-7twhd\" (UID: 
\"a11ab57b-145c-4043-bbce-507e3d1017ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd" Apr 24 21:29:11.980131 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.980031 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-default-certificate\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:11.980131 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.980104 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:11.980234 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.980128 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fae90115-d8d4-4eb9-be87-cee9f9fded69-serving-cert\") pod \"service-ca-operator-d6fc45fc5-h7k6n\" (UID: \"fae90115-d8d4-4eb9-be87-cee9f9fded69\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n" Apr 24 21:29:11.980234 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.980150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:11.980234 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.980180 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a11ab57b-145c-4043-bbce-507e3d1017ec-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-7twhd\" (UID: \"a11ab57b-145c-4043-bbce-507e3d1017ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd" Apr 24 21:29:11.980433 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:11.980245 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:29:11.980433 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:11.980289 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle podName:ed5b5e5f-f500-4262-a6bb-2772c51e47b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:12.480269881 +0000 UTC m=+137.760117660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle") pod "router-default-69dd467946-567hg" (UID: "ed5b5e5f-f500-4262-a6bb-2772c51e47b0") : configmap references non-existent config key: service-ca.crt Apr 24 21:29:11.980433 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:11.980340 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs podName:ed5b5e5f-f500-4262-a6bb-2772c51e47b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:12.480304048 +0000 UTC m=+137.760151806 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs") pod "router-default-69dd467946-567hg" (UID: "ed5b5e5f-f500-4262-a6bb-2772c51e47b0") : secret "router-metrics-certs-default" not found Apr 24 21:29:11.980433 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.980389 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24fmj\" (UniqueName: \"kubernetes.io/projected/a11ab57b-145c-4043-bbce-507e3d1017ec-kube-api-access-24fmj\") pod \"kube-storage-version-migrator-operator-6769c5d45-7twhd\" (UID: \"a11ab57b-145c-4043-bbce-507e3d1017ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd" Apr 24 21:29:11.982652 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.982631 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-default-certificate\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:11.982745 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:11.982709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-stats-auth\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:12.008892 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.008676 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndqc9\" (UniqueName: \"kubernetes.io/projected/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-kube-api-access-ndqc9\") pod \"router-default-69dd467946-567hg\" (UID: 
\"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:12.032694 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.032663 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99"] Apr 24 21:29:12.035767 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.035752 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" Apr 24 21:29:12.040095 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.040069 2573 status_manager.go:895] "Failed to get status for pod" podUID="6d3fdf4f-a1c2-4d88-9531-85052b8a2f90" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" err="pods \"cluster-monitoring-operator-75587bd455-5fl99\" is forbidden: User \"system:node:ip-10-0-136-201.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-136-201.ec2.internal' and this object" Apr 24 21:29:12.040095 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.040075 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"cluster-monitoring-operator-dockercfg-n9d5q\" is forbidden: User \"system:node:ip-10-0-136-201.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-136-201.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-n9d5q\"" type="*v1.Secret" Apr 24 21:29:12.040244 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.040080 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-136-201.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the 
namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-136-201.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 24 21:29:12.040319 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.040296 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"telemetry-config\" is forbidden: User \"system:node:ip-10-0-136-201.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-136-201.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" type="*v1.ConfigMap" Apr 24 21:29:12.040797 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.040776 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"cluster-monitoring-operator-tls\" is forbidden: User \"system:node:ip-10-0-136-201.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-136-201.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" type="*v1.Secret" Apr 24 21:29:12.041691 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.041654 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-136-201.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-136-201.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap" Apr 24 21:29:12.048539 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.048521 2573 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vdbs8"] Apr 24 21:29:12.051178 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.051159 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vdbs8" Apr 24 21:29:12.054776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.054760 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:29:12.056031 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.056014 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-8k9jb\"" Apr 24 21:29:12.060165 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.060146 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 21:29:12.072934 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.072893 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lm6jb"] Apr 24 21:29:12.075808 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.075789 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-lm6jb" Apr 24 21:29:12.076723 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.076701 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99"] Apr 24 21:29:12.077820 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.077797 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vdbs8"] Apr 24 21:29:12.080797 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.080777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a11ab57b-145c-4043-bbce-507e3d1017ec-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-7twhd\" (UID: \"a11ab57b-145c-4043-bbce-507e3d1017ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd" Apr 24 21:29:12.080894 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.080843 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24fmj\" (UniqueName: \"kubernetes.io/projected/a11ab57b-145c-4043-bbce-507e3d1017ec-kube-api-access-24fmj\") pod \"kube-storage-version-migrator-operator-6769c5d45-7twhd\" (UID: \"a11ab57b-145c-4043-bbce-507e3d1017ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd" Apr 24 21:29:12.080894 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.080878 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw5qk\" (UniqueName: \"kubernetes.io/projected/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-kube-api-access-bw5qk\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" 
Apr 24 21:29:12.081037 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.080932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wn6zb\" (UID: \"ac626c37-716a-43c4-bec8-97bf6f88f4c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb"
Apr 24 21:29:12.081037 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.080961 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p8gq\" (UniqueName: \"kubernetes.io/projected/ac626c37-716a-43c4-bec8-97bf6f88f4c7-kube-api-access-5p8gq\") pod \"cluster-samples-operator-6dc5bdb6b4-wn6zb\" (UID: \"ac626c37-716a-43c4-bec8-97bf6f88f4c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb"
Apr 24 21:29:12.081037 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.081000 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fae90115-d8d4-4eb9-be87-cee9f9fded69-config\") pod \"service-ca-operator-d6fc45fc5-h7k6n\" (UID: \"fae90115-d8d4-4eb9-be87-cee9f9fded69\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n"
Apr 24 21:29:12.081037 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.081025 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbghm\" (UniqueName: \"kubernetes.io/projected/fae90115-d8d4-4eb9-be87-cee9f9fded69-kube-api-access-gbghm\") pod \"service-ca-operator-d6fc45fc5-h7k6n\" (UID: \"fae90115-d8d4-4eb9-be87-cee9f9fded69\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n"
Apr 24 21:29:12.081247 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.081054 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99"
Apr 24 21:29:12.081247 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.081092 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a11ab57b-145c-4043-bbce-507e3d1017ec-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-7twhd\" (UID: \"a11ab57b-145c-4043-bbce-507e3d1017ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd"
Apr 24 21:29:12.081247 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.081118 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99"
Apr 24 21:29:12.081247 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.081210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fae90115-d8d4-4eb9-be87-cee9f9fded69-serving-cert\") pod \"service-ca-operator-d6fc45fc5-h7k6n\" (UID: \"fae90115-d8d4-4eb9-be87-cee9f9fded69\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n"
Apr 24 21:29:12.081455 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.081430 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a11ab57b-145c-4043-bbce-507e3d1017ec-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-7twhd\" (UID: \"a11ab57b-145c-4043-bbce-507e3d1017ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd"
Apr 24 21:29:12.081646 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.081626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fae90115-d8d4-4eb9-be87-cee9f9fded69-config\") pod \"service-ca-operator-d6fc45fc5-h7k6n\" (UID: \"fae90115-d8d4-4eb9-be87-cee9f9fded69\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n"
Apr 24 21:29:12.081761 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.081746 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:29:12.081832 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.081821 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls podName:ac626c37-716a-43c4-bec8-97bf6f88f4c7 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:12.581802283 +0000 UTC m=+137.861650056 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wn6zb" (UID: "ac626c37-716a-43c4-bec8-97bf6f88f4c7") : secret "samples-operator-tls" not found
Apr 24 21:29:12.083826 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.083733 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fae90115-d8d4-4eb9-be87-cee9f9fded69-serving-cert\") pod \"service-ca-operator-d6fc45fc5-h7k6n\" (UID: \"fae90115-d8d4-4eb9-be87-cee9f9fded69\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n"
Apr 24 21:29:12.083826 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.083752 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a11ab57b-145c-4043-bbce-507e3d1017ec-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-7twhd\" (UID: \"a11ab57b-145c-4043-bbce-507e3d1017ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd"
Apr 24 21:29:12.086491 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.086470 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 21:29:12.086586 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.086480 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 21:29:12.090197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.090178 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-xnddv\""
Apr 24 21:29:12.092188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.092120 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:29:12.095392 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.095375 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:29:12.097838 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.097813 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 24 21:29:12.099179 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.099152 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lm6jb"]
Apr 24 21:29:12.108743 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.108720 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p8gq\" (UniqueName: \"kubernetes.io/projected/ac626c37-716a-43c4-bec8-97bf6f88f4c7-kube-api-access-5p8gq\") pod \"cluster-samples-operator-6dc5bdb6b4-wn6zb\" (UID: \"ac626c37-716a-43c4-bec8-97bf6f88f4c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb"
Apr 24 21:29:12.109780 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.109759 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbghm\" (UniqueName: \"kubernetes.io/projected/fae90115-d8d4-4eb9-be87-cee9f9fded69-kube-api-access-gbghm\") pod \"service-ca-operator-d6fc45fc5-h7k6n\" (UID: \"fae90115-d8d4-4eb9-be87-cee9f9fded69\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n"
Apr 24 21:29:12.118612 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.118593 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24fmj\" (UniqueName: \"kubernetes.io/projected/a11ab57b-145c-4043-bbce-507e3d1017ec-kube-api-access-24fmj\") pod \"kube-storage-version-migrator-operator-6769c5d45-7twhd\" (UID: \"a11ab57b-145c-4043-bbce-507e3d1017ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd"
Apr 24 21:29:12.146401 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.146373 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-58cd6868bf-d5l8g"]
Apr 24 21:29:12.150438 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.150418 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.154482 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.154465 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 21:29:12.154482 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.154476 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 21:29:12.155223 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.155210 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 21:29:12.155272 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.155212 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ptvss\""
Apr 24 21:29:12.161054 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.161036 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 21:29:12.166231 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.166208 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-58cd6868bf-d5l8g"]
Apr 24 21:29:12.182134 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.182113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99"
Apr 24 21:29:12.182217 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.182143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99"
Apr 24 21:29:12.182217 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.182167 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ca8074-c925-4c71-a52a-9bdc355c56df-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.182217 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.182185 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73ca8074-c925-4c71-a52a-9bdc355c56df-serving-cert\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.182325 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.182229 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smhzv\" (UniqueName: \"kubernetes.io/projected/73ca8074-c925-4c71-a52a-9bdc355c56df-kube-api-access-smhzv\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.182325 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.182305 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt4rn\" (UniqueName: \"kubernetes.io/projected/4c0b984d-cadd-403a-96c1-59b9b9d27f65-kube-api-access-pt4rn\") pod \"volume-data-source-validator-7c6cbb6c87-vdbs8\" (UID: \"4c0b984d-cadd-403a-96c1-59b9b9d27f65\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vdbs8"
Apr 24 21:29:12.182393 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.182329 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73ca8074-c925-4c71-a52a-9bdc355c56df-tmp\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.182393 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.182350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/73ca8074-c925-4c71-a52a-9bdc355c56df-snapshots\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.182393 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.182368 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ca8074-c925-4c71-a52a-9bdc355c56df-service-ca-bundle\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.182495 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.182448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw5qk\" (UniqueName: \"kubernetes.io/projected/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-kube-api-access-bw5qk\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99"
Apr 24 21:29:12.226788 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.226756 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd"
Apr 24 21:29:12.241009 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.240988 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n"
Apr 24 21:29:12.283558 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.283522 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73ca8074-c925-4c71-a52a-9bdc355c56df-serving-cert\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.283685 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.283579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt4rn\" (UniqueName: \"kubernetes.io/projected/4c0b984d-cadd-403a-96c1-59b9b9d27f65-kube-api-access-pt4rn\") pod \"volume-data-source-validator-7c6cbb6c87-vdbs8\" (UID: \"4c0b984d-cadd-403a-96c1-59b9b9d27f65\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vdbs8"
Apr 24 21:29:12.283685 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.283610 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/147a51c4-da64-463a-bc56-329ff8627c5d-trusted-ca\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.283685 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.283635 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/73ca8074-c925-4c71-a52a-9bdc355c56df-snapshots\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.283685 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.283660 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/147a51c4-da64-463a-bc56-329ff8627c5d-image-registry-private-configuration\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.283860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.283738 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/147a51c4-da64-463a-bc56-329ff8627c5d-ca-trust-extracted\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.283860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.283772 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.283860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.283805 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ca8074-c925-4c71-a52a-9bdc355c56df-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.284731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.284156 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ca8074-c925-4c71-a52a-9bdc355c56df-service-ca-bundle\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.284731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.284211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73ca8074-c925-4c71-a52a-9bdc355c56df-tmp\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.284731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.284250 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/147a51c4-da64-463a-bc56-329ff8627c5d-registry-certificates\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.284731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.284277 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9zb4\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-kube-api-access-f9zb4\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.284731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.284337 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smhzv\" (UniqueName: \"kubernetes.io/projected/73ca8074-c925-4c71-a52a-9bdc355c56df-kube-api-access-smhzv\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.284731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.284370 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-bound-sa-token\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.284731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.284430 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/147a51c4-da64-463a-bc56-329ff8627c5d-installation-pull-secrets\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.284731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.284583 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73ca8074-c925-4c71-a52a-9bdc355c56df-tmp\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.284731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.284675 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ca8074-c925-4c71-a52a-9bdc355c56df-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.284731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.284701 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ca8074-c925-4c71-a52a-9bdc355c56df-service-ca-bundle\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.285261 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.285218 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/73ca8074-c925-4c71-a52a-9bdc355c56df-snapshots\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.287281 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.287252 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73ca8074-c925-4c71-a52a-9bdc355c56df-serving-cert\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.296164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.295015 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smhzv\" (UniqueName: \"kubernetes.io/projected/73ca8074-c925-4c71-a52a-9bdc355c56df-kube-api-access-smhzv\") pod \"insights-operator-585dfdc468-lm6jb\" (UID: \"73ca8074-c925-4c71-a52a-9bdc355c56df\") " pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.296298 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.296216 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt4rn\" (UniqueName: \"kubernetes.io/projected/4c0b984d-cadd-403a-96c1-59b9b9d27f65-kube-api-access-pt4rn\") pod \"volume-data-source-validator-7c6cbb6c87-vdbs8\" (UID: \"4c0b984d-cadd-403a-96c1-59b9b9d27f65\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vdbs8"
Apr 24 21:29:12.355826 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.355795 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd"]
Apr 24 21:29:12.358519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.358494 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vdbs8"
Apr 24 21:29:12.358963 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:12.358938 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda11ab57b_145c_4043_bbce_507e3d1017ec.slice/crio-61f28697061e300da4468c34a205b47c499aefc4edc5079232907c22ef5f7a71 WatchSource:0}: Error finding container 61f28697061e300da4468c34a205b47c499aefc4edc5079232907c22ef5f7a71: Status 404 returned error can't find the container with id 61f28697061e300da4468c34a205b47c499aefc4edc5079232907c22ef5f7a71
Apr 24 21:29:12.370964 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.370937 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n"]
Apr 24 21:29:12.373461 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:12.373424 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfae90115_d8d4_4eb9_be87_cee9f9fded69.slice/crio-8d724b832893d1aff66a331605060635a6f763fd1e01ee06f29f3d4d901a2eaa WatchSource:0}: Error finding container 8d724b832893d1aff66a331605060635a6f763fd1e01ee06f29f3d4d901a2eaa: Status 404 returned error can't find the container with id 8d724b832893d1aff66a331605060635a6f763fd1e01ee06f29f3d4d901a2eaa
Apr 24 21:29:12.384783 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.384759 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-lm6jb"
Apr 24 21:29:12.384973 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.384952 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/147a51c4-da64-463a-bc56-329ff8627c5d-trusted-ca\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.385017 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.384984 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/147a51c4-da64-463a-bc56-329ff8627c5d-image-registry-private-configuration\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.385073 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.385027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/147a51c4-da64-463a-bc56-329ff8627c5d-ca-trust-extracted\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.385073 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.385048 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.385161 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.385103 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/147a51c4-da64-463a-bc56-329ff8627c5d-registry-certificates\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.385161 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.385125 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9zb4\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-kube-api-access-f9zb4\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.385252 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.385161 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-bound-sa-token\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.385252 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.385195 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/147a51c4-da64-463a-bc56-329ff8627c5d-installation-pull-secrets\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.385252 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.385235 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:29:12.385484 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.385257 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58cd6868bf-d5l8g: secret "image-registry-tls" not found
Apr 24 21:29:12.385484 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.385329 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls podName:147a51c4-da64-463a-bc56-329ff8627c5d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:12.885307135 +0000 UTC m=+138.165154920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls") pod "image-registry-58cd6868bf-d5l8g" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d") : secret "image-registry-tls" not found
Apr 24 21:29:12.385484 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.385385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/147a51c4-da64-463a-bc56-329ff8627c5d-ca-trust-extracted\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.386049 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.386024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/147a51c4-da64-463a-bc56-329ff8627c5d-registry-certificates\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.387405 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.387377 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/147a51c4-da64-463a-bc56-329ff8627c5d-trusted-ca\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.387707 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.387686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/147a51c4-da64-463a-bc56-329ff8627c5d-image-registry-private-configuration\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.388343 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.388326 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/147a51c4-da64-463a-bc56-329ff8627c5d-installation-pull-secrets\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.400807 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.400779 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9zb4\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-kube-api-access-f9zb4\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.401804 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.401784 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-bound-sa-token\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g"
Apr 24 21:29:12.486408 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.486380 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg"
Apr 24 21:29:12.486580 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.486438 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg"
Apr 24 21:29:12.486580 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.486555 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle podName:ed5b5e5f-f500-4262-a6bb-2772c51e47b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:13.486536952 +0000 UTC m=+138.766384710 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle") pod "router-default-69dd467946-567hg" (UID: "ed5b5e5f-f500-4262-a6bb-2772c51e47b0") : configmap references non-existent config key: service-ca.crt
Apr 24 21:29:12.486698 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.486586 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:29:12.486698 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.486647 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs podName:ed5b5e5f-f500-4262-a6bb-2772c51e47b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:13.486629554 +0000 UTC m=+138.766477329 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs") pod "router-default-69dd467946-567hg" (UID: "ed5b5e5f-f500-4262-a6bb-2772c51e47b0") : secret "router-metrics-certs-default" not found
Apr 24 21:29:12.489855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.489826 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vdbs8"]
Apr 24 21:29:12.493879 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:12.493857 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c0b984d_cadd_403a_96c1_59b9b9d27f65.slice/crio-11578ee7431aebe98a521812be7ae4c7817f31913ee6b288ce1cc1a8f9761807 WatchSource:0}: Error finding container 11578ee7431aebe98a521812be7ae4c7817f31913ee6b288ce1cc1a8f9761807: Status 404 returned error can't find the container with id 11578ee7431aebe98a521812be7ae4c7817f31913ee6b288ce1cc1a8f9761807
Apr 24 21:29:12.507284 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.507260 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lm6jb"]
Apr 24 21:29:12.509495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:12.509473 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73ca8074_c925_4c71_a52a_9bdc355c56df.slice/crio-91f54268043b5d4fd3d74e7ba4740a13de6acb8ddd1d9f8cbf315b8a9972dfb2 WatchSource:0}: Error finding container 91f54268043b5d4fd3d74e7ba4740a13de6acb8ddd1d9f8cbf315b8a9972dfb2: Status 404 returned error can't find the container with id 91f54268043b5d4fd3d74e7ba4740a13de6acb8ddd1d9f8cbf315b8a9972dfb2
Apr 24 21:29:12.587187 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.587105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wn6zb\" (UID: \"ac626c37-716a-43c4-bec8-97bf6f88f4c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb"
Apr 24 21:29:12.587313 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.587233 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:29:12.587313 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.587290 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls podName:ac626c37-716a-43c4-bec8-97bf6f88f4c7 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:13.587273516 +0000 UTC m=+138.867121278 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wn6zb" (UID: "ac626c37-716a-43c4-bec8-97bf6f88f4c7") : secret "samples-operator-tls" not found
Apr 24 21:29:12.798935 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.798878 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vdbs8" event={"ID":"4c0b984d-cadd-403a-96c1-59b9b9d27f65","Type":"ContainerStarted","Data":"11578ee7431aebe98a521812be7ae4c7817f31913ee6b288ce1cc1a8f9761807"}
Apr 24 21:29:12.799798 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.799777 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n" event={"ID":"fae90115-d8d4-4eb9-be87-cee9f9fded69","Type":"ContainerStarted","Data":"8d724b832893d1aff66a331605060635a6f763fd1e01ee06f29f3d4d901a2eaa"}
Apr 24 21:29:12.800608 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:29:12.800589 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd" event={"ID":"a11ab57b-145c-4043-bbce-507e3d1017ec","Type":"ContainerStarted","Data":"61f28697061e300da4468c34a205b47c499aefc4edc5079232907c22ef5f7a71"} Apr 24 21:29:12.801428 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.801407 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lm6jb" event={"ID":"73ca8074-c925-4c71-a52a-9bdc355c56df","Type":"ContainerStarted","Data":"91f54268043b5d4fd3d74e7ba4740a13de6acb8ddd1d9f8cbf315b8a9972dfb2"} Apr 24 21:29:12.877965 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.877871 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:29:12.889766 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.889742 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" Apr 24 21:29:12.889904 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.889884 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:12.889959 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.889907 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58cd6868bf-d5l8g: secret "image-registry-tls" not found Apr 24 21:29:12.889990 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:12.889969 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls podName:147a51c4-da64-463a-bc56-329ff8627c5d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:13.889953797 +0000 UTC m=+139.169801557 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls") pod "image-registry-58cd6868bf-d5l8g" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d") : secret "image-registry-tls" not found Apr 24 21:29:12.984193 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:12.984162 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-n9d5q\"" Apr 24 21:29:13.045757 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:13.045725 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 21:29:13.053111 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.053065 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:13.053267 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.053173 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls podName:6d3fdf4f-a1c2-4d88-9531-85052b8a2f90 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:13.553154534 +0000 UTC m=+138.833002294 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5fl99" (UID: "6d3fdf4f-a1c2-4d88-9531-85052b8a2f90") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:13.183064 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.182983 2573 configmap.go:193] Couldn't get configMap openshift-monitoring/telemetry-config: failed to sync configmap cache: timed out waiting for the condition Apr 24 21:29:13.183226 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.183086 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-telemetry-config podName:6d3fdf4f-a1c2-4d88-9531-85052b8a2f90 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:13.683064065 +0000 UTC m=+138.962911838 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "telemetry-config" (UniqueName: "kubernetes.io/configmap/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-telemetry-config") pod "cluster-monitoring-operator-75587bd455-5fl99" (UID: "6d3fdf4f-a1c2-4d88-9531-85052b8a2f90") : failed to sync configmap cache: timed out waiting for the condition Apr 24 21:29:13.390583 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:13.390547 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 21:29:13.495625 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:13.495546 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:13.495625 ip-10-0-136-201 kubenswrapper[2573]: I0424 
21:29:13.495597 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:13.495896 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.495874 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:29:13.495984 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.495967 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs podName:ed5b5e5f-f500-4262-a6bb-2772c51e47b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:15.495946712 +0000 UTC m=+140.775794477 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs") pod "router-default-69dd467946-567hg" (UID: "ed5b5e5f-f500-4262-a6bb-2772c51e47b0") : secret "router-metrics-certs-default" not found Apr 24 21:29:13.496407 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.496390 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle podName:ed5b5e5f-f500-4262-a6bb-2772c51e47b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:15.496373062 +0000 UTC m=+140.776220824 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle") pod "router-default-69dd467946-567hg" (UID: "ed5b5e5f-f500-4262-a6bb-2772c51e47b0") : configmap references non-existent config key: service-ca.crt Apr 24 21:29:13.596614 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:13.596572 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wn6zb\" (UID: \"ac626c37-716a-43c4-bec8-97bf6f88f4c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" Apr 24 21:29:13.596800 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:13.596645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" Apr 24 21:29:13.596800 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.596741 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:29:13.596904 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.596811 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:13.596904 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.596826 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls podName:ac626c37-716a-43c4-bec8-97bf6f88f4c7 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:29:15.596804227 +0000 UTC m=+140.876651996 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wn6zb" (UID: "ac626c37-716a-43c4-bec8-97bf6f88f4c7") : secret "samples-operator-tls" not found Apr 24 21:29:13.596904 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.596876 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls podName:6d3fdf4f-a1c2-4d88-9531-85052b8a2f90 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:14.596856982 +0000 UTC m=+139.876704745 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5fl99" (UID: "6d3fdf4f-a1c2-4d88-9531-85052b8a2f90") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:13.606094 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:13.606067 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:29:13.615346 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:13.615286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw5qk\" (UniqueName: \"kubernetes.io/projected/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-kube-api-access-bw5qk\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" Apr 24 21:29:13.697343 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:13.697294 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" Apr 24 21:29:13.698342 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:13.698314 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" Apr 24 21:29:13.899452 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:13.899409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" Apr 24 21:29:13.899938 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.899737 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:13.899938 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.899756 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58cd6868bf-d5l8g: secret "image-registry-tls" not found Apr 24 21:29:13.899938 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:13.899816 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls podName:147a51c4-da64-463a-bc56-329ff8627c5d nodeName:}" failed. 
No retries permitted until 2026-04-24 21:29:15.899797494 +0000 UTC m=+141.179645255 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls") pod "image-registry-58cd6868bf-d5l8g" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d") : secret "image-registry-tls" not found Apr 24 21:29:14.606939 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:14.606888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" Apr 24 21:29:14.607094 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:14.607047 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:14.607145 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:14.607119 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls podName:6d3fdf4f-a1c2-4d88-9531-85052b8a2f90 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:16.6071033 +0000 UTC m=+141.886951063 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5fl99" (UID: "6d3fdf4f-a1c2-4d88-9531-85052b8a2f90") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:15.515472 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:15.515432 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:15.515472 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:15.515480 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:15.516020 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:15.515640 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle podName:ed5b5e5f-f500-4262-a6bb-2772c51e47b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:19.515613642 +0000 UTC m=+144.795461414 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle") pod "router-default-69dd467946-567hg" (UID: "ed5b5e5f-f500-4262-a6bb-2772c51e47b0") : configmap references non-existent config key: service-ca.crt Apr 24 21:29:15.516020 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:15.515692 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:29:15.516020 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:15.515763 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs podName:ed5b5e5f-f500-4262-a6bb-2772c51e47b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:19.51574524 +0000 UTC m=+144.795593003 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs") pod "router-default-69dd467946-567hg" (UID: "ed5b5e5f-f500-4262-a6bb-2772c51e47b0") : secret "router-metrics-certs-default" not found Apr 24 21:29:15.616253 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:15.616229 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wn6zb\" (UID: \"ac626c37-716a-43c4-bec8-97bf6f88f4c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" Apr 24 21:29:15.616361 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:15.616328 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:29:15.616397 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:15.616371 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls podName:ac626c37-716a-43c4-bec8-97bf6f88f4c7 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:19.616358368 +0000 UTC m=+144.896206125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wn6zb" (UID: "ac626c37-716a-43c4-bec8-97bf6f88f4c7") : secret "samples-operator-tls" not found Apr 24 21:29:15.809475 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:15.809392 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vdbs8" event={"ID":"4c0b984d-cadd-403a-96c1-59b9b9d27f65","Type":"ContainerStarted","Data":"aa4f2e5fcd3ffba53cd6343f74cb79f8ef30161b0086345e7b76c69f79fd89f1"} Apr 24 21:29:15.810864 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:15.810835 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n" event={"ID":"fae90115-d8d4-4eb9-be87-cee9f9fded69","Type":"ContainerStarted","Data":"475ada3e97fcba48e05999940a5a0c5bc57242da5977279596d60070554e3cc4"} Apr 24 21:29:15.812189 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:15.812152 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd" event={"ID":"a11ab57b-145c-4043-bbce-507e3d1017ec","Type":"ContainerStarted","Data":"190194cc9405c0ed57495edbb859d80201e19f508cabe049b891a3b447c8a0cb"} Apr 24 21:29:15.813398 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:15.813373 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lm6jb" 
event={"ID":"73ca8074-c925-4c71-a52a-9bdc355c56df","Type":"ContainerStarted","Data":"697e6540bcd223a77864e4569f6bc2a75065b8fb89dd2f648f770eb8a4528605"} Apr 24 21:29:15.828644 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:15.828532 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vdbs8" podStartSLOduration=0.717004415 podStartE2EDuration="3.828517264s" podCreationTimestamp="2026-04-24 21:29:12 +0000 UTC" firstStartedPulling="2026-04-24 21:29:12.495672812 +0000 UTC m=+137.775520570" lastFinishedPulling="2026-04-24 21:29:15.607185662 +0000 UTC m=+140.887033419" observedRunningTime="2026-04-24 21:29:15.828512089 +0000 UTC m=+141.108359873" watchObservedRunningTime="2026-04-24 21:29:15.828517264 +0000 UTC m=+141.108365048" Apr 24 21:29:15.844684 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:15.844631 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-lm6jb" podStartSLOduration=0.746392682 podStartE2EDuration="3.844614683s" podCreationTimestamp="2026-04-24 21:29:12 +0000 UTC" firstStartedPulling="2026-04-24 21:29:12.511216995 +0000 UTC m=+137.791064752" lastFinishedPulling="2026-04-24 21:29:15.609438994 +0000 UTC m=+140.889286753" observedRunningTime="2026-04-24 21:29:15.843783234 +0000 UTC m=+141.123631025" watchObservedRunningTime="2026-04-24 21:29:15.844614683 +0000 UTC m=+141.124462464" Apr 24 21:29:15.860759 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:15.860707 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd" podStartSLOduration=1.616827986 podStartE2EDuration="4.860694149s" podCreationTimestamp="2026-04-24 21:29:11 +0000 UTC" firstStartedPulling="2026-04-24 21:29:12.363449864 +0000 UTC m=+137.643297639" lastFinishedPulling="2026-04-24 
21:29:15.607316027 +0000 UTC m=+140.887163802" observedRunningTime="2026-04-24 21:29:15.860029379 +0000 UTC m=+141.139877157" watchObservedRunningTime="2026-04-24 21:29:15.860694149 +0000 UTC m=+141.140541931" Apr 24 21:29:15.883779 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:15.883725 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n" podStartSLOduration=1.645812584 podStartE2EDuration="4.88370765s" podCreationTimestamp="2026-04-24 21:29:11 +0000 UTC" firstStartedPulling="2026-04-24 21:29:12.37510912 +0000 UTC m=+137.654956878" lastFinishedPulling="2026-04-24 21:29:15.613004186 +0000 UTC m=+140.892851944" observedRunningTime="2026-04-24 21:29:15.88265189 +0000 UTC m=+141.162499671" watchObservedRunningTime="2026-04-24 21:29:15.88370765 +0000 UTC m=+141.163555432" Apr 24 21:29:15.919289 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:15.919260 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" Apr 24 21:29:15.920026 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:15.920008 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:15.920026 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:15.920026 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58cd6868bf-d5l8g: secret "image-registry-tls" not found Apr 24 21:29:15.920160 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:15.920091 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls 
podName:147a51c4-da64-463a-bc56-329ff8627c5d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:19.920067495 +0000 UTC m=+145.199915267 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls") pod "image-registry-58cd6868bf-d5l8g" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d") : secret "image-registry-tls" not found Apr 24 21:29:16.624753 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:16.624711 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" Apr 24 21:29:16.625264 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:16.624885 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:16.625264 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:16.624980 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls podName:6d3fdf4f-a1c2-4d88-9531-85052b8a2f90 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:20.624959755 +0000 UTC m=+145.904807515 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5fl99" (UID: "6d3fdf4f-a1c2-4d88-9531-85052b8a2f90") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:18.999413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:18.999383 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zvvmj_fc418569-4514-4b49-bd55-839ecdb097d5/dns-node-resolver/0.log" Apr 24 21:29:19.256101 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.256025 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-k54sh"] Apr 24 21:29:19.260282 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.260264 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-k54sh" Apr 24 21:29:19.263939 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.263902 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 21:29:19.264656 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.264639 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 21:29:19.264822 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.264806 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 21:29:19.264865 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.264808 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bbntw\"" Apr 24 21:29:19.266729 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.266536 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 21:29:19.274126 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.274101 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-k54sh"] Apr 24 21:29:19.348558 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.348528 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/001792dd-53cc-42b6-9622-0aa03a562e83-signing-key\") pod \"service-ca-865cb79987-k54sh\" (UID: \"001792dd-53cc-42b6-9622-0aa03a562e83\") " pod="openshift-service-ca/service-ca-865cb79987-k54sh" Apr 24 21:29:19.348709 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.348574 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/001792dd-53cc-42b6-9622-0aa03a562e83-signing-cabundle\") pod \"service-ca-865cb79987-k54sh\" (UID: \"001792dd-53cc-42b6-9622-0aa03a562e83\") " pod="openshift-service-ca/service-ca-865cb79987-k54sh" Apr 24 21:29:19.348745 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.348706 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57q8p\" (UniqueName: \"kubernetes.io/projected/001792dd-53cc-42b6-9622-0aa03a562e83-kube-api-access-57q8p\") pod \"service-ca-865cb79987-k54sh\" (UID: \"001792dd-53cc-42b6-9622-0aa03a562e83\") " pod="openshift-service-ca/service-ca-865cb79987-k54sh" Apr 24 21:29:19.449539 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.449507 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57q8p\" (UniqueName: \"kubernetes.io/projected/001792dd-53cc-42b6-9622-0aa03a562e83-kube-api-access-57q8p\") pod \"service-ca-865cb79987-k54sh\" (UID: \"001792dd-53cc-42b6-9622-0aa03a562e83\") " 
pod="openshift-service-ca/service-ca-865cb79987-k54sh" Apr 24 21:29:19.449690 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.449570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/001792dd-53cc-42b6-9622-0aa03a562e83-signing-key\") pod \"service-ca-865cb79987-k54sh\" (UID: \"001792dd-53cc-42b6-9622-0aa03a562e83\") " pod="openshift-service-ca/service-ca-865cb79987-k54sh" Apr 24 21:29:19.449749 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.449716 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/001792dd-53cc-42b6-9622-0aa03a562e83-signing-cabundle\") pod \"service-ca-865cb79987-k54sh\" (UID: \"001792dd-53cc-42b6-9622-0aa03a562e83\") " pod="openshift-service-ca/service-ca-865cb79987-k54sh" Apr 24 21:29:19.450397 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.450355 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/001792dd-53cc-42b6-9622-0aa03a562e83-signing-cabundle\") pod \"service-ca-865cb79987-k54sh\" (UID: \"001792dd-53cc-42b6-9622-0aa03a562e83\") " pod="openshift-service-ca/service-ca-865cb79987-k54sh" Apr 24 21:29:19.451961 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.451939 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/001792dd-53cc-42b6-9622-0aa03a562e83-signing-key\") pod \"service-ca-865cb79987-k54sh\" (UID: \"001792dd-53cc-42b6-9622-0aa03a562e83\") " pod="openshift-service-ca/service-ca-865cb79987-k54sh" Apr 24 21:29:19.460696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.460670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57q8p\" (UniqueName: \"kubernetes.io/projected/001792dd-53cc-42b6-9622-0aa03a562e83-kube-api-access-57q8p\") pod 
\"service-ca-865cb79987-k54sh\" (UID: \"001792dd-53cc-42b6-9622-0aa03a562e83\") " pod="openshift-service-ca/service-ca-865cb79987-k54sh" Apr 24 21:29:19.550668 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.550593 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:19.550668 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.550628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:19.550806 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:19.550753 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:29:19.550806 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:19.550790 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle podName:ed5b5e5f-f500-4262-a6bb-2772c51e47b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:27.550771524 +0000 UTC m=+152.830619298 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle") pod "router-default-69dd467946-567hg" (UID: "ed5b5e5f-f500-4262-a6bb-2772c51e47b0") : configmap references non-existent config key: service-ca.crt Apr 24 21:29:19.550880 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:19.550816 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs podName:ed5b5e5f-f500-4262-a6bb-2772c51e47b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:27.550807434 +0000 UTC m=+152.830655191 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs") pod "router-default-69dd467946-567hg" (UID: "ed5b5e5f-f500-4262-a6bb-2772c51e47b0") : secret "router-metrics-certs-default" not found Apr 24 21:29:19.569037 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.569014 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-k54sh" Apr 24 21:29:19.651512 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.651478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wn6zb\" (UID: \"ac626c37-716a-43c4-bec8-97bf6f88f4c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" Apr 24 21:29:19.651674 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:19.651652 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:29:19.651747 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:19.651736 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls podName:ac626c37-716a-43c4-bec8-97bf6f88f4c7 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:27.651716097 +0000 UTC m=+152.931563859 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wn6zb" (UID: "ac626c37-716a-43c4-bec8-97bf6f88f4c7") : secret "samples-operator-tls" not found Apr 24 21:29:19.699729 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.699694 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-k54sh"] Apr 24 21:29:19.702758 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:19.702728 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod001792dd_53cc_42b6_9622_0aa03a562e83.slice/crio-500284f39479d4c4cf32d560e5cdc488cb5168c20033b7011f79191796aded80 WatchSource:0}: Error finding container 500284f39479d4c4cf32d560e5cdc488cb5168c20033b7011f79191796aded80: Status 404 returned error can't find the container with id 500284f39479d4c4cf32d560e5cdc488cb5168c20033b7011f79191796aded80 Apr 24 21:29:19.808542 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.808465 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-q9bbt_313a846d-a4f1-459e-b416-a695b875548d/node-ca/0.log" Apr 24 21:29:19.829736 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.829701 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-k54sh" event={"ID":"001792dd-53cc-42b6-9622-0aa03a562e83","Type":"ContainerStarted","Data":"427fa638eb798f0619c92883b7952ebcd64f0a991f8d8b16b065b585e22f82cf"} Apr 24 21:29:19.829736 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.829737 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-k54sh" event={"ID":"001792dd-53cc-42b6-9622-0aa03a562e83","Type":"ContainerStarted","Data":"500284f39479d4c4cf32d560e5cdc488cb5168c20033b7011f79191796aded80"} Apr 24 21:29:19.856837 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.856784 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-k54sh" podStartSLOduration=0.85677185 podStartE2EDuration="856.77185ms" podCreationTimestamp="2026-04-24 21:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:19.855784357 +0000 UTC m=+145.135632149" watchObservedRunningTime="2026-04-24 21:29:19.85677185 +0000 UTC m=+145.136619631" Apr 24 21:29:19.953386 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:19.953348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" Apr 24 21:29:19.953556 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:19.953444 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:19.953556 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:19.953459 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58cd6868bf-d5l8g: secret "image-registry-tls" not found Apr 24 21:29:19.953556 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:19.953503 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls podName:147a51c4-da64-463a-bc56-329ff8627c5d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:27.95349007 +0000 UTC m=+153.233337827 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls") pod "image-registry-58cd6868bf-d5l8g" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d") : secret "image-registry-tls" not found Apr 24 21:29:20.659578 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:20.659538 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" Apr 24 21:29:20.660063 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:20.659713 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:20.660063 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:20.659801 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls podName:6d3fdf4f-a1c2-4d88-9531-85052b8a2f90 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:28.659780758 +0000 UTC m=+153.939628530 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5fl99" (UID: "6d3fdf4f-a1c2-4d88-9531-85052b8a2f90") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:27.617769 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.617718 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:27.617769 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.617772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:27.618350 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.618329 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-service-ca-bundle\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:27.620018 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.620000 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed5b5e5f-f500-4262-a6bb-2772c51e47b0-metrics-certs\") pod \"router-default-69dd467946-567hg\" (UID: \"ed5b5e5f-f500-4262-a6bb-2772c51e47b0\") " 
pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:27.657249 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.657217 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:27.719198 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.718982 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wn6zb\" (UID: \"ac626c37-716a-43c4-bec8-97bf6f88f4c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" Apr 24 21:29:27.723358 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.723332 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac626c37-716a-43c4-bec8-97bf6f88f4c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wn6zb\" (UID: \"ac626c37-716a-43c4-bec8-97bf6f88f4c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" Apr 24 21:29:27.810718 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.810694 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-69dd467946-567hg"] Apr 24 21:29:27.813842 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:27.813812 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded5b5e5f_f500_4262_a6bb_2772c51e47b0.slice/crio-2d5fddaeae82b75a68c8f9f895bf481bffada1355f6096e929f83f00e185bc2d WatchSource:0}: Error finding container 2d5fddaeae82b75a68c8f9f895bf481bffada1355f6096e929f83f00e185bc2d: Status 404 returned error can't find the container with id 2d5fddaeae82b75a68c8f9f895bf481bffada1355f6096e929f83f00e185bc2d Apr 24 21:29:27.836901 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:29:27.836879 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" Apr 24 21:29:27.850690 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.850658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-69dd467946-567hg" event={"ID":"ed5b5e5f-f500-4262-a6bb-2772c51e47b0","Type":"ContainerStarted","Data":"2d5fddaeae82b75a68c8f9f895bf481bffada1355f6096e929f83f00e185bc2d"} Apr 24 21:29:27.965791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.965758 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb"] Apr 24 21:29:28.022769 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.022737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" Apr 24 21:29:28.024891 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.024871 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls\") pod \"image-registry-58cd6868bf-d5l8g\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" Apr 24 21:29:28.059989 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.059960 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" Apr 24 21:29:28.176806 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.176775 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-58cd6868bf-d5l8g"] Apr 24 21:29:28.179867 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:28.179841 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod147a51c4_da64_463a_bc56_329ff8627c5d.slice/crio-8ec38194914b308c91f9757f22aa12de7b1a5a3ef83f1294becc90a84b440ade WatchSource:0}: Error finding container 8ec38194914b308c91f9757f22aa12de7b1a5a3ef83f1294becc90a84b440ade: Status 404 returned error can't find the container with id 8ec38194914b308c91f9757f22aa12de7b1a5a3ef83f1294becc90a84b440ade Apr 24 21:29:28.730235 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.730192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" Apr 24 21:29:28.733027 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.733002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d3fdf4f-a1c2-4d88-9531-85052b8a2f90-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5fl99\" (UID: \"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" Apr 24 21:29:28.844886 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.844841 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" Apr 24 21:29:28.855692 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.855640 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-69dd467946-567hg" event={"ID":"ed5b5e5f-f500-4262-a6bb-2772c51e47b0","Type":"ContainerStarted","Data":"c72497eff1e5a0a7a6d0198fb86a4467efffbbb6341dcb5e5ccda44395516f17"} Apr 24 21:29:28.860789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.860760 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" event={"ID":"147a51c4-da64-463a-bc56-329ff8627c5d","Type":"ContainerStarted","Data":"81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c"} Apr 24 21:29:28.860938 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.860797 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" event={"ID":"147a51c4-da64-463a-bc56-329ff8627c5d","Type":"ContainerStarted","Data":"8ec38194914b308c91f9757f22aa12de7b1a5a3ef83f1294becc90a84b440ade"} Apr 24 21:29:28.861359 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.861338 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" Apr 24 21:29:28.862624 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.862590 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" event={"ID":"ac626c37-716a-43c4-bec8-97bf6f88f4c7","Type":"ContainerStarted","Data":"0dd75b48a26be30b8be4fa6d3645097ee5cf3ac68b4fb1dc170bcd5bada35dbb"} Apr 24 21:29:28.878455 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.878403 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-69dd467946-567hg" podStartSLOduration=17.878385008 
podStartE2EDuration="17.878385008s" podCreationTimestamp="2026-04-24 21:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:28.876160775 +0000 UTC m=+154.156008556" watchObservedRunningTime="2026-04-24 21:29:28.878385008 +0000 UTC m=+154.158232789" Apr 24 21:29:28.896058 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.895894 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" podStartSLOduration=16.895874681 podStartE2EDuration="16.895874681s" podCreationTimestamp="2026-04-24 21:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:28.895402201 +0000 UTC m=+154.175249984" watchObservedRunningTime="2026-04-24 21:29:28.895874681 +0000 UTC m=+154.175722463" Apr 24 21:29:28.986570 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.986486 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99"] Apr 24 21:29:28.989997 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:28.989899 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d3fdf4f_a1c2_4d88_9531_85052b8a2f90.slice/crio-3f817721eab4ec91212640a5fd4db22b9bb313d3865671e39c3f6c5e1c051537 WatchSource:0}: Error finding container 3f817721eab4ec91212640a5fd4db22b9bb313d3865671e39c3f6c5e1c051537: Status 404 returned error can't find the container with id 3f817721eab4ec91212640a5fd4db22b9bb313d3865671e39c3f6c5e1c051537 Apr 24 21:29:29.657662 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:29.657634 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:29.660411 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:29:29.660389 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:29.869601 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:29.869512 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" event={"ID":"ac626c37-716a-43c4-bec8-97bf6f88f4c7","Type":"ContainerStarted","Data":"df23f8a078b172de0893c94a0e93627b7cb9481843f85acd841488864593e6ce"} Apr 24 21:29:29.869601 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:29.869556 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" event={"ID":"ac626c37-716a-43c4-bec8-97bf6f88f4c7","Type":"ContainerStarted","Data":"1386dffda58f3f24f6cf53b5c36737f9e4e8fb07e362f5069e85bafe7dd336bf"} Apr 24 21:29:29.871363 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:29.871320 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" event={"ID":"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90","Type":"ContainerStarted","Data":"3f817721eab4ec91212640a5fd4db22b9bb313d3865671e39c3f6c5e1c051537"} Apr 24 21:29:29.872061 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:29.871610 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:29.873224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:29.873195 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-69dd467946-567hg" Apr 24 21:29:29.891834 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:29.891784 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wn6zb" podStartSLOduration=17.284826128 
podStartE2EDuration="18.891766933s" podCreationTimestamp="2026-04-24 21:29:11 +0000 UTC" firstStartedPulling="2026-04-24 21:29:28.006043982 +0000 UTC m=+153.285891739" lastFinishedPulling="2026-04-24 21:29:29.612984773 +0000 UTC m=+154.892832544" observedRunningTime="2026-04-24 21:29:29.890532456 +0000 UTC m=+155.170380238" watchObservedRunningTime="2026-04-24 21:29:29.891766933 +0000 UTC m=+155.171614715" Apr 24 21:29:30.874881 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:30.874845 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" event={"ID":"6d3fdf4f-a1c2-4d88-9531-85052b8a2f90","Type":"ContainerStarted","Data":"370f3145cac75adab199573f0e95b386657a5cceca8d8e08ea9ed9003b6f06e5"} Apr 24 21:29:30.895186 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:30.895138 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5fl99" podStartSLOduration=17.126791641 podStartE2EDuration="18.895123233s" podCreationTimestamp="2026-04-24 21:29:12 +0000 UTC" firstStartedPulling="2026-04-24 21:29:28.991806009 +0000 UTC m=+154.271653766" lastFinishedPulling="2026-04-24 21:29:30.760137596 +0000 UTC m=+156.039985358" observedRunningTime="2026-04-24 21:29:30.894037883 +0000 UTC m=+156.173885664" watchObservedRunningTime="2026-04-24 21:29:30.895123233 +0000 UTC m=+156.174971013" Apr 24 21:29:31.120314 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:31.120211 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-n6xn7" podUID="a8f3bdc0-c9cc-4161-9c81-77828c331c3b" Apr 24 21:29:31.134340 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:31.134307 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process 
volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6l98d" podUID="ff600116-8b92-45dd-8c1f-07b5c9151008" Apr 24 21:29:31.375085 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:31.375001 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jrhlr" podUID="932901de-5edd-4054-b5df-89077b36dd14" Apr 24 21:29:31.877760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:31.877715 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-n6xn7" Apr 24 21:29:35.992336 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:35.992295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7" Apr 24 21:29:35.994635 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:35.994613 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8f3bdc0-c9cc-4161-9c81-77828c331c3b-metrics-tls\") pod \"dns-default-n6xn7\" (UID: \"a8f3bdc0-c9cc-4161-9c81-77828c331c3b\") " pod="openshift-dns/dns-default-n6xn7" Apr 24 21:29:36.081229 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:36.081198 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mzt4d\"" Apr 24 21:29:36.089029 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:36.089007 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:29:36.092883 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:36.092861 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:29:36.095220 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:36.095193 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff600116-8b92-45dd-8c1f-07b5c9151008-cert\") pod \"ingress-canary-6l98d\" (UID: \"ff600116-8b92-45dd-8c1f-07b5c9151008\") " pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:29:36.231470 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:36.231410 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n6xn7"]
Apr 24 21:29:36.233523 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:36.233481 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8f3bdc0_c9cc_4161_9c81_77828c331c3b.slice/crio-f701f931ee6e7c23ea220415c467b351de9f29f05f9e1d91969210cb295ac634 WatchSource:0}: Error finding container f701f931ee6e7c23ea220415c467b351de9f29f05f9e1d91969210cb295ac634: Status 404 returned error can't find the container with id f701f931ee6e7c23ea220415c467b351de9f29f05f9e1d91969210cb295ac634
Apr 24 21:29:36.890393 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:36.890351 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n6xn7" event={"ID":"a8f3bdc0-c9cc-4161-9c81-77828c331c3b","Type":"ContainerStarted","Data":"f701f931ee6e7c23ea220415c467b351de9f29f05f9e1d91969210cb295ac634"}
Apr 24 21:29:37.898145 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:37.898052 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n6xn7" event={"ID":"a8f3bdc0-c9cc-4161-9c81-77828c331c3b","Type":"ContainerStarted","Data":"650b89d29c1b048f1be673d0a75b5a0fff4591300588dec1c0f650ab1867a04a"}
Apr 24 21:29:37.898145 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:37.898090 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n6xn7" event={"ID":"a8f3bdc0-c9cc-4161-9c81-77828c331c3b","Type":"ContainerStarted","Data":"8f5fa2264c4d8c6115b6b907615470644c2c18be40ecce4260476ff5336fa8a7"}
Apr 24 21:29:37.898619 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:37.898208 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-n6xn7"
Apr 24 21:29:37.916153 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:37.916095 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-n6xn7" podStartSLOduration=128.625870949 podStartE2EDuration="2m9.916079271s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:29:36.235440385 +0000 UTC m=+161.515288143" lastFinishedPulling="2026-04-24 21:29:37.525648696 +0000 UTC m=+162.805496465" observedRunningTime="2026-04-24 21:29:37.915937931 +0000 UTC m=+163.195785710" watchObservedRunningTime="2026-04-24 21:29:37.916079271 +0000 UTC m=+163.195927052"
Apr 24 21:29:40.608033 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.607999 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl"]
Apr 24 21:29:40.611069 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.611050 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl"
Apr 24 21:29:40.611615 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.611594 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xwrsb"]
Apr 24 21:29:40.614286 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.614266 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 24 21:29:40.614716 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.614701 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-sbrdj\""
Apr 24 21:29:40.614888 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.614871 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.617802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.617764 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 21:29:40.617802 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:40.617776 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"insights-runtime-extractor-tls\" is forbidden: User \"system:node:ip-10-0-136-201.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-136-201.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" type="*v1.Secret"
Apr 24 21:29:40.617984 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.617861 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-b67gj\""
Apr 24 21:29:40.626592 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.626575 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-data-volume\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.626686 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.626600 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j86fl\" (UniqueName: \"kubernetes.io/projected/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-kube-api-access-j86fl\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.626686 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.626638 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6cab315a-9130-4a94-88ff-6ef8e5291d77-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ztbnl\" (UID: \"6cab315a-9130-4a94-88ff-6ef8e5291d77\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl"
Apr 24 21:29:40.626785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.626707 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.626785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.626758 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.626785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.626779 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-crio-socket\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.636564 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.636535 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl"]
Apr 24 21:29:40.652759 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.652732 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-58cd6868bf-d5l8g"]
Apr 24 21:29:40.654487 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.654457 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xwrsb"]
Apr 24 21:29:40.727558 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.727529 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-data-volume\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.727558 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.727559 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j86fl\" (UniqueName: \"kubernetes.io/projected/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-kube-api-access-j86fl\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.727773 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.727583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6cab315a-9130-4a94-88ff-6ef8e5291d77-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ztbnl\" (UID: \"6cab315a-9130-4a94-88ff-6ef8e5291d77\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl"
Apr 24 21:29:40.727773 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.727601 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.727773 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.727632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.727773 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.727671 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-crio-socket\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.727773 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.727769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-crio-socket\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.728046 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.727881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-data-volume\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.728301 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.728279 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.729911 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.729892 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6cab315a-9130-4a94-88ff-6ef8e5291d77-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ztbnl\" (UID: \"6cab315a-9130-4a94-88ff-6ef8e5291d77\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl"
Apr 24 21:29:40.755057 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.755028 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j86fl\" (UniqueName: \"kubernetes.io/projected/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-kube-api-access-j86fl\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:40.766651 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.766631 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-869bd4699f-pxgbj"]
Apr 24 21:29:40.769864 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.769850 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.788594 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.788572 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-869bd4699f-pxgbj"]
Apr 24 21:29:40.828195 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.828167 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24138133-faf2-4505-9acb-d85e76bf96d3-ca-trust-extracted\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.828319 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.828208 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24138133-faf2-4505-9acb-d85e76bf96d3-installation-pull-secrets\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.828319 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.828230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/24138133-faf2-4505-9acb-d85e76bf96d3-image-registry-private-configuration\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.828319 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.828285 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8p67\" (UniqueName: \"kubernetes.io/projected/24138133-faf2-4505-9acb-d85e76bf96d3-kube-api-access-m8p67\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.828319 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.828303 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24138133-faf2-4505-9acb-d85e76bf96d3-registry-tls\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.828468 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.828323 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24138133-faf2-4505-9acb-d85e76bf96d3-bound-sa-token\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.828468 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.828374 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24138133-faf2-4505-9acb-d85e76bf96d3-trusted-ca\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.828468 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.828428 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24138133-faf2-4505-9acb-d85e76bf96d3-registry-certificates\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.922132 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.922046 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl"
Apr 24 21:29:40.929153 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.929125 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24138133-faf2-4505-9acb-d85e76bf96d3-installation-pull-secrets\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.929287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.929159 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/24138133-faf2-4505-9acb-d85e76bf96d3-image-registry-private-configuration\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.929287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.929203 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8p67\" (UniqueName: \"kubernetes.io/projected/24138133-faf2-4505-9acb-d85e76bf96d3-kube-api-access-m8p67\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.929287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.929232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24138133-faf2-4505-9acb-d85e76bf96d3-registry-tls\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.929287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.929264 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24138133-faf2-4505-9acb-d85e76bf96d3-bound-sa-token\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.929556 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.929307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24138133-faf2-4505-9acb-d85e76bf96d3-trusted-ca\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.929556 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.929355 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24138133-faf2-4505-9acb-d85e76bf96d3-registry-certificates\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.929556 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.929391 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24138133-faf2-4505-9acb-d85e76bf96d3-ca-trust-extracted\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.929813 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.929760 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24138133-faf2-4505-9acb-d85e76bf96d3-ca-trust-extracted\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.930628 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.930605 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24138133-faf2-4505-9acb-d85e76bf96d3-registry-certificates\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.930835 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.930807 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24138133-faf2-4505-9acb-d85e76bf96d3-trusted-ca\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.932110 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.932091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24138133-faf2-4505-9acb-d85e76bf96d3-installation-pull-secrets\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.932191 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.932118 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/24138133-faf2-4505-9acb-d85e76bf96d3-image-registry-private-configuration\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.932711 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.932693 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24138133-faf2-4505-9acb-d85e76bf96d3-registry-tls\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.941492 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.941466 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8p67\" (UniqueName: \"kubernetes.io/projected/24138133-faf2-4505-9acb-d85e76bf96d3-kube-api-access-m8p67\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:40.941803 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:40.941773 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24138133-faf2-4505-9acb-d85e76bf96d3-bound-sa-token\") pod \"image-registry-869bd4699f-pxgbj\" (UID: \"24138133-faf2-4505-9acb-d85e76bf96d3\") " pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:41.058198 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:41.058000 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl"]
Apr 24 21:29:41.060051 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:41.060027 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cab315a_9130_4a94_88ff_6ef8e5291d77.slice/crio-945054820ec04c3adfde5cc656dbf01ed510e79b5e0b6bd6321b3eb563bab27a WatchSource:0}: Error finding container 945054820ec04c3adfde5cc656dbf01ed510e79b5e0b6bd6321b3eb563bab27a: Status 404 returned error can't find the container with id 945054820ec04c3adfde5cc656dbf01ed510e79b5e0b6bd6321b3eb563bab27a
Apr 24 21:29:41.078839 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:41.078817 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:41.212345 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:41.212266 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-869bd4699f-pxgbj"]
Apr 24 21:29:41.215718 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:41.215688 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24138133_faf2_4505_9acb_d85e76bf96d3.slice/crio-28c1fa275376c050da57a8bbd0f7112b42bd22f21cc52822aefa75e5b066eed1 WatchSource:0}: Error finding container 28c1fa275376c050da57a8bbd0f7112b42bd22f21cc52822aefa75e5b066eed1: Status 404 returned error can't find the container with id 28c1fa275376c050da57a8bbd0f7112b42bd22f21cc52822aefa75e5b066eed1
Apr 24 21:29:41.727800 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:41.727764 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: failed to sync secret cache: timed out waiting for the condition
Apr 24 21:29:41.728272 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:41.727860 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-insights-runtime-extractor-tls podName:511a2fb0-bcb1-4164-92c0-072aaaa01cf3 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:42.227841159 +0000 UTC m=+167.507688937 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xwrsb" (UID: "511a2fb0-bcb1-4164-92c0-072aaaa01cf3") : failed to sync secret cache: timed out waiting for the condition
Apr 24 21:29:41.791155 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:41.791118 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 21:29:41.910842 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:41.910801 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-869bd4699f-pxgbj" event={"ID":"24138133-faf2-4505-9acb-d85e76bf96d3","Type":"ContainerStarted","Data":"1fedd9bf0c752b3d41d2040d64f17bbd2c7057ce7d5be0a639f1d77970af764a"}
Apr 24 21:29:41.910842 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:41.910846 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-869bd4699f-pxgbj" event={"ID":"24138133-faf2-4505-9acb-d85e76bf96d3","Type":"ContainerStarted","Data":"28c1fa275376c050da57a8bbd0f7112b42bd22f21cc52822aefa75e5b066eed1"}
Apr 24 21:29:41.911100 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:41.910958 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-869bd4699f-pxgbj"
Apr 24 21:29:41.912071 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:41.912042 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl" event={"ID":"6cab315a-9130-4a94-88ff-6ef8e5291d77","Type":"ContainerStarted","Data":"945054820ec04c3adfde5cc656dbf01ed510e79b5e0b6bd6321b3eb563bab27a"}
Apr 24 21:29:41.978533 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:41.978433 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-869bd4699f-pxgbj" podStartSLOduration=1.9784174220000001 podStartE2EDuration="1.978417422s" podCreationTimestamp="2026-04-24 21:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:41.974694443 +0000 UTC m=+167.254542254" watchObservedRunningTime="2026-04-24 21:29:41.978417422 +0000 UTC m=+167.258265201"
Apr 24 21:29:42.242208 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:42.242121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:42.244371 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:42.244338 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/511a2fb0-bcb1-4164-92c0-072aaaa01cf3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xwrsb\" (UID: \"511a2fb0-bcb1-4164-92c0-072aaaa01cf3\") " pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:42.426606 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:42.426574 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xwrsb"
Apr 24 21:29:42.549063 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:42.549031 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xwrsb"]
Apr 24 21:29:42.552259 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:42.552232 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod511a2fb0_bcb1_4164_92c0_072aaaa01cf3.slice/crio-04c19d9e398f579c137998cb203716f832902068e7f0aeeac89e3da199a491f9 WatchSource:0}: Error finding container 04c19d9e398f579c137998cb203716f832902068e7f0aeeac89e3da199a491f9: Status 404 returned error can't find the container with id 04c19d9e398f579c137998cb203716f832902068e7f0aeeac89e3da199a491f9
Apr 24 21:29:42.916664 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:42.916623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl" event={"ID":"6cab315a-9130-4a94-88ff-6ef8e5291d77","Type":"ContainerStarted","Data":"b0acf8d687a66770dcbc8b9855b84567d2e029a237e050dabaf58217b17b09a0"}
Apr 24 21:29:42.917234 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:42.917013 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl"
Apr 24 21:29:42.918320 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:42.918281 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xwrsb" event={"ID":"511a2fb0-bcb1-4164-92c0-072aaaa01cf3","Type":"ContainerStarted","Data":"d0e6484304ce8db11c4c3818720017b0e9d2c8b08da838f0a18fec7932211882"}
Apr 24 21:29:42.918464 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:42.918320 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xwrsb" event={"ID":"511a2fb0-bcb1-4164-92c0-072aaaa01cf3","Type":"ContainerStarted","Data":"04c19d9e398f579c137998cb203716f832902068e7f0aeeac89e3da199a491f9"}
Apr 24 21:29:42.923294 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:42.923266 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl"
Apr 24 21:29:42.933376 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:42.933300 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ztbnl" podStartSLOduration=1.8455756239999999 podStartE2EDuration="2.933284997s" podCreationTimestamp="2026-04-24 21:29:40 +0000 UTC" firstStartedPulling="2026-04-24 21:29:41.061875313 +0000 UTC m=+166.341723070" lastFinishedPulling="2026-04-24 21:29:42.149584686 +0000 UTC m=+167.429432443" observedRunningTime="2026-04-24 21:29:42.93322536 +0000 UTC m=+168.213073153" watchObservedRunningTime="2026-04-24 21:29:42.933284997 +0000 UTC m=+168.213132778"
Apr 24 21:29:43.922673 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:43.922573 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xwrsb" event={"ID":"511a2fb0-bcb1-4164-92c0-072aaaa01cf3","Type":"ContainerStarted","Data":"c6944b46b3687670725384f12b3756798c16bddbe05132b02d933fba10bbc060"}
Apr 24 21:29:44.364124 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:44.364090 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:29:44.926998 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:44.926954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xwrsb" event={"ID":"511a2fb0-bcb1-4164-92c0-072aaaa01cf3","Type":"ContainerStarted","Data":"0420bfc500881fd315cf5ce5a38c6dabb03bd4f85ac7169ce6cd32d4c18e41ec"}
Apr 24 21:29:44.948557 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:44.948511 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xwrsb" podStartSLOduration=3.14701418 podStartE2EDuration="4.948497428s" podCreationTimestamp="2026-04-24 21:29:40 +0000 UTC" firstStartedPulling="2026-04-24 21:29:42.610400535 +0000 UTC m=+167.890248301" lastFinishedPulling="2026-04-24 21:29:44.411883791 +0000 UTC m=+169.691731549" observedRunningTime="2026-04-24 21:29:44.946743948 +0000 UTC m=+170.226591765" watchObservedRunningTime="2026-04-24 21:29:44.948497428 +0000 UTC m=+170.228345271"
Apr 24 21:29:45.368894 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:45.367705 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:29:45.370230 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:45.370209 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-42hwh\""
Apr 24 21:29:45.378095 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:45.378081 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6l98d"
Apr 24 21:29:45.492687 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:45.492653 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6l98d"]
Apr 24 21:29:45.495336 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:45.495304 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff600116_8b92_45dd_8c1f_07b5c9151008.slice/crio-cbf7ab55b108875120cffc61059f27c00d9b1122e4295db8d8eee728cea61945 WatchSource:0}: Error finding container cbf7ab55b108875120cffc61059f27c00d9b1122e4295db8d8eee728cea61945: Status 404 returned error can't find the container with id cbf7ab55b108875120cffc61059f27c00d9b1122e4295db8d8eee728cea61945
Apr 24 21:29:45.930809 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:45.930768 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6l98d" event={"ID":"ff600116-8b92-45dd-8c1f-07b5c9151008","Type":"ContainerStarted","Data":"cbf7ab55b108875120cffc61059f27c00d9b1122e4295db8d8eee728cea61945"}
Apr 24 21:29:47.827459 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.827430 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9t7t2"]
Apr 24 21:29:47.830639 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.830620 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9t7t2"
Apr 24 21:29:47.833761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.833736 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-stmx8\""
Apr 24 21:29:47.833869 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.833762 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 21:29:47.833869 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.833738 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 21:29:47.834099 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.834073 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 21:29:47.834358 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.834342 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 21:29:47.887375 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.887344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2"
Apr 24 21:29:47.887535 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.887407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmb8g\" (UniqueName: \"kubernetes.io/projected/499e5975-74b1-4afc-9a86-a012675aa62d-kube-api-access-dmb8g\") pod \"node-exporter-9t7t2\" (UID:
\"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.887535 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.887443 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-accelerators-collector-config\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.887535 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.887478 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/499e5975-74b1-4afc-9a86-a012675aa62d-root\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.887535 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.887510 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-wtmp\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.887740 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.887568 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-textfile\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.887740 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.887604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/499e5975-74b1-4afc-9a86-a012675aa62d-metrics-client-ca\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.887740 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.887630 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/499e5975-74b1-4afc-9a86-a012675aa62d-sys\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.887740 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.887655 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-tls\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.902686 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.902664 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-n6xn7" Apr 24 21:29:47.938674 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.938641 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6l98d" event={"ID":"ff600116-8b92-45dd-8c1f-07b5c9151008","Type":"ContainerStarted","Data":"e189762e14aa536f00e7ff4f4d1b5a7ba598d3abdee32a719d24820be12c9d0e"} Apr 24 21:29:47.963261 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.963211 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6l98d" podStartSLOduration=138.258033681 podStartE2EDuration="2m19.963196736s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 
21:29:45.497198962 +0000 UTC m=+170.777046720" lastFinishedPulling="2026-04-24 21:29:47.202362002 +0000 UTC m=+172.482209775" observedRunningTime="2026-04-24 21:29:47.9621927 +0000 UTC m=+173.242040484" watchObservedRunningTime="2026-04-24 21:29:47.963196736 +0000 UTC m=+173.243044516" Apr 24 21:29:47.988811 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.988778 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-textfile\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.989001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.988840 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/499e5975-74b1-4afc-9a86-a012675aa62d-metrics-client-ca\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.989001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.988869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/499e5975-74b1-4afc-9a86-a012675aa62d-sys\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.989001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.988906 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-tls\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.989001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.988989 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.989001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.988989 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/499e5975-74b1-4afc-9a86-a012675aa62d-sys\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.989259 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:47.989245 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 21:29:47.989338 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:47.989317 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-tls podName:499e5975-74b1-4afc-9a86-a012675aa62d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:48.489296513 +0000 UTC m=+173.769144276 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-tls") pod "node-exporter-9t7t2" (UID: "499e5975-74b1-4afc-9a86-a012675aa62d") : secret "node-exporter-tls" not found Apr 24 21:29:47.989649 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.989626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/499e5975-74b1-4afc-9a86-a012675aa62d-metrics-client-ca\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.989803 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.989644 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmb8g\" (UniqueName: \"kubernetes.io/projected/499e5975-74b1-4afc-9a86-a012675aa62d-kube-api-access-dmb8g\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.990037 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.989701 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-textfile\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.990037 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.989906 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-accelerators-collector-config\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.990037 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:29:47.989993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/499e5975-74b1-4afc-9a86-a012675aa62d-root\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.990037 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.990030 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-wtmp\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.990313 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.990098 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/499e5975-74b1-4afc-9a86-a012675aa62d-root\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.990313 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.990170 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-wtmp\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.990460 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.990390 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-accelerators-collector-config\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:47.991557 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:47.991534 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:48.002145 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.002122 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmb8g\" (UniqueName: \"kubernetes.io/projected/499e5975-74b1-4afc-9a86-a012675aa62d-kube-api-access-dmb8g\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:48.495244 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.495207 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-tls\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:48.497471 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.497446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/499e5975-74b1-4afc-9a86-a012675aa62d-node-exporter-tls\") pod \"node-exporter-9t7t2\" (UID: \"499e5975-74b1-4afc-9a86-a012675aa62d\") " pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:48.740246 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.740206 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9t7t2" Apr 24 21:29:48.861270 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.861230 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:29:48.866706 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.866682 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.869606 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.869584 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 21:29:48.869718 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.869610 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 21:29:48.869718 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.869589 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 21:29:48.869718 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.869630 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 21:29:48.870170 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.870011 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-94xsz\"" Apr 24 21:29:48.870170 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.870059 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 21:29:48.870170 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.870069 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 21:29:48.870170 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.870126 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 21:29:48.870170 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.870163 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 21:29:48.870478 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.870402 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 21:29:48.873793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.873757 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:29:48.899445 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e893786-b818-4a95-b817-be883b718d44-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.899605 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899475 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.899605 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899530 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e893786-b818-4a95-b817-be883b718d44-config-out\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.899605 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899568 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.899605 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899596 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.899793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3e893786-b818-4a95-b817-be883b718d44-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.899793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e893786-b818-4a95-b817-be883b718d44-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.899793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899713 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z44w8\" (UniqueName: \"kubernetes.io/projected/3e893786-b818-4a95-b817-be883b718d44-kube-api-access-z44w8\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.899793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899743 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e893786-b818-4a95-b817-be883b718d44-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.899950 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899801 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-web-config\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.899950 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899857 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.899950 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899885 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.900063 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.899974 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-config-volume\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:48.942993 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:48.942954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9t7t2" event={"ID":"499e5975-74b1-4afc-9a86-a012675aa62d","Type":"ContainerStarted","Data":"184a8e1df9c8c7e2e47fd0b925a20191816ff02a4fb1adb5600ace1b4b587fd6"} Apr 24 21:29:49.001345 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.001320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.001527 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.001354 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.001527 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.001401 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-config-volume\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.001637 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.001588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e893786-b818-4a95-b817-be883b718d44-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.001637 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.001616 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.001754 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.001652 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e893786-b818-4a95-b817-be883b718d44-config-out\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.001754 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.001698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.001754 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:29:49.001728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.001897 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.001762 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3e893786-b818-4a95-b817-be883b718d44-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.001897 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.001796 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e893786-b818-4a95-b817-be883b718d44-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.001897 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.001826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z44w8\" (UniqueName: \"kubernetes.io/projected/3e893786-b818-4a95-b817-be883b718d44-kube-api-access-z44w8\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.001897 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.001859 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e893786-b818-4a95-b817-be883b718d44-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.002124 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.001896 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-web-config\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.004045 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.002417 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e893786-b818-4a95-b817-be883b718d44-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.004045 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.002668 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3e893786-b818-4a95-b817-be883b718d44-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.004045 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.003660 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e893786-b818-4a95-b817-be883b718d44-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.004960 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.004790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.005055 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.004973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.005055 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.005044 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-web-config\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.005512 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.005462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e893786-b818-4a95-b817-be883b718d44-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.005512 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.005476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.006226 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.006193 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-config-volume\") pod \"alertmanager-main-0\" 
(UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.006512 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.006492 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.006589 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.006553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e893786-b818-4a95-b817-be883b718d44-config-out\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.006589 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.006581 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.010460 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.010441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z44w8\" (UniqueName: \"kubernetes.io/projected/3e893786-b818-4a95-b817-be883b718d44-kube-api-access-z44w8\") pod \"alertmanager-main-0\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.178407 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.178372 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:49.324027 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.324000 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:29:49.326680 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:29:49.326647 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e893786_b818_4a95_b817_be883b718d44.slice/crio-853c34b4281956781390d459180efff38f8bdd1cfb0fa9efd28b07498e23fd68 WatchSource:0}: Error finding container 853c34b4281956781390d459180efff38f8bdd1cfb0fa9efd28b07498e23fd68: Status 404 returned error can't find the container with id 853c34b4281956781390d459180efff38f8bdd1cfb0fa9efd28b07498e23fd68 Apr 24 21:29:49.947120 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:49.947083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerStarted","Data":"853c34b4281956781390d459180efff38f8bdd1cfb0fa9efd28b07498e23fd68"} Apr 24 21:29:50.658201 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:50.658170 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" Apr 24 21:29:50.951657 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:50.951569 2573 generic.go:358] "Generic (PLEG): container finished" podID="499e5975-74b1-4afc-9a86-a012675aa62d" containerID="e19409c5752a5632043ec0440b5f691545af9392b77d5b2b26141aa1f6a6b215" exitCode=0 Apr 24 21:29:50.952133 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:50.951659 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9t7t2" event={"ID":"499e5975-74b1-4afc-9a86-a012675aa62d","Type":"ContainerDied","Data":"e19409c5752a5632043ec0440b5f691545af9392b77d5b2b26141aa1f6a6b215"} Apr 24 
21:29:50.953105 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:50.953071 2573 generic.go:358] "Generic (PLEG): container finished" podID="3e893786-b818-4a95-b817-be883b718d44" containerID="52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384" exitCode=0 Apr 24 21:29:50.953212 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:50.953109 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerDied","Data":"52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384"} Apr 24 21:29:51.958432 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:51.958395 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9t7t2" event={"ID":"499e5975-74b1-4afc-9a86-a012675aa62d","Type":"ContainerStarted","Data":"284b0faad2f0d887a880e7db774b608b3085a57acc50a7f5d5ac9e787a56176a"} Apr 24 21:29:51.958432 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:51.958437 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9t7t2" event={"ID":"499e5975-74b1-4afc-9a86-a012675aa62d","Type":"ContainerStarted","Data":"8b32445882e376114d485db06be5459973735336a853b697b8641cbddd6e3743"} Apr 24 21:29:51.983445 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:51.983388 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9t7t2" podStartSLOduration=3.593719432 podStartE2EDuration="4.983373958s" podCreationTimestamp="2026-04-24 21:29:47 +0000 UTC" firstStartedPulling="2026-04-24 21:29:48.755475006 +0000 UTC m=+174.035322765" lastFinishedPulling="2026-04-24 21:29:50.145129529 +0000 UTC m=+175.424977291" observedRunningTime="2026-04-24 21:29:51.982435719 +0000 UTC m=+177.262283503" watchObservedRunningTime="2026-04-24 21:29:51.983373958 +0000 UTC m=+177.263221738" Apr 24 21:29:52.964968 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:52.964931 
2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerStarted","Data":"8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06"} Apr 24 21:29:52.965380 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:52.964977 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerStarted","Data":"eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d"} Apr 24 21:29:52.965380 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:52.964991 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerStarted","Data":"8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2"} Apr 24 21:29:52.965380 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:52.965000 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerStarted","Data":"228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b"} Apr 24 21:29:52.965380 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:52.965011 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerStarted","Data":"77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401"} Apr 24 21:29:53.970351 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:53.970316 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerStarted","Data":"7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4"} Apr 24 21:29:54.000345 ip-10-0-136-201 kubenswrapper[2573]: I0424 
21:29:54.000286 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.248354493 podStartE2EDuration="6.000270815s" podCreationTimestamp="2026-04-24 21:29:48 +0000 UTC" firstStartedPulling="2026-04-24 21:29:49.32905169 +0000 UTC m=+174.608899451" lastFinishedPulling="2026-04-24 21:29:53.08096801 +0000 UTC m=+178.360815773" observedRunningTime="2026-04-24 21:29:54.000088642 +0000 UTC m=+179.279936437" watchObservedRunningTime="2026-04-24 21:29:54.000270815 +0000 UTC m=+179.280118594" Apr 24 21:30:02.922049 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:02.922020 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-869bd4699f-pxgbj" Apr 24 21:30:05.672821 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:05.672763 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" podUID="147a51c4-da64-463a-bc56-329ff8627c5d" containerName="registry" containerID="cri-o://81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c" gracePeriod=30 Apr 24 21:30:05.904475 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:05.904453 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" Apr 24 21:30:06.002491 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.002458 2573 generic.go:358] "Generic (PLEG): container finished" podID="147a51c4-da64-463a-bc56-329ff8627c5d" containerID="81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c" exitCode=0 Apr 24 21:30:06.002667 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.002519 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" Apr 24 21:30:06.002667 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.002543 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" event={"ID":"147a51c4-da64-463a-bc56-329ff8627c5d","Type":"ContainerDied","Data":"81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c"} Apr 24 21:30:06.002667 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.002583 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58cd6868bf-d5l8g" event={"ID":"147a51c4-da64-463a-bc56-329ff8627c5d","Type":"ContainerDied","Data":"8ec38194914b308c91f9757f22aa12de7b1a5a3ef83f1294becc90a84b440ade"} Apr 24 21:30:06.002667 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.002603 2573 scope.go:117] "RemoveContainer" containerID="81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c" Apr 24 21:30:06.014424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.014398 2573 scope.go:117] "RemoveContainer" containerID="81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c" Apr 24 21:30:06.014690 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:30:06.014667 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c\": container with ID starting with 81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c not found: ID does not exist" containerID="81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c" Apr 24 21:30:06.014747 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.014700 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c"} err="failed to get container status 
\"81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c\": rpc error: code = NotFound desc = could not find container \"81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c\": container with ID starting with 81969aedc176e9ac92522d8a2b20d5dca42362ae8dc6e533d8a54c2c845cea4c not found: ID does not exist" Apr 24 21:30:06.057051 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.057020 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/147a51c4-da64-463a-bc56-329ff8627c5d-image-registry-private-configuration\") pod \"147a51c4-da64-463a-bc56-329ff8627c5d\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " Apr 24 21:30:06.057182 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.057065 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls\") pod \"147a51c4-da64-463a-bc56-329ff8627c5d\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " Apr 24 21:30:06.057182 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.057096 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9zb4\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-kube-api-access-f9zb4\") pod \"147a51c4-da64-463a-bc56-329ff8627c5d\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " Apr 24 21:30:06.057182 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.057126 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-bound-sa-token\") pod \"147a51c4-da64-463a-bc56-329ff8627c5d\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " Apr 24 21:30:06.057182 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.057156 2573 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/147a51c4-da64-463a-bc56-329ff8627c5d-trusted-ca\") pod \"147a51c4-da64-463a-bc56-329ff8627c5d\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " Apr 24 21:30:06.057182 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.057171 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/147a51c4-da64-463a-bc56-329ff8627c5d-registry-certificates\") pod \"147a51c4-da64-463a-bc56-329ff8627c5d\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " Apr 24 21:30:06.057413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.057208 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/147a51c4-da64-463a-bc56-329ff8627c5d-ca-trust-extracted\") pod \"147a51c4-da64-463a-bc56-329ff8627c5d\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " Apr 24 21:30:06.057413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.057271 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/147a51c4-da64-463a-bc56-329ff8627c5d-installation-pull-secrets\") pod \"147a51c4-da64-463a-bc56-329ff8627c5d\" (UID: \"147a51c4-da64-463a-bc56-329ff8627c5d\") " Apr 24 21:30:06.057643 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.057614 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/147a51c4-da64-463a-bc56-329ff8627c5d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "147a51c4-da64-463a-bc56-329ff8627c5d" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:06.057777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.057747 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/147a51c4-da64-463a-bc56-329ff8627c5d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "147a51c4-da64-463a-bc56-329ff8627c5d" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:06.059636 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.059603 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147a51c4-da64-463a-bc56-329ff8627c5d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "147a51c4-da64-463a-bc56-329ff8627c5d" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:06.059731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.059625 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "147a51c4-da64-463a-bc56-329ff8627c5d" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:06.059788 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.059749 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-kube-api-access-f9zb4" (OuterVolumeSpecName: "kube-api-access-f9zb4") pod "147a51c4-da64-463a-bc56-329ff8627c5d" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d"). InnerVolumeSpecName "kube-api-access-f9zb4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:06.059788 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.059763 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147a51c4-da64-463a-bc56-329ff8627c5d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "147a51c4-da64-463a-bc56-329ff8627c5d" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:06.059893 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.059842 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "147a51c4-da64-463a-bc56-329ff8627c5d" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:06.065641 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.065615 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/147a51c4-da64-463a-bc56-329ff8627c5d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "147a51c4-da64-463a-bc56-329ff8627c5d" (UID: "147a51c4-da64-463a-bc56-329ff8627c5d"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:30:06.158804 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.158767 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/147a51c4-da64-463a-bc56-329ff8627c5d-image-registry-private-configuration\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:30:06.158804 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.158799 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-registry-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:30:06.159011 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.158816 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f9zb4\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-kube-api-access-f9zb4\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:30:06.159011 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.158830 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/147a51c4-da64-463a-bc56-329ff8627c5d-bound-sa-token\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:30:06.159011 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.158842 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/147a51c4-da64-463a-bc56-329ff8627c5d-trusted-ca\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:30:06.159011 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.158854 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/147a51c4-da64-463a-bc56-329ff8627c5d-registry-certificates\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 
21:30:06.159011 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.158866 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/147a51c4-da64-463a-bc56-329ff8627c5d-ca-trust-extracted\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:30:06.159011 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.158879 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/147a51c4-da64-463a-bc56-329ff8627c5d-installation-pull-secrets\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:30:06.325562 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.325530 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-58cd6868bf-d5l8g"] Apr 24 21:30:06.327507 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:06.327487 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-58cd6868bf-d5l8g"] Apr 24 21:30:07.367586 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:07.367551 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="147a51c4-da64-463a-bc56-329ff8627c5d" path="/var/lib/kubelet/pods/147a51c4-da64-463a-bc56-329ff8627c5d/volumes" Apr 24 21:30:08.093958 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.093898 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-jrtmd"] Apr 24 21:30:08.094273 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.094258 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="147a51c4-da64-463a-bc56-329ff8627c5d" containerName="registry" Apr 24 21:30:08.094336 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.094276 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="147a51c4-da64-463a-bc56-329ff8627c5d" containerName="registry" Apr 24 21:30:08.094373 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 21:30:08.094345 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="147a51c4-da64-463a-bc56-329ff8627c5d" containerName="registry" Apr 24 21:30:08.099083 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.099062 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-jrtmd" Apr 24 21:30:08.101910 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.101891 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:30:08.102048 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.101966 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:30:08.102100 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.102055 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-5s2jj\"" Apr 24 21:30:08.108551 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.108527 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-jrtmd"] Apr 24 21:30:08.173643 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.173597 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbm2h\" (UniqueName: \"kubernetes.io/projected/c85fb73a-d28f-47c4-8a91-b0890eced33b-kube-api-access-vbm2h\") pod \"downloads-6bcc868b7-jrtmd\" (UID: \"c85fb73a-d28f-47c4-8a91-b0890eced33b\") " pod="openshift-console/downloads-6bcc868b7-jrtmd" Apr 24 21:30:08.275131 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.275093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbm2h\" (UniqueName: \"kubernetes.io/projected/c85fb73a-d28f-47c4-8a91-b0890eced33b-kube-api-access-vbm2h\") pod \"downloads-6bcc868b7-jrtmd\" (UID: \"c85fb73a-d28f-47c4-8a91-b0890eced33b\") 
" pod="openshift-console/downloads-6bcc868b7-jrtmd" Apr 24 21:30:08.287562 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.287529 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbm2h\" (UniqueName: \"kubernetes.io/projected/c85fb73a-d28f-47c4-8a91-b0890eced33b-kube-api-access-vbm2h\") pod \"downloads-6bcc868b7-jrtmd\" (UID: \"c85fb73a-d28f-47c4-8a91-b0890eced33b\") " pod="openshift-console/downloads-6bcc868b7-jrtmd" Apr 24 21:30:08.409339 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.409238 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-jrtmd" Apr 24 21:30:08.545160 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:08.545129 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-jrtmd"] Apr 24 21:30:08.547863 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:30:08.547833 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc85fb73a_d28f_47c4_8a91_b0890eced33b.slice/crio-e971273c4655fe5135d47ff35b65f001c7dc31f8e2f49b14dac3175158012d3c WatchSource:0}: Error finding container e971273c4655fe5135d47ff35b65f001c7dc31f8e2f49b14dac3175158012d3c: Status 404 returned error can't find the container with id e971273c4655fe5135d47ff35b65f001c7dc31f8e2f49b14dac3175158012d3c Apr 24 21:30:09.015748 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:09.015707 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-jrtmd" event={"ID":"c85fb73a-d28f-47c4-8a91-b0890eced33b","Type":"ContainerStarted","Data":"e971273c4655fe5135d47ff35b65f001c7dc31f8e2f49b14dac3175158012d3c"} Apr 24 21:30:14.182390 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.182358 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-965bb6f75-dsbwk"] Apr 24 21:30:14.185527 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:30:14.185499 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.187971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.187904 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 21:30:14.188099 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.187987 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 21:30:14.189013 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.188810 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 21:30:14.189013 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.188832 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 21:30:14.189013 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.188845 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 21:30:14.189013 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.188874 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-fpjg5\"" Apr 24 21:30:14.195829 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.195808 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-965bb6f75-dsbwk"] Apr 24 21:30:14.335305 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.335267 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b828e5eb-8475-4d63-8481-b50fc2a13370-console-oauth-config\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " 
pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.335489 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.335329 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-console-config\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.335489 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.335356 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-oauth-serving-cert\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.335489 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.335454 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-service-ca\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.335645 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.335491 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b828e5eb-8475-4d63-8481-b50fc2a13370-console-serving-cert\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.335645 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.335538 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdqvt\" (UniqueName: 
\"kubernetes.io/projected/b828e5eb-8475-4d63-8481-b50fc2a13370-kube-api-access-zdqvt\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.436905 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.436747 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdqvt\" (UniqueName: \"kubernetes.io/projected/b828e5eb-8475-4d63-8481-b50fc2a13370-kube-api-access-zdqvt\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.436905 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.436808 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b828e5eb-8475-4d63-8481-b50fc2a13370-console-oauth-config\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.436905 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.436856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-console-config\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.436905 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.436874 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-oauth-serving-cert\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.437281 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.436933 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-service-ca\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.437281 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.436966 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b828e5eb-8475-4d63-8481-b50fc2a13370-console-serving-cert\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.437694 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.437658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-oauth-serving-cert\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.437694 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.437670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-service-ca\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.438279 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.438247 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-console-config\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.439653 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.439629 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b828e5eb-8475-4d63-8481-b50fc2a13370-console-oauth-config\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.439771 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.439736 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b828e5eb-8475-4d63-8481-b50fc2a13370-console-serving-cert\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.446176 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.446137 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdqvt\" (UniqueName: \"kubernetes.io/projected/b828e5eb-8475-4d63-8481-b50fc2a13370-kube-api-access-zdqvt\") pod \"console-965bb6f75-dsbwk\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.497811 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.497781 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:14.644555 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:14.644530 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-965bb6f75-dsbwk"] Apr 24 21:30:14.646880 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:30:14.646849 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb828e5eb_8475_4d63_8481_b50fc2a13370.slice/crio-1786f7d406a718703954dd37ff4d4ae90ffdf9d6f7cd5fc3ce2e6200306cfa6c WatchSource:0}: Error finding container 1786f7d406a718703954dd37ff4d4ae90ffdf9d6f7cd5fc3ce2e6200306cfa6c: Status 404 returned error can't find the container with id 1786f7d406a718703954dd37ff4d4ae90ffdf9d6f7cd5fc3ce2e6200306cfa6c Apr 24 21:30:15.037095 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:15.037047 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-965bb6f75-dsbwk" event={"ID":"b828e5eb-8475-4d63-8481-b50fc2a13370","Type":"ContainerStarted","Data":"1786f7d406a718703954dd37ff4d4ae90ffdf9d6f7cd5fc3ce2e6200306cfa6c"} Apr 24 21:30:24.599498 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.599417 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68f8758848-qqpg2"] Apr 24 21:30:24.604064 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.604042 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.624342 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.624304 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68f8758848-qqpg2"] Apr 24 21:30:24.625836 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.625697 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 21:30:24.638882 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.638848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/050ba746-02c0-4456-888d-5d244d48df3b-console-oauth-config\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.639056 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.638890 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-console-config\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.639056 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.638943 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-oauth-serving-cert\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.639056 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.638995 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-trusted-ca-bundle\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.639212 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.639120 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45pf6\" (UniqueName: \"kubernetes.io/projected/050ba746-02c0-4456-888d-5d244d48df3b-kube-api-access-45pf6\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.639212 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.639163 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-service-ca\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.639212 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.639189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/050ba746-02c0-4456-888d-5d244d48df3b-console-serving-cert\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.740610 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.740572 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/050ba746-02c0-4456-888d-5d244d48df3b-console-oauth-config\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.740818 ip-10-0-136-201 kubenswrapper[2573]: I0424 
21:30:24.740621 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-console-config\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.740818 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.740650 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-oauth-serving-cert\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.740818 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.740669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-trusted-ca-bundle\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.741015 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.740888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45pf6\" (UniqueName: \"kubernetes.io/projected/050ba746-02c0-4456-888d-5d244d48df3b-kube-api-access-45pf6\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.741015 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.740955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-service-ca\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 
24 21:30:24.741015 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.740992 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/050ba746-02c0-4456-888d-5d244d48df3b-console-serving-cert\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.741526 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.741498 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-console-config\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.741657 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.741544 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-oauth-serving-cert\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.741724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.741658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-trusted-ca-bundle\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.741806 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.741781 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-service-ca\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " 
pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.743425 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.743401 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/050ba746-02c0-4456-888d-5d244d48df3b-console-oauth-config\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.743788 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.743761 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/050ba746-02c0-4456-888d-5d244d48df3b-console-serving-cert\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.754806 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.754781 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45pf6\" (UniqueName: \"kubernetes.io/projected/050ba746-02c0-4456-888d-5d244d48df3b-kube-api-access-45pf6\") pod \"console-68f8758848-qqpg2\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:24.913823 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:24.913723 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:25.062812 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:25.062773 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68f8758848-qqpg2"] Apr 24 21:30:25.066594 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:30:25.066555 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod050ba746_02c0_4456_888d_5d244d48df3b.slice/crio-0f184f9da294473c9a749a2c9606fcd166b597d6f8bab46b1214b8414b6d6359 WatchSource:0}: Error finding container 0f184f9da294473c9a749a2c9606fcd166b597d6f8bab46b1214b8414b6d6359: Status 404 returned error can't find the container with id 0f184f9da294473c9a749a2c9606fcd166b597d6f8bab46b1214b8414b6d6359 Apr 24 21:30:25.068998 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:25.068953 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-jrtmd" event={"ID":"c85fb73a-d28f-47c4-8a91-b0890eced33b","Type":"ContainerStarted","Data":"3d8743c746d52d3ae92c1572fa21172485f7303106976e6c6e511a728b22da67"} Apr 24 21:30:25.069177 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:25.069159 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-jrtmd" Apr 24 21:30:25.070574 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:25.070533 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-965bb6f75-dsbwk" event={"ID":"b828e5eb-8475-4d63-8481-b50fc2a13370","Type":"ContainerStarted","Data":"ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86"} Apr 24 21:30:25.079411 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:25.079385 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-jrtmd" Apr 24 21:30:25.098127 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:25.098070 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-jrtmd" podStartSLOduration=1.312153772 podStartE2EDuration="17.098051443s" podCreationTimestamp="2026-04-24 21:30:08 +0000 UTC" firstStartedPulling="2026-04-24 21:30:08.549746833 +0000 UTC m=+193.829594598" lastFinishedPulling="2026-04-24 21:30:24.335644496 +0000 UTC m=+209.615492269" observedRunningTime="2026-04-24 21:30:25.095658498 +0000 UTC m=+210.375506279" watchObservedRunningTime="2026-04-24 21:30:25.098051443 +0000 UTC m=+210.377899224" Apr 24 21:30:25.140416 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:25.140369 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-965bb6f75-dsbwk" podStartSLOduration=1.495349185 podStartE2EDuration="11.140350382s" podCreationTimestamp="2026-04-24 21:30:14 +0000 UTC" firstStartedPulling="2026-04-24 21:30:14.649148284 +0000 UTC m=+199.928996045" lastFinishedPulling="2026-04-24 21:30:24.294149477 +0000 UTC m=+209.573997242" observedRunningTime="2026-04-24 21:30:25.118631597 +0000 UTC m=+210.398479376" watchObservedRunningTime="2026-04-24 21:30:25.140350382 +0000 UTC m=+210.420198164" Apr 24 21:30:26.075117 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:26.075067 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f8758848-qqpg2" event={"ID":"050ba746-02c0-4456-888d-5d244d48df3b","Type":"ContainerStarted","Data":"04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af"} Apr 24 21:30:26.075559 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:26.075134 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f8758848-qqpg2" event={"ID":"050ba746-02c0-4456-888d-5d244d48df3b","Type":"ContainerStarted","Data":"0f184f9da294473c9a749a2c9606fcd166b597d6f8bab46b1214b8414b6d6359"} Apr 24 21:30:26.098472 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:26.098419 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68f8758848-qqpg2" podStartSLOduration=2.0984023450000002 podStartE2EDuration="2.098402345s" podCreationTimestamp="2026-04-24 21:30:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:26.095998784 +0000 UTC m=+211.375846578" watchObservedRunningTime="2026-04-24 21:30:26.098402345 +0000 UTC m=+211.378250126" Apr 24 21:30:34.498090 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:34.498054 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:34.498090 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:34.498099 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:34.502792 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:34.502770 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:34.914316 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:34.914273 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:34.914492 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:34.914330 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:34.919205 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:34.919179 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:35.106900 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:35.106871 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:30:35.107329 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:30:35.107306 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:30:35.175799 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:35.175732 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-965bb6f75-dsbwk"] Apr 24 21:30:37.109063 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:37.109024 2573 generic.go:358] "Generic (PLEG): container finished" podID="fae90115-d8d4-4eb9-be87-cee9f9fded69" containerID="475ada3e97fcba48e05999940a5a0c5bc57242da5977279596d60070554e3cc4" exitCode=0 Apr 24 21:30:37.109495 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:37.109101 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n" event={"ID":"fae90115-d8d4-4eb9-be87-cee9f9fded69","Type":"ContainerDied","Data":"475ada3e97fcba48e05999940a5a0c5bc57242da5977279596d60070554e3cc4"} Apr 24 21:30:37.109495 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:37.109481 2573 scope.go:117] "RemoveContainer" containerID="475ada3e97fcba48e05999940a5a0c5bc57242da5977279596d60070554e3cc4" Apr 24 21:30:37.110370 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:37.110351 2573 generic.go:358] "Generic (PLEG): container finished" podID="73ca8074-c925-4c71-a52a-9bdc355c56df" containerID="697e6540bcd223a77864e4569f6bc2a75065b8fb89dd2f648f770eb8a4528605" exitCode=0 Apr 24 21:30:37.110450 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:37.110428 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lm6jb" event={"ID":"73ca8074-c925-4c71-a52a-9bdc355c56df","Type":"ContainerDied","Data":"697e6540bcd223a77864e4569f6bc2a75065b8fb89dd2f648f770eb8a4528605"} Apr 24 21:30:37.110846 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:37.110830 2573 scope.go:117] "RemoveContainer" 
containerID="697e6540bcd223a77864e4569f6bc2a75065b8fb89dd2f648f770eb8a4528605" Apr 24 21:30:37.929254 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:37.929221 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-n6xn7_a8f3bdc0-c9cc-4161-9c81-77828c331c3b/dns/0.log" Apr 24 21:30:37.938030 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:37.938009 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-n6xn7_a8f3bdc0-c9cc-4161-9c81-77828c331c3b/kube-rbac-proxy/0.log" Apr 24 21:30:38.116307 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:38.116267 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lm6jb" event={"ID":"73ca8074-c925-4c71-a52a-9bdc355c56df","Type":"ContainerStarted","Data":"481b0b2328270387a3b4e2002530ef389a5424097810d8913b775cad8cade2d7"} Apr 24 21:30:38.118971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:38.118945 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h7k6n" event={"ID":"fae90115-d8d4-4eb9-be87-cee9f9fded69","Type":"ContainerStarted","Data":"39435c0986c978c0f8ce3b700459c30e45a1fa8ca2257b500b12dedb6d683625"} Apr 24 21:30:38.481796 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:38.481765 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zvvmj_fc418569-4514-4b49-bd55-839ecdb097d5/dns-node-resolver/0.log" Apr 24 21:30:42.132379 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:42.132343 2573 generic.go:358] "Generic (PLEG): container finished" podID="a11ab57b-145c-4043-bbce-507e3d1017ec" containerID="190194cc9405c0ed57495edbb859d80201e19f508cabe049b891a3b447c8a0cb" exitCode=0 Apr 24 21:30:42.132776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:42.132430 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd" event={"ID":"a11ab57b-145c-4043-bbce-507e3d1017ec","Type":"ContainerDied","Data":"190194cc9405c0ed57495edbb859d80201e19f508cabe049b891a3b447c8a0cb"} Apr 24 21:30:42.132824 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:42.132781 2573 scope.go:117] "RemoveContainer" containerID="190194cc9405c0ed57495edbb859d80201e19f508cabe049b891a3b447c8a0cb" Apr 24 21:30:43.139521 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:30:43.139486 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-7twhd" event={"ID":"a11ab57b-145c-4043-bbce-507e3d1017ec","Type":"ContainerStarted","Data":"6fb48417145dde8358e03e2d5941c785f3accec0f12f1d56ada89a60c4d39361"} Apr 24 21:31:02.133576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.133536 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-965bb6f75-dsbwk" podUID="b828e5eb-8475-4d63-8481-b50fc2a13370" containerName="console" containerID="cri-o://ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86" gracePeriod=15 Apr 24 21:31:02.405054 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.405032 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-965bb6f75-dsbwk_b828e5eb-8475-4d63-8481-b50fc2a13370/console/0.log" Apr 24 21:31:02.405165 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.405093 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-965bb6f75-dsbwk" Apr 24 21:31:02.489131 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.489101 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-console-config\") pod \"b828e5eb-8475-4d63-8481-b50fc2a13370\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " Apr 24 21:31:02.489298 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.489147 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdqvt\" (UniqueName: \"kubernetes.io/projected/b828e5eb-8475-4d63-8481-b50fc2a13370-kube-api-access-zdqvt\") pod \"b828e5eb-8475-4d63-8481-b50fc2a13370\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " Apr 24 21:31:02.489298 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.489174 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b828e5eb-8475-4d63-8481-b50fc2a13370-console-oauth-config\") pod \"b828e5eb-8475-4d63-8481-b50fc2a13370\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " Apr 24 21:31:02.489298 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.489221 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b828e5eb-8475-4d63-8481-b50fc2a13370-console-serving-cert\") pod \"b828e5eb-8475-4d63-8481-b50fc2a13370\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " Apr 24 21:31:02.489462 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.489307 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-service-ca\") pod \"b828e5eb-8475-4d63-8481-b50fc2a13370\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") " Apr 24 21:31:02.489462 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.489404 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-oauth-serving-cert\") pod \"b828e5eb-8475-4d63-8481-b50fc2a13370\" (UID: \"b828e5eb-8475-4d63-8481-b50fc2a13370\") "
Apr 24 21:31:02.489710 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.489680 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-service-ca" (OuterVolumeSpecName: "service-ca") pod "b828e5eb-8475-4d63-8481-b50fc2a13370" (UID: "b828e5eb-8475-4d63-8481-b50fc2a13370"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:02.489710 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.489693 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-console-config" (OuterVolumeSpecName: "console-config") pod "b828e5eb-8475-4d63-8481-b50fc2a13370" (UID: "b828e5eb-8475-4d63-8481-b50fc2a13370"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:02.489860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.489729 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b828e5eb-8475-4d63-8481-b50fc2a13370" (UID: "b828e5eb-8475-4d63-8481-b50fc2a13370"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:02.491473 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.491451 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b828e5eb-8475-4d63-8481-b50fc2a13370-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b828e5eb-8475-4d63-8481-b50fc2a13370" (UID: "b828e5eb-8475-4d63-8481-b50fc2a13370"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:02.491727 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.491706 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b828e5eb-8475-4d63-8481-b50fc2a13370-kube-api-access-zdqvt" (OuterVolumeSpecName: "kube-api-access-zdqvt") pod "b828e5eb-8475-4d63-8481-b50fc2a13370" (UID: "b828e5eb-8475-4d63-8481-b50fc2a13370"). InnerVolumeSpecName "kube-api-access-zdqvt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:31:02.491727 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.491713 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b828e5eb-8475-4d63-8481-b50fc2a13370-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b828e5eb-8475-4d63-8481-b50fc2a13370" (UID: "b828e5eb-8475-4d63-8481-b50fc2a13370"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:02.590342 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.590303 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdqvt\" (UniqueName: \"kubernetes.io/projected/b828e5eb-8475-4d63-8481-b50fc2a13370-kube-api-access-zdqvt\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:02.590342 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.590335 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b828e5eb-8475-4d63-8481-b50fc2a13370-console-oauth-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:02.590342 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.590345 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b828e5eb-8475-4d63-8481-b50fc2a13370-console-serving-cert\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:02.590565 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.590354 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-service-ca\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:02.590565 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.590363 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-oauth-serving-cert\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:02.590565 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:02.590372 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b828e5eb-8475-4d63-8481-b50fc2a13370-console-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:03.198370 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:03.198341 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-965bb6f75-dsbwk_b828e5eb-8475-4d63-8481-b50fc2a13370/console/0.log"
Apr 24 21:31:03.198829 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:03.198402 2573 generic.go:358] "Generic (PLEG): container finished" podID="b828e5eb-8475-4d63-8481-b50fc2a13370" containerID="ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86" exitCode=2
Apr 24 21:31:03.198829 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:03.198464 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-965bb6f75-dsbwk"
Apr 24 21:31:03.198829 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:03.198478 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-965bb6f75-dsbwk" event={"ID":"b828e5eb-8475-4d63-8481-b50fc2a13370","Type":"ContainerDied","Data":"ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86"}
Apr 24 21:31:03.198829 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:03.198521 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-965bb6f75-dsbwk" event={"ID":"b828e5eb-8475-4d63-8481-b50fc2a13370","Type":"ContainerDied","Data":"1786f7d406a718703954dd37ff4d4ae90ffdf9d6f7cd5fc3ce2e6200306cfa6c"}
Apr 24 21:31:03.198829 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:03.198541 2573 scope.go:117] "RemoveContainer" containerID="ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86"
Apr 24 21:31:03.206746 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:03.206731 2573 scope.go:117] "RemoveContainer" containerID="ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86"
Apr 24 21:31:03.207009 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:31:03.206987 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86\": container with ID starting with ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86 not found: ID does not exist" containerID="ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86"
Apr 24 21:31:03.207113 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:03.207015 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86"} err="failed to get container status \"ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86\": rpc error: code = NotFound desc = could not find container \"ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86\": container with ID starting with ef1f2e40b38fd74a5dfb8ef32f409504fe7c07c04183cf831a44582d87bc2f86 not found: ID does not exist"
Apr 24 21:31:03.219537 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:03.219509 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-965bb6f75-dsbwk"]
Apr 24 21:31:03.223299 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:03.223275 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-965bb6f75-dsbwk"]
Apr 24 21:31:03.367466 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:03.367435 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b828e5eb-8475-4d63-8481-b50fc2a13370" path="/var/lib/kubelet/pods/b828e5eb-8475-4d63-8481-b50fc2a13370/volumes"
Apr 24 21:31:06.121524 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:06.121481 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:31:06.123733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:06.123708 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932901de-5edd-4054-b5df-89077b36dd14-metrics-certs\") pod \"network-metrics-daemon-jrhlr\" (UID: \"932901de-5edd-4054-b5df-89077b36dd14\") " pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:31:06.267697 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:06.267664 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5kts7\""
Apr 24 21:31:06.275733 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:06.275709 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jrhlr"
Apr 24 21:31:06.399847 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:06.399821 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jrhlr"]
Apr 24 21:31:06.402251 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:31:06.402224 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod932901de_5edd_4054_b5df_89077b36dd14.slice/crio-2c65fa1383c0737d00125c10fc92a7edeabf98cd1a513fc03fc30b8570aed188 WatchSource:0}: Error finding container 2c65fa1383c0737d00125c10fc92a7edeabf98cd1a513fc03fc30b8570aed188: Status 404 returned error can't find the container with id 2c65fa1383c0737d00125c10fc92a7edeabf98cd1a513fc03fc30b8570aed188
Apr 24 21:31:07.215316 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:07.215276 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jrhlr" event={"ID":"932901de-5edd-4054-b5df-89077b36dd14","Type":"ContainerStarted","Data":"2c65fa1383c0737d00125c10fc92a7edeabf98cd1a513fc03fc30b8570aed188"}
Apr 24 21:31:08.220105 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:08.220066 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jrhlr" event={"ID":"932901de-5edd-4054-b5df-89077b36dd14","Type":"ContainerStarted","Data":"f9091239c93fcf9f26eab17dbbb575d4054338680095ef4b033751378fffa94a"}
Apr 24 21:31:08.220479 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:08.220111 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jrhlr" event={"ID":"932901de-5edd-4054-b5df-89077b36dd14","Type":"ContainerStarted","Data":"224cfce409e5c9801fee0fa3971a4b4d0bedff2f62563cfd9f5fe42afca00f6c"}
Apr 24 21:31:08.230701 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:08.230673 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:31:08.231130 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:08.231091 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="alertmanager" containerID="cri-o://77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401" gracePeriod=120
Apr 24 21:31:08.231130 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:08.231109 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="kube-rbac-proxy" containerID="cri-o://eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d" gracePeriod=120
Apr 24 21:31:08.231318 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:08.231109 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="kube-rbac-proxy-web" containerID="cri-o://8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2" gracePeriod=120
Apr 24 21:31:08.231318 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:08.231146 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="config-reloader" containerID="cri-o://228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b" gracePeriod=120
Apr 24 21:31:08.231318 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:08.231167 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="kube-rbac-proxy-metric" containerID="cri-o://8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06" gracePeriod=120
Apr 24 21:31:08.231318 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:08.231131 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="prom-label-proxy" containerID="cri-o://7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4" gracePeriod=120
Apr 24 21:31:08.264087 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:08.264032 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jrhlr" podStartSLOduration=251.860176701 podStartE2EDuration="4m13.264015209s" podCreationTimestamp="2026-04-24 21:26:55 +0000 UTC" firstStartedPulling="2026-04-24 21:31:06.403945306 +0000 UTC m=+251.683793065" lastFinishedPulling="2026-04-24 21:31:07.807783812 +0000 UTC m=+253.087631573" observedRunningTime="2026-04-24 21:31:08.262860035 +0000 UTC m=+253.542707815" watchObservedRunningTime="2026-04-24 21:31:08.264015209 +0000 UTC m=+253.543862989"
Apr 24 21:31:09.225452 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.225416 2573 generic.go:358] "Generic (PLEG): container finished" podID="3e893786-b818-4a95-b817-be883b718d44" containerID="7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4" exitCode=0
Apr 24 21:31:09.225452 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.225443 2573 generic.go:358] "Generic (PLEG): container finished" podID="3e893786-b818-4a95-b817-be883b718d44" containerID="eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d" exitCode=0
Apr 24 21:31:09.225452 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.225450 2573 generic.go:358] "Generic (PLEG): container finished" podID="3e893786-b818-4a95-b817-be883b718d44" containerID="228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b" exitCode=0
Apr 24 21:31:09.225452 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.225456 2573 generic.go:358] "Generic (PLEG): container finished" podID="3e893786-b818-4a95-b817-be883b718d44" containerID="77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401" exitCode=0
Apr 24 21:31:09.225452 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.225447 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerDied","Data":"7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4"}
Apr 24 21:31:09.225968 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.225482 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerDied","Data":"eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d"}
Apr 24 21:31:09.225968 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.225493 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerDied","Data":"228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b"}
Apr 24 21:31:09.225968 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.225502 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerDied","Data":"77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401"}
Apr 24 21:31:09.472207 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.472182 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:09.551197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551170 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy-metric\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.551197 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551201 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-main-tls\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.551437 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551221 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-cluster-tls-config\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.551437 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551237 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-config-volume\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.551437 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551258 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e893786-b818-4a95-b817-be883b718d44-config-out\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.551437 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551293 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-web-config\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.551437 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551318 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.551437 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551360 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e893786-b818-4a95-b817-be883b718d44-metrics-client-ca\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.551437 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551407 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e893786-b818-4a95-b817-be883b718d44-tls-assets\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.551776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551456 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy-web\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.551776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551489 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e893786-b818-4a95-b817-be883b718d44-alertmanager-trusted-ca-bundle\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.551776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551545 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3e893786-b818-4a95-b817-be883b718d44-alertmanager-main-db\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.551776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551577 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z44w8\" (UniqueName: \"kubernetes.io/projected/3e893786-b818-4a95-b817-be883b718d44-kube-api-access-z44w8\") pod \"3e893786-b818-4a95-b817-be883b718d44\" (UID: \"3e893786-b818-4a95-b817-be883b718d44\") "
Apr 24 21:31:09.552024 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.551890 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e893786-b818-4a95-b817-be883b718d44-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:09.553388 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.553356 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e893786-b818-4a95-b817-be883b718d44-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:31:09.553605 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.553579 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e893786-b818-4a95-b817-be883b718d44-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:09.554527 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.554496 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e893786-b818-4a95-b817-be883b718d44-kube-api-access-z44w8" (OuterVolumeSpecName: "kube-api-access-z44w8") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "kube-api-access-z44w8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:31:09.554997 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.554900 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:09.555106 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.555013 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:09.555106 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.555069 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e893786-b818-4a95-b817-be883b718d44-config-out" (OuterVolumeSpecName: "config-out") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:31:09.555106 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.555072 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-config-volume" (OuterVolumeSpecName: "config-volume") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:09.555345 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.555314 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:09.555869 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.555845 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:09.556673 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.556650 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e893786-b818-4a95-b817-be883b718d44-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:31:09.559582 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.559561 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:09.565790 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.565767 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-web-config" (OuterVolumeSpecName: "web-config") pod "3e893786-b818-4a95-b817-be883b718d44" (UID: "3e893786-b818-4a95-b817-be883b718d44"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:09.653163 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653132 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.653163 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653159 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-main-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.653163 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653170 2573 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-cluster-tls-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.653383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653180 2573 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-config-volume\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.653383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653188 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e893786-b818-4a95-b817-be883b718d44-config-out\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.653383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653196 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-web-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.653383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653205 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.653383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653215 2573 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e893786-b818-4a95-b817-be883b718d44-metrics-client-ca\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.653383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653223 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e893786-b818-4a95-b817-be883b718d44-tls-assets\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.653383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653233 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3e893786-b818-4a95-b817-be883b718d44-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.653383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653243 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e893786-b818-4a95-b817-be883b718d44-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.653383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653253 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3e893786-b818-4a95-b817-be883b718d44-alertmanager-main-db\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:09.653383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:09.653262 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z44w8\" (UniqueName: \"kubernetes.io/projected/3e893786-b818-4a95-b817-be883b718d44-kube-api-access-z44w8\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:31:10.230955 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.230901 2573 generic.go:358] "Generic (PLEG): container finished" podID="3e893786-b818-4a95-b817-be883b718d44" containerID="8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06" exitCode=0
Apr 24 21:31:10.230955 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.230946 2573 generic.go:358] "Generic (PLEG): container finished" podID="3e893786-b818-4a95-b817-be883b718d44" containerID="8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2" exitCode=0
Apr 24 21:31:10.231444 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.230974 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerDied","Data":"8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06"}
Apr 24 21:31:10.231444 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.231010 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerDied","Data":"8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2"}
Apr 24 21:31:10.231444 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.231025 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e893786-b818-4a95-b817-be883b718d44","Type":"ContainerDied","Data":"853c34b4281956781390d459180efff38f8bdd1cfb0fa9efd28b07498e23fd68"}
Apr 24 21:31:10.231444 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.231042 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:10.231444 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.231052 2573 scope.go:117] "RemoveContainer" containerID="7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4"
Apr 24 21:31:10.238594 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.238575 2573 scope.go:117] "RemoveContainer" containerID="8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06"
Apr 24 21:31:10.245640 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.245622 2573 scope.go:117] "RemoveContainer" containerID="eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d"
Apr 24 21:31:10.251973 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.251957 2573 scope.go:117] "RemoveContainer" containerID="8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2"
Apr 24 21:31:10.258429 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.258407 2573 scope.go:117] "RemoveContainer" containerID="228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b"
Apr 24 21:31:10.258798 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.258776 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:31:10.265280 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.265200 2573 scope.go:117] "RemoveContainer" containerID="77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401"
Apr 24 21:31:10.266727 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.266697 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:31:10.272619 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.272603 2573 scope.go:117] "RemoveContainer" containerID="52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384"
Apr 24 21:31:10.278659 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.278645 2573 scope.go:117] "RemoveContainer" containerID="7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4"
Apr 24 21:31:10.278933 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:31:10.278900 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4\": container with ID starting with 7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4 not found: ID does not exist" containerID="7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4"
Apr 24 21:31:10.278986 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.278937 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4"} err="failed to get container status \"7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4\": rpc error: code = NotFound desc = could not find container \"7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4\": container with ID starting with 7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4 not found: ID does not exist"
Apr 24 21:31:10.278986 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.278958 2573 scope.go:117] "RemoveContainer" containerID="8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06"
Apr 24 21:31:10.279214 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:31:10.279196 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06\": container with ID starting with 8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06 not found: ID does not exist" containerID="8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06"
Apr 24 21:31:10.279262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.279219 2573 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06"} err="failed to get container status \"8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06\": rpc error: code = NotFound desc = could not find container \"8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06\": container with ID starting with 8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06 not found: ID does not exist" Apr 24 21:31:10.279262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.279235 2573 scope.go:117] "RemoveContainer" containerID="eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d" Apr 24 21:31:10.279450 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:31:10.279436 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d\": container with ID starting with eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d not found: ID does not exist" containerID="eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d" Apr 24 21:31:10.279496 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.279453 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d"} err="failed to get container status \"eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d\": rpc error: code = NotFound desc = could not find container \"eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d\": container with ID starting with eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d not found: ID does not exist" Apr 24 21:31:10.279496 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.279464 2573 scope.go:117] "RemoveContainer" containerID="8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2" Apr 24 21:31:10.279651 
ip-10-0-136-201 kubenswrapper[2573]: E0424 21:31:10.279636 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2\": container with ID starting with 8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2 not found: ID does not exist" containerID="8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2" Apr 24 21:31:10.279685 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.279654 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2"} err="failed to get container status \"8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2\": rpc error: code = NotFound desc = could not find container \"8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2\": container with ID starting with 8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2 not found: ID does not exist" Apr 24 21:31:10.279685 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.279665 2573 scope.go:117] "RemoveContainer" containerID="228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b" Apr 24 21:31:10.279849 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:31:10.279834 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b\": container with ID starting with 228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b not found: ID does not exist" containerID="228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b" Apr 24 21:31:10.279888 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.279852 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b"} err="failed to get container status \"228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b\": rpc error: code = NotFound desc = could not find container \"228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b\": container with ID starting with 228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b not found: ID does not exist" Apr 24 21:31:10.279888 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.279866 2573 scope.go:117] "RemoveContainer" containerID="77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401" Apr 24 21:31:10.280082 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:31:10.280066 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401\": container with ID starting with 77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401 not found: ID does not exist" containerID="77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401" Apr 24 21:31:10.280127 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.280087 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401"} err="failed to get container status \"77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401\": rpc error: code = NotFound desc = could not find container \"77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401\": container with ID starting with 77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401 not found: ID does not exist" Apr 24 21:31:10.280127 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.280108 2573 scope.go:117] "RemoveContainer" containerID="52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384" Apr 24 21:31:10.280335 ip-10-0-136-201 
kubenswrapper[2573]: E0424 21:31:10.280319 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384\": container with ID starting with 52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384 not found: ID does not exist" containerID="52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384" Apr 24 21:31:10.280390 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.280338 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384"} err="failed to get container status \"52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384\": rpc error: code = NotFound desc = could not find container \"52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384\": container with ID starting with 52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384 not found: ID does not exist" Apr 24 21:31:10.280390 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.280362 2573 scope.go:117] "RemoveContainer" containerID="7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4" Apr 24 21:31:10.280604 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.280586 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4"} err="failed to get container status \"7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4\": rpc error: code = NotFound desc = could not find container \"7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4\": container with ID starting with 7b609a25797d6cb816a4e4f6ed89ed5f99c55c2790eb834ea886f1a1684c47c4 not found: ID does not exist" Apr 24 21:31:10.280649 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.280605 2573 scope.go:117] "RemoveContainer" 
containerID="8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06" Apr 24 21:31:10.280829 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.280810 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06"} err="failed to get container status \"8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06\": rpc error: code = NotFound desc = could not find container \"8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06\": container with ID starting with 8adf562058d379bce124addce23c079209c3757a9af23022b5675af624dbeb06 not found: ID does not exist" Apr 24 21:31:10.280879 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.280830 2573 scope.go:117] "RemoveContainer" containerID="eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d" Apr 24 21:31:10.281057 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.281040 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d"} err="failed to get container status \"eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d\": rpc error: code = NotFound desc = could not find container \"eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d\": container with ID starting with eb3e8df794b2fdb5dff25bc80d7ef3cecac95db91cad4196b809271475b6fe5d not found: ID does not exist" Apr 24 21:31:10.281115 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.281057 2573 scope.go:117] "RemoveContainer" containerID="8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2" Apr 24 21:31:10.281227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.281213 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2"} err="failed to get container status 
\"8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2\": rpc error: code = NotFound desc = could not find container \"8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2\": container with ID starting with 8325446880eefc479b0e1cbae20c9bcdc570c7aecaeb45ae795e8645788c02c2 not found: ID does not exist" Apr 24 21:31:10.281270 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.281226 2573 scope.go:117] "RemoveContainer" containerID="228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b" Apr 24 21:31:10.281404 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.281388 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b"} err="failed to get container status \"228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b\": rpc error: code = NotFound desc = could not find container \"228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b\": container with ID starting with 228b2601f11f1a7448a03fca5dc9197ecb614ca91431f3088c6f9356a9b8639b not found: ID does not exist" Apr 24 21:31:10.281444 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.281404 2573 scope.go:117] "RemoveContainer" containerID="77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401" Apr 24 21:31:10.281631 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.281613 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401"} err="failed to get container status \"77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401\": rpc error: code = NotFound desc = could not find container \"77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401\": container with ID starting with 77eb983406332cf95520b2773469798689bc667fab351057f968de6b3117b401 not found: ID does not exist" Apr 24 21:31:10.281681 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:31:10.281632 2573 scope.go:117] "RemoveContainer" containerID="52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384" Apr 24 21:31:10.281862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.281845 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384"} err="failed to get container status \"52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384\": rpc error: code = NotFound desc = could not find container \"52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384\": container with ID starting with 52640768bd94be7a43950aed5658d34624e41a4bb6a559bac8f2e4bdd4026384 not found: ID does not exist" Apr 24 21:31:10.306770 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.306744 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:31:10.307069 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307056 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="prom-label-proxy" Apr 24 21:31:10.307116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307071 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="prom-label-proxy" Apr 24 21:31:10.307116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307086 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="kube-rbac-proxy" Apr 24 21:31:10.307116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307092 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="kube-rbac-proxy" Apr 24 21:31:10.307116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307101 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="kube-rbac-proxy-web" Apr 24 21:31:10.307116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307107 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="kube-rbac-proxy-web" Apr 24 21:31:10.307262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307116 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="config-reloader" Apr 24 21:31:10.307262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307125 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="config-reloader" Apr 24 21:31:10.307262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307136 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b828e5eb-8475-4d63-8481-b50fc2a13370" containerName="console" Apr 24 21:31:10.307262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307141 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b828e5eb-8475-4d63-8481-b50fc2a13370" containerName="console" Apr 24 21:31:10.307262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307149 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="init-config-reloader" Apr 24 21:31:10.307262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307154 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="init-config-reloader" Apr 24 21:31:10.307262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307164 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="kube-rbac-proxy-metric" Apr 24 21:31:10.307262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307169 2573 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="kube-rbac-proxy-metric" Apr 24 21:31:10.307262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307177 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="alertmanager" Apr 24 21:31:10.307262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307182 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="alertmanager" Apr 24 21:31:10.307262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307231 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="prom-label-proxy" Apr 24 21:31:10.307262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307240 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="kube-rbac-proxy-metric" Apr 24 21:31:10.307584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307270 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="kube-rbac-proxy-web" Apr 24 21:31:10.307584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307281 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="alertmanager" Apr 24 21:31:10.307584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307290 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="config-reloader" Apr 24 21:31:10.307584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307297 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e893786-b818-4a95-b817-be883b718d44" containerName="kube-rbac-proxy" Apr 24 21:31:10.307584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.307304 
2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b828e5eb-8475-4d63-8481-b50fc2a13370" containerName="console" Apr 24 21:31:10.310983 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.310967 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.314114 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.314094 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-94xsz\"" Apr 24 21:31:10.314365 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.314349 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 21:31:10.314752 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.314654 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 21:31:10.314752 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.314689 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 21:31:10.314752 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.314696 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 21:31:10.314752 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.314713 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 21:31:10.315048 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.314713 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 21:31:10.315048 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.314719 2573 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 21:31:10.315048 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.314911 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 21:31:10.322391 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.322372 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 21:31:10.329442 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.329419 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:31:10.359632 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.359598 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-config-volume\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.359632 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.359634 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.359817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.359653 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8f4f9a6-088c-4e63-bab7-672714f996a1-config-out\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.359817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.359761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.359817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.359791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.359940 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.359817 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8f4f9a6-088c-4e63-bab7-672714f996a1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.359940 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.359861 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.359940 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.359881 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-web-config\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.359940 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.359900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9hpr\" (UniqueName: \"kubernetes.io/projected/b8f4f9a6-088c-4e63-bab7-672714f996a1-kube-api-access-s9hpr\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.360063 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.359943 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.360063 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.359963 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8f4f9a6-088c-4e63-bab7-672714f996a1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.360063 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.359985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8f4f9a6-088c-4e63-bab7-672714f996a1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.360063 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.360000 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b8f4f9a6-088c-4e63-bab7-672714f996a1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.460861 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.460820 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.460861 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.460860 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8f4f9a6-088c-4e63-bab7-672714f996a1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461142 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.460880 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461142 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.460897 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-web-config\") pod \"alertmanager-main-0\" (UID: 
\"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461142 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.460939 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9hpr\" (UniqueName: \"kubernetes.io/projected/b8f4f9a6-088c-4e63-bab7-672714f996a1-kube-api-access-s9hpr\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461142 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.460964 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461142 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.460988 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8f4f9a6-088c-4e63-bab7-672714f996a1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461142 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.461009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8f4f9a6-088c-4e63-bab7-672714f996a1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461142 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.461026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/b8f4f9a6-088c-4e63-bab7-672714f996a1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461142 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.461085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-config-volume\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461142 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.461113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461142 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.461140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8f4f9a6-088c-4e63-bab7-672714f996a1-config-out\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461592 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.461186 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461592 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.461525 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b8f4f9a6-088c-4e63-bab7-672714f996a1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.461889 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.461853 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8f4f9a6-088c-4e63-bab7-672714f996a1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.462014 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.461853 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8f4f9a6-088c-4e63-bab7-672714f996a1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.464243 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.464200 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.464358 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.464298 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-web-config\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.464414 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.464367 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.464616 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.464589 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.464757 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.464738 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8f4f9a6-088c-4e63-bab7-672714f996a1-config-out\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.464819 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.464764 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8f4f9a6-088c-4e63-bab7-672714f996a1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.464932 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.464896 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 
21:31:10.465047 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.465025 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.466105 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.466089 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b8f4f9a6-088c-4e63-bab7-672714f996a1-config-volume\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.471441 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.471418 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9hpr\" (UniqueName: \"kubernetes.io/projected/b8f4f9a6-088c-4e63-bab7-672714f996a1-kube-api-access-s9hpr\") pod \"alertmanager-main-0\" (UID: \"b8f4f9a6-088c-4e63-bab7-672714f996a1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.619466 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.619425 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:10.746175 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:10.746146 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:31:10.748217 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:31:10.748193 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8f4f9a6_088c_4e63_bab7_672714f996a1.slice/crio-c5abd5622948cd62700fa4ff00739e4a3c65e8aff02505dfeef181f3312440ab WatchSource:0}: Error finding container c5abd5622948cd62700fa4ff00739e4a3c65e8aff02505dfeef181f3312440ab: Status 404 returned error can't find the container with id c5abd5622948cd62700fa4ff00739e4a3c65e8aff02505dfeef181f3312440ab Apr 24 21:31:11.235349 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:11.235316 2573 generic.go:358] "Generic (PLEG): container finished" podID="b8f4f9a6-088c-4e63-bab7-672714f996a1" containerID="fb759cc76998f652e9460b84452963fbad1f9222c3bf98a1027fe62a59729b26" exitCode=0 Apr 24 21:31:11.235713 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:11.235384 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8f4f9a6-088c-4e63-bab7-672714f996a1","Type":"ContainerDied","Data":"fb759cc76998f652e9460b84452963fbad1f9222c3bf98a1027fe62a59729b26"} Apr 24 21:31:11.235713 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:11.235404 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8f4f9a6-088c-4e63-bab7-672714f996a1","Type":"ContainerStarted","Data":"c5abd5622948cd62700fa4ff00739e4a3c65e8aff02505dfeef181f3312440ab"} Apr 24 21:31:11.367688 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:11.367661 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e893786-b818-4a95-b817-be883b718d44" 
path="/var/lib/kubelet/pods/3e893786-b818-4a95-b817-be883b718d44/volumes" Apr 24 21:31:12.241279 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.241235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8f4f9a6-088c-4e63-bab7-672714f996a1","Type":"ContainerStarted","Data":"65c54048c2a683e0acd03c55dd56db7a45c8baaff0ff9cc8b7875a605d341ca2"} Apr 24 21:31:12.241279 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.241284 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8f4f9a6-088c-4e63-bab7-672714f996a1","Type":"ContainerStarted","Data":"585dc606ea1be41e43031ccbaae7381cbcbc6b3a79012cb92d805721f6059f6f"} Apr 24 21:31:12.241784 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.241298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8f4f9a6-088c-4e63-bab7-672714f996a1","Type":"ContainerStarted","Data":"45e5a87ce2c7db03849c2c253bbf2a4f5efa1a81d40cd5a76d65e01295cc8650"} Apr 24 21:31:12.241784 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.241310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8f4f9a6-088c-4e63-bab7-672714f996a1","Type":"ContainerStarted","Data":"2b9c3477e5e349d05d907960781c2dde03e48885ff0f4633db555e88a12c0a97"} Apr 24 21:31:12.241784 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.241328 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8f4f9a6-088c-4e63-bab7-672714f996a1","Type":"ContainerStarted","Data":"752bfa0a12fc281d1efc3f9d0080a0cb95fde19abbd0b4b35a58de27152ebb9b"} Apr 24 21:31:12.241784 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.241341 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b8f4f9a6-088c-4e63-bab7-672714f996a1","Type":"ContainerStarted","Data":"895fbf5c3cf6393604415d811d9e2aa07788d0c13fc1a9dfbd1153d275eb5098"} Apr 24 21:31:12.295239 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.295175 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.295155849 podStartE2EDuration="2.295155849s" podCreationTimestamp="2026-04-24 21:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:12.292985456 +0000 UTC m=+257.572833236" watchObservedRunningTime="2026-04-24 21:31:12.295155849 +0000 UTC m=+257.575003629" Apr 24 21:31:12.385909 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.385873 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-559ff94dc9-gj59f"] Apr 24 21:31:12.388339 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.388321 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.396107 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.396085 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-5q56s\"" Apr 24 21:31:12.396209 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.396085 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 24 21:31:12.396209 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.396085 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 24 21:31:12.396209 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.396111 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 24 21:31:12.396592 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.396571 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 24 21:31:12.396638 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.396624 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 24 21:31:12.405745 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.405722 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-559ff94dc9-gj59f"] Apr 24 21:31:12.406982 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.406956 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 24 21:31:12.481542 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.481505 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4f2fd1f8-630e-43fa-ae70-146673d2898e-secret-telemeter-client\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.481542 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.481538 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2fd1f8-630e-43fa-ae70-146673d2898e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.481729 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.481570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f2fd1f8-630e-43fa-ae70-146673d2898e-metrics-client-ca\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.481729 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.481647 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2fd1f8-630e-43fa-ae70-146673d2898e-serving-certs-ca-bundle\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.481729 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.481702 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4f2fd1f8-630e-43fa-ae70-146673d2898e-telemeter-client-tls\") 
pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.481828 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.481730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jm7q\" (UniqueName: \"kubernetes.io/projected/4f2fd1f8-630e-43fa-ae70-146673d2898e-kube-api-access-4jm7q\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.481828 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.481781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4f2fd1f8-630e-43fa-ae70-146673d2898e-federate-client-tls\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.481883 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.481837 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4f2fd1f8-630e-43fa-ae70-146673d2898e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.582400 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.582368 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2fd1f8-630e-43fa-ae70-146673d2898e-serving-certs-ca-bundle\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " 
pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.582557 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.582411 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4f2fd1f8-630e-43fa-ae70-146673d2898e-telemeter-client-tls\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.582557 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.582431 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jm7q\" (UniqueName: \"kubernetes.io/projected/4f2fd1f8-630e-43fa-ae70-146673d2898e-kube-api-access-4jm7q\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.582557 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.582457 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4f2fd1f8-630e-43fa-ae70-146673d2898e-federate-client-tls\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.582557 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.582521 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4f2fd1f8-630e-43fa-ae70-146673d2898e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.582742 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.582564 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4f2fd1f8-630e-43fa-ae70-146673d2898e-secret-telemeter-client\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.582742 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.582590 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2fd1f8-630e-43fa-ae70-146673d2898e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.582742 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.582623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f2fd1f8-630e-43fa-ae70-146673d2898e-metrics-client-ca\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.583210 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.583180 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2fd1f8-630e-43fa-ae70-146673d2898e-serving-certs-ca-bundle\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.583330 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.583306 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f2fd1f8-630e-43fa-ae70-146673d2898e-metrics-client-ca\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: 
\"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.583707 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.583685 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2fd1f8-630e-43fa-ae70-146673d2898e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.585203 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.585178 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4f2fd1f8-630e-43fa-ae70-146673d2898e-secret-telemeter-client\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.585295 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.585235 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4f2fd1f8-630e-43fa-ae70-146673d2898e-federate-client-tls\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.585401 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.585384 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4f2fd1f8-630e-43fa-ae70-146673d2898e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.585434 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.585384 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4f2fd1f8-630e-43fa-ae70-146673d2898e-telemeter-client-tls\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.591167 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.591143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jm7q\" (UniqueName: \"kubernetes.io/projected/4f2fd1f8-630e-43fa-ae70-146673d2898e-kube-api-access-4jm7q\") pod \"telemeter-client-559ff94dc9-gj59f\" (UID: \"4f2fd1f8-630e-43fa-ae70-146673d2898e\") " pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.698280 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.698240 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" Apr 24 21:31:12.849179 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:12.849105 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-559ff94dc9-gj59f"] Apr 24 21:31:12.858650 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:31:12.858624 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f2fd1f8_630e_43fa_ae70_146673d2898e.slice/crio-eac993dab5f1be800a7f432e435334728b966db9d9badc96db532223b3b6ce4e WatchSource:0}: Error finding container eac993dab5f1be800a7f432e435334728b966db9d9badc96db532223b3b6ce4e: Status 404 returned error can't find the container with id eac993dab5f1be800a7f432e435334728b966db9d9badc96db532223b3b6ce4e Apr 24 21:31:13.245542 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:13.245457 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" 
event={"ID":"4f2fd1f8-630e-43fa-ae70-146673d2898e","Type":"ContainerStarted","Data":"eac993dab5f1be800a7f432e435334728b966db9d9badc96db532223b3b6ce4e"} Apr 24 21:31:15.254399 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:15.254363 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" event={"ID":"4f2fd1f8-630e-43fa-ae70-146673d2898e","Type":"ContainerStarted","Data":"953e897cb4515a55ac14697ee62524aba0a43bb5b490e3167e6fb779992dde10"} Apr 24 21:31:15.254399 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:15.254399 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" event={"ID":"4f2fd1f8-630e-43fa-ae70-146673d2898e","Type":"ContainerStarted","Data":"f0ca8115098f305f4d6ad44446b7ba3018a0eb71bdd54d14debcac365c89d797"} Apr 24 21:31:15.254399 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:15.254409 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" event={"ID":"4f2fd1f8-630e-43fa-ae70-146673d2898e","Type":"ContainerStarted","Data":"580dd141f2751d1d37560c097647460eb33c62f7dc62892a5df1eb8a9aef33f0"} Apr 24 21:31:15.279262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:15.279201 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-559ff94dc9-gj59f" podStartSLOduration=1.598212819 podStartE2EDuration="3.279181985s" podCreationTimestamp="2026-04-24 21:31:12 +0000 UTC" firstStartedPulling="2026-04-24 21:31:12.860408929 +0000 UTC m=+258.140256691" lastFinishedPulling="2026-04-24 21:31:14.541378099 +0000 UTC m=+259.821225857" observedRunningTime="2026-04-24 21:31:15.278006646 +0000 UTC m=+260.557854451" watchObservedRunningTime="2026-04-24 21:31:15.279181985 +0000 UTC m=+260.559029766" Apr 24 21:31:16.052972 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.052939 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-76658cc775-q5jn2"] Apr 24 21:31:16.055276 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.055258 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.073762 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.073738 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76658cc775-q5jn2"] Apr 24 21:31:16.115436 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.115404 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-oauth-config\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.115436 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.115435 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-service-ca\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.115667 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.115459 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-serving-cert\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.115667 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.115513 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-oauth-serving-cert\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.115667 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.115568 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfrzp\" (UniqueName: \"kubernetes.io/projected/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-kube-api-access-cfrzp\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.115667 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.115613 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-trusted-ca-bundle\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.115667 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.115635 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-config\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.217091 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.217046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-oauth-config\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.217260 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 21:31:16.217089 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-service-ca\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.217260 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.217191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-serving-cert\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.217260 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.217212 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-oauth-serving-cert\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.217390 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.217259 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfrzp\" (UniqueName: \"kubernetes.io/projected/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-kube-api-access-cfrzp\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.217390 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.217322 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-trusted-ca-bundle\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " 
pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.217390 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.217353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-config\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.217909 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.217878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-service-ca\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.218070 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.218043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-config\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.218368 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.218343 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-trusted-ca-bundle\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.218565 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.218545 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-oauth-serving-cert\") pod \"console-76658cc775-q5jn2\" (UID: 
\"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.219636 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.219614 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-oauth-config\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.219723 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.219637 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-serving-cert\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.230478 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.230455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfrzp\" (UniqueName: \"kubernetes.io/projected/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-kube-api-access-cfrzp\") pod \"console-76658cc775-q5jn2\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") " pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.364855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.364777 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:16.694405 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:16.694323 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76658cc775-q5jn2"] Apr 24 21:31:16.710996 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:31:16.710966 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5da526c5_c9ab_4cc2_a20e_aae01bf41b13.slice/crio-39c874d83b1b23e7df05a7a80072e9fb8d9bec98b9351306f176bc8a3172a00f WatchSource:0}: Error finding container 39c874d83b1b23e7df05a7a80072e9fb8d9bec98b9351306f176bc8a3172a00f: Status 404 returned error can't find the container with id 39c874d83b1b23e7df05a7a80072e9fb8d9bec98b9351306f176bc8a3172a00f Apr 24 21:31:17.262269 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:17.262231 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76658cc775-q5jn2" event={"ID":"5da526c5-c9ab-4cc2-a20e-aae01bf41b13","Type":"ContainerStarted","Data":"d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5"} Apr 24 21:31:17.262269 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:17.262268 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76658cc775-q5jn2" event={"ID":"5da526c5-c9ab-4cc2-a20e-aae01bf41b13","Type":"ContainerStarted","Data":"39c874d83b1b23e7df05a7a80072e9fb8d9bec98b9351306f176bc8a3172a00f"} Apr 24 21:31:17.282344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:17.282287 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76658cc775-q5jn2" podStartSLOduration=1.282270979 podStartE2EDuration="1.282270979s" podCreationTimestamp="2026-04-24 21:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:17.280997663 +0000 UTC 
m=+262.560845442" watchObservedRunningTime="2026-04-24 21:31:17.282270979 +0000 UTC m=+262.562118760" Apr 24 21:31:26.365127 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:26.365089 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:26.365520 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:26.365166 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:26.370186 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:26.370162 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:27.299253 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:27.299225 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76658cc775-q5jn2" Apr 24 21:31:27.413215 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:27.413183 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68f8758848-qqpg2"] Apr 24 21:31:52.431769 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.431729 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68f8758848-qqpg2" podUID="050ba746-02c0-4456-888d-5d244d48df3b" containerName="console" containerID="cri-o://04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af" gracePeriod=15 Apr 24 21:31:52.673332 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.673310 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68f8758848-qqpg2_050ba746-02c0-4456-888d-5d244d48df3b/console/0.log" Apr 24 21:31:52.673447 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.673371 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:31:52.735308 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.735226 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/050ba746-02c0-4456-888d-5d244d48df3b-console-oauth-config\") pod \"050ba746-02c0-4456-888d-5d244d48df3b\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " Apr 24 21:31:52.735308 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.735268 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-console-config\") pod \"050ba746-02c0-4456-888d-5d244d48df3b\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " Apr 24 21:31:52.735308 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.735294 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/050ba746-02c0-4456-888d-5d244d48df3b-console-serving-cert\") pod \"050ba746-02c0-4456-888d-5d244d48df3b\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " Apr 24 21:31:52.735556 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.735348 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-trusted-ca-bundle\") pod \"050ba746-02c0-4456-888d-5d244d48df3b\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " Apr 24 21:31:52.735556 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.735370 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45pf6\" (UniqueName: \"kubernetes.io/projected/050ba746-02c0-4456-888d-5d244d48df3b-kube-api-access-45pf6\") pod \"050ba746-02c0-4456-888d-5d244d48df3b\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " Apr 24 21:31:52.735556 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.735536 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-service-ca\") pod \"050ba746-02c0-4456-888d-5d244d48df3b\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " Apr 24 21:31:52.735706 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.735588 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-oauth-serving-cert\") pod \"050ba746-02c0-4456-888d-5d244d48df3b\" (UID: \"050ba746-02c0-4456-888d-5d244d48df3b\") " Apr 24 21:31:52.735771 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.735631 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-console-config" (OuterVolumeSpecName: "console-config") pod "050ba746-02c0-4456-888d-5d244d48df3b" (UID: "050ba746-02c0-4456-888d-5d244d48df3b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:52.735913 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.735897 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-console-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:31:52.736022 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.735939 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-service-ca" (OuterVolumeSpecName: "service-ca") pod "050ba746-02c0-4456-888d-5d244d48df3b" (UID: "050ba746-02c0-4456-888d-5d244d48df3b"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:52.736022 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.735848 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "050ba746-02c0-4456-888d-5d244d48df3b" (UID: "050ba746-02c0-4456-888d-5d244d48df3b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:52.736099 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.736016 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "050ba746-02c0-4456-888d-5d244d48df3b" (UID: "050ba746-02c0-4456-888d-5d244d48df3b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:52.737609 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.737592 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050ba746-02c0-4456-888d-5d244d48df3b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "050ba746-02c0-4456-888d-5d244d48df3b" (UID: "050ba746-02c0-4456-888d-5d244d48df3b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:52.737900 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.737881 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050ba746-02c0-4456-888d-5d244d48df3b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "050ba746-02c0-4456-888d-5d244d48df3b" (UID: "050ba746-02c0-4456-888d-5d244d48df3b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:52.737900 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.737891 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/050ba746-02c0-4456-888d-5d244d48df3b-kube-api-access-45pf6" (OuterVolumeSpecName: "kube-api-access-45pf6") pod "050ba746-02c0-4456-888d-5d244d48df3b" (UID: "050ba746-02c0-4456-888d-5d244d48df3b"). InnerVolumeSpecName "kube-api-access-45pf6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:52.837070 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.837041 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-trusted-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:31:52.837070 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.837065 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-45pf6\" (UniqueName: \"kubernetes.io/projected/050ba746-02c0-4456-888d-5d244d48df3b-kube-api-access-45pf6\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:31:52.837070 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.837076 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-service-ca\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:31:52.837287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.837085 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/050ba746-02c0-4456-888d-5d244d48df3b-oauth-serving-cert\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:31:52.837287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.837093 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/050ba746-02c0-4456-888d-5d244d48df3b-console-oauth-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:31:52.837287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:52.837102 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/050ba746-02c0-4456-888d-5d244d48df3b-console-serving-cert\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:31:53.377606 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:53.377581 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68f8758848-qqpg2_050ba746-02c0-4456-888d-5d244d48df3b/console/0.log" Apr 24 21:31:53.377721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:53.377621 2573 generic.go:358] "Generic (PLEG): container finished" podID="050ba746-02c0-4456-888d-5d244d48df3b" containerID="04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af" exitCode=2 Apr 24 21:31:53.377721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:53.377706 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68f8758848-qqpg2" Apr 24 21:31:53.377721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:53.377704 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f8758848-qqpg2" event={"ID":"050ba746-02c0-4456-888d-5d244d48df3b","Type":"ContainerDied","Data":"04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af"} Apr 24 21:31:53.377843 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:53.377745 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f8758848-qqpg2" event={"ID":"050ba746-02c0-4456-888d-5d244d48df3b","Type":"ContainerDied","Data":"0f184f9da294473c9a749a2c9606fcd166b597d6f8bab46b1214b8414b6d6359"} Apr 24 21:31:53.377843 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:53.377765 2573 scope.go:117] "RemoveContainer" containerID="04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af" Apr 24 21:31:53.385664 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:53.385643 2573 scope.go:117] "RemoveContainer" containerID="04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af" Apr 24 21:31:53.385976 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:31:53.385954 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af\": container with ID starting with 04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af not found: ID does not exist" containerID="04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af" Apr 24 21:31:53.386075 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:53.385986 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af"} err="failed to get container status \"04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af\": rpc error: code = 
NotFound desc = could not find container \"04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af\": container with ID starting with 04615993efeec334bd10517bed38ed4a7627c775e6e817c6ba675a8a5ea1d6af not found: ID does not exist" Apr 24 21:31:53.395324 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:53.395300 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68f8758848-qqpg2"] Apr 24 21:31:53.398866 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:53.398833 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68f8758848-qqpg2"] Apr 24 21:31:55.235030 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:55.235007 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log" Apr 24 21:31:55.235473 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:55.235007 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log" Apr 24 21:31:55.242359 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:55.242337 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:31:55.367776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:31:55.367621 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="050ba746-02c0-4456-888d-5d244d48df3b" path="/var/lib/kubelet/pods/050ba746-02c0-4456-888d-5d244d48df3b/volumes" Apr 24 21:32:31.179249 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.179214 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-75b584996b-2qpgz"] Apr 24 21:32:31.179702 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.179556 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="050ba746-02c0-4456-888d-5d244d48df3b" containerName="console" Apr 24 21:32:31.179702 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 21:32:31.179567 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="050ba746-02c0-4456-888d-5d244d48df3b" containerName="console" Apr 24 21:32:31.179702 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.179616 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="050ba746-02c0-4456-888d-5d244d48df3b" containerName="console" Apr 24 21:32:31.181576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.181559 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.201405 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.201376 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75b584996b-2qpgz"] Apr 24 21:32:31.262181 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.262133 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkgwc\" (UniqueName: \"kubernetes.io/projected/d3dcfe4f-f129-4a28-98dd-db338ee798cd-kube-api-access-mkgwc\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.262181 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.262180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-oauth-serving-cert\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.262385 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.262224 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-trusted-ca-bundle\") pod \"console-75b584996b-2qpgz\" (UID: 
\"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.262385 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.262261 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-config\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.262385 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.262336 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-oauth-config\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.262385 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.262365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-service-ca\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.262546 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.262395 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-serving-cert\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.363604 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.363564 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-oauth-config\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.363604 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.363603 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-service-ca\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.363855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.363629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-serving-cert\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.363855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.363685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkgwc\" (UniqueName: \"kubernetes.io/projected/d3dcfe4f-f129-4a28-98dd-db338ee798cd-kube-api-access-mkgwc\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.363855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.363703 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-oauth-serving-cert\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.363855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.363729 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-trusted-ca-bundle\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.363855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.363765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-config\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.364978 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.364459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-service-ca\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.364978 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.364478 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-config\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.364978 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.364616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-oauth-serving-cert\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.364978 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.364789 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-trusted-ca-bundle\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.366274 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.366253 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-serving-cert\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.366386 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.366280 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-oauth-config\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.373081 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.373059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkgwc\" (UniqueName: \"kubernetes.io/projected/d3dcfe4f-f129-4a28-98dd-db338ee798cd-kube-api-access-mkgwc\") pod \"console-75b584996b-2qpgz\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.491161 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.491073 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:31.818133 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.818108 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75b584996b-2qpgz"] Apr 24 21:32:31.820498 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:32:31.820469 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3dcfe4f_f129_4a28_98dd_db338ee798cd.slice/crio-776181c1010f7d77a0254fadcbc7c1ebc46ccf3b0eec304bc6764df5328a1be7 WatchSource:0}: Error finding container 776181c1010f7d77a0254fadcbc7c1ebc46ccf3b0eec304bc6764df5328a1be7: Status 404 returned error can't find the container with id 776181c1010f7d77a0254fadcbc7c1ebc46ccf3b0eec304bc6764df5328a1be7 Apr 24 21:32:31.822370 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:31.822351 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:32:32.488064 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:32.488023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75b584996b-2qpgz" event={"ID":"d3dcfe4f-f129-4a28-98dd-db338ee798cd","Type":"ContainerStarted","Data":"3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528"} Apr 24 21:32:32.488064 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:32.488066 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75b584996b-2qpgz" event={"ID":"d3dcfe4f-f129-4a28-98dd-db338ee798cd","Type":"ContainerStarted","Data":"776181c1010f7d77a0254fadcbc7c1ebc46ccf3b0eec304bc6764df5328a1be7"} Apr 24 21:32:32.521118 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:32.521074 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75b584996b-2qpgz" podStartSLOduration=1.521060018 podStartE2EDuration="1.521060018s" podCreationTimestamp="2026-04-24 21:32:31 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:32:32.519588821 +0000 UTC m=+337.799436611" watchObservedRunningTime="2026-04-24 21:32:32.521060018 +0000 UTC m=+337.800907797" Apr 24 21:32:41.492165 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:41.492115 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:41.492581 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:41.492266 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:41.497047 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:41.497020 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:41.522089 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:41.522066 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:32:41.580495 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:41.580462 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76658cc775-q5jn2"] Apr 24 21:33:06.600115 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.600015 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76658cc775-q5jn2" podUID="5da526c5-c9ab-4cc2-a20e-aae01bf41b13" containerName="console" containerID="cri-o://d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5" gracePeriod=15 Apr 24 21:33:06.834314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.834288 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76658cc775-q5jn2_5da526c5-c9ab-4cc2-a20e-aae01bf41b13/console/0.log" Apr 24 21:33:06.834419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.834351 2573 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76658cc775-q5jn2"
Apr 24 21:33:06.982989 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.982880 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-oauth-serving-cert\") pod \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") "
Apr 24 21:33:06.982989 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.982933 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-service-ca\") pod \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") "
Apr 24 21:33:06.982989 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.982962 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfrzp\" (UniqueName: \"kubernetes.io/projected/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-kube-api-access-cfrzp\") pod \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") "
Apr 24 21:33:06.983277 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.983020 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-serving-cert\") pod \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") "
Apr 24 21:33:06.983277 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.983042 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-oauth-config\") pod \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") "
Apr 24 21:33:06.983277 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.983058 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-config\") pod \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") "
Apr 24 21:33:06.983277 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.983089 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-trusted-ca-bundle\") pod \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\" (UID: \"5da526c5-c9ab-4cc2-a20e-aae01bf41b13\") "
Apr 24 21:33:06.983481 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.983339 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5da526c5-c9ab-4cc2-a20e-aae01bf41b13" (UID: "5da526c5-c9ab-4cc2-a20e-aae01bf41b13"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:33:06.983481 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.983374 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-service-ca" (OuterVolumeSpecName: "service-ca") pod "5da526c5-c9ab-4cc2-a20e-aae01bf41b13" (UID: "5da526c5-c9ab-4cc2-a20e-aae01bf41b13"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:33:06.983669 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.983641 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-config" (OuterVolumeSpecName: "console-config") pod "5da526c5-c9ab-4cc2-a20e-aae01bf41b13" (UID: "5da526c5-c9ab-4cc2-a20e-aae01bf41b13"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:33:06.983784 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.983723 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5da526c5-c9ab-4cc2-a20e-aae01bf41b13" (UID: "5da526c5-c9ab-4cc2-a20e-aae01bf41b13"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:33:06.985342 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.985313 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5da526c5-c9ab-4cc2-a20e-aae01bf41b13" (UID: "5da526c5-c9ab-4cc2-a20e-aae01bf41b13"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:33:06.985582 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.985566 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5da526c5-c9ab-4cc2-a20e-aae01bf41b13" (UID: "5da526c5-c9ab-4cc2-a20e-aae01bf41b13"). InnerVolumeSpecName "console-serving-cert".
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:33:06.985677 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:06.985653 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-kube-api-access-cfrzp" (OuterVolumeSpecName: "kube-api-access-cfrzp") pod "5da526c5-c9ab-4cc2-a20e-aae01bf41b13" (UID: "5da526c5-c9ab-4cc2-a20e-aae01bf41b13"). InnerVolumeSpecName "kube-api-access-cfrzp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:33:07.083984 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.083937 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-serving-cert\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:33:07.083984 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.083980 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-oauth-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:33:07.083984 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.083990 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-console-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:33:07.083984 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.084001 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-trusted-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:33:07.084237 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.084009 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-oauth-serving-cert\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:33:07.084237 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.084019 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-service-ca\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:33:07.084237 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.084028 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cfrzp\" (UniqueName: \"kubernetes.io/projected/5da526c5-c9ab-4cc2-a20e-aae01bf41b13-kube-api-access-cfrzp\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:33:07.592653 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.592624 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76658cc775-q5jn2_5da526c5-c9ab-4cc2-a20e-aae01bf41b13/console/0.log"
Apr 24 21:33:07.592820 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.592667 2573 generic.go:358] "Generic (PLEG): container finished" podID="5da526c5-c9ab-4cc2-a20e-aae01bf41b13" containerID="d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5" exitCode=2
Apr 24 21:33:07.592820 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.592700 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76658cc775-q5jn2" event={"ID":"5da526c5-c9ab-4cc2-a20e-aae01bf41b13","Type":"ContainerDied","Data":"d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5"}
Apr 24 21:33:07.592820 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.592732 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76658cc775-q5jn2"
Apr 24 21:33:07.592820 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.592745 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76658cc775-q5jn2" event={"ID":"5da526c5-c9ab-4cc2-a20e-aae01bf41b13","Type":"ContainerDied","Data":"39c874d83b1b23e7df05a7a80072e9fb8d9bec98b9351306f176bc8a3172a00f"}
Apr 24 21:33:07.592820 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.592764 2573 scope.go:117] "RemoveContainer" containerID="d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5"
Apr 24 21:33:07.600408 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.600392 2573 scope.go:117] "RemoveContainer" containerID="d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5"
Apr 24 21:33:07.600691 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:33:07.600645 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5\": container with ID starting with d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5 not found: ID does not exist" containerID="d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5"
Apr 24 21:33:07.600691 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.600666 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5"} err="failed to get container status \"d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5\": rpc error: code = NotFound desc = could not find container \"d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5\": container with ID starting with d9ade8b103ffab115cb39fe823139be3f3815488b0084b04254cb51b7bdafad5 not found: ID does not exist"
Apr 24 21:33:07.614229 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.614204 2573
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76658cc775-q5jn2"]
Apr 24 21:33:07.619478 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:07.619454 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76658cc775-q5jn2"]
Apr 24 21:33:09.367409 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:09.367374 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5da526c5-c9ab-4cc2-a20e-aae01bf41b13" path="/var/lib/kubelet/pods/5da526c5-c9ab-4cc2-a20e-aae01bf41b13/volumes"
Apr 24 21:33:23.977867 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:23.977833 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-nk9jw"]
Apr 24 21:33:23.978254 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:23.978179 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5da526c5-c9ab-4cc2-a20e-aae01bf41b13" containerName="console"
Apr 24 21:33:23.978254 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:23.978196 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da526c5-c9ab-4cc2-a20e-aae01bf41b13" containerName="console"
Apr 24 21:33:23.978336 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:23.978255 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5da526c5-c9ab-4cc2-a20e-aae01bf41b13" containerName="console"
Apr 24 21:33:23.980136 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:23.980119 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nk9jw"
Apr 24 21:33:23.982334 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:23.982314 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:33:23.989481 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:23.989458 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nk9jw"]
Apr 24 21:33:24.016223 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:24.016199 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7e233db-788d-485b-815f-9cd371ff230e-original-pull-secret\") pod \"global-pull-secret-syncer-nk9jw\" (UID: \"b7e233db-788d-485b-815f-9cd371ff230e\") " pod="kube-system/global-pull-secret-syncer-nk9jw"
Apr 24 21:33:24.016333 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:24.016237 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b7e233db-788d-485b-815f-9cd371ff230e-kubelet-config\") pod \"global-pull-secret-syncer-nk9jw\" (UID: \"b7e233db-788d-485b-815f-9cd371ff230e\") " pod="kube-system/global-pull-secret-syncer-nk9jw"
Apr 24 21:33:24.016333 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:24.016274 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b7e233db-788d-485b-815f-9cd371ff230e-dbus\") pod \"global-pull-secret-syncer-nk9jw\" (UID: \"b7e233db-788d-485b-815f-9cd371ff230e\") " pod="kube-system/global-pull-secret-syncer-nk9jw"
Apr 24 21:33:24.117554 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:24.117515 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b7e233db-788d-485b-815f-9cd371ff230e-kubelet-config\") pod \"global-pull-secret-syncer-nk9jw\" (UID: \"b7e233db-788d-485b-815f-9cd371ff230e\") " pod="kube-system/global-pull-secret-syncer-nk9jw"
Apr 24 21:33:24.117671 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:24.117566 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b7e233db-788d-485b-815f-9cd371ff230e-dbus\") pod \"global-pull-secret-syncer-nk9jw\" (UID: \"b7e233db-788d-485b-815f-9cd371ff230e\") " pod="kube-system/global-pull-secret-syncer-nk9jw"
Apr 24 21:33:24.117671 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:24.117619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7e233db-788d-485b-815f-9cd371ff230e-original-pull-secret\") pod \"global-pull-secret-syncer-nk9jw\" (UID: \"b7e233db-788d-485b-815f-9cd371ff230e\") " pod="kube-system/global-pull-secret-syncer-nk9jw"
Apr 24 21:33:24.117671 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:24.117650 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b7e233db-788d-485b-815f-9cd371ff230e-kubelet-config\") pod \"global-pull-secret-syncer-nk9jw\" (UID: \"b7e233db-788d-485b-815f-9cd371ff230e\") " pod="kube-system/global-pull-secret-syncer-nk9jw"
Apr 24 21:33:24.117774 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:24.117752 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b7e233db-788d-485b-815f-9cd371ff230e-dbus\") pod \"global-pull-secret-syncer-nk9jw\" (UID: \"b7e233db-788d-485b-815f-9cd371ff230e\") " pod="kube-system/global-pull-secret-syncer-nk9jw"
Apr 24 21:33:24.119806 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:24.119789 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7e233db-788d-485b-815f-9cd371ff230e-original-pull-secret\") pod \"global-pull-secret-syncer-nk9jw\" (UID: \"b7e233db-788d-485b-815f-9cd371ff230e\") " pod="kube-system/global-pull-secret-syncer-nk9jw"
Apr 24 21:33:24.289689 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:24.289655 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nk9jw"
Apr 24 21:33:24.412168 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:24.412138 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nk9jw"]
Apr 24 21:33:24.413845 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:33:24.413816 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e233db_788d_485b_815f_9cd371ff230e.slice/crio-5d4215802a0fd290a744307bd19d6033b190a1144b68439f1fa97c959204c734 WatchSource:0}: Error finding container 5d4215802a0fd290a744307bd19d6033b190a1144b68439f1fa97c959204c734: Status 404 returned error can't find the container with id 5d4215802a0fd290a744307bd19d6033b190a1144b68439f1fa97c959204c734
Apr 24 21:33:24.645998 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:24.645901 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nk9jw" event={"ID":"b7e233db-788d-485b-815f-9cd371ff230e","Type":"ContainerStarted","Data":"5d4215802a0fd290a744307bd19d6033b190a1144b68439f1fa97c959204c734"}
Apr 24 21:33:28.660129 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:28.660095 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nk9jw" event={"ID":"b7e233db-788d-485b-815f-9cd371ff230e","Type":"ContainerStarted","Data":"8fde261cea7c294b8e8462064c237ebb8c9041d6858baa2fa991a70ffba49877"}
Apr 24 21:33:28.680835 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:28.680774 2573
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-nk9jw" podStartSLOduration=2.254801953 podStartE2EDuration="5.680756447s" podCreationTimestamp="2026-04-24 21:33:23 +0000 UTC" firstStartedPulling="2026-04-24 21:33:24.415350068 +0000 UTC m=+389.695197826" lastFinishedPulling="2026-04-24 21:33:27.841304562 +0000 UTC m=+393.121152320" observedRunningTime="2026-04-24 21:33:28.679851897 +0000 UTC m=+393.959699678" watchObservedRunningTime="2026-04-24 21:33:28.680756447 +0000 UTC m=+393.960604226"
Apr 24 21:33:41.593166 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.593130 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"]
Apr 24 21:33:41.595711 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.595695 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:41.598107 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.598087 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-frsjb\""
Apr 24 21:33:41.598884 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.598866 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:33:41.599001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.598991 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:33:41.609436 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.609412 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"]
Apr 24 21:33:41.671113 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.671084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7936543c-6ef1-4211-98d4-6a40de9bad6c-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2\" (UID: \"7936543c-6ef1-4211-98d4-6a40de9bad6c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:41.671113 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.671117 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7936543c-6ef1-4211-98d4-6a40de9bad6c-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2\" (UID: \"7936543c-6ef1-4211-98d4-6a40de9bad6c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:41.671368 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.671221 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5vh6\" (UniqueName: \"kubernetes.io/projected/7936543c-6ef1-4211-98d4-6a40de9bad6c-kube-api-access-w5vh6\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2\" (UID: \"7936543c-6ef1-4211-98d4-6a40de9bad6c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:41.772573 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.772532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7936543c-6ef1-4211-98d4-6a40de9bad6c-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2\" (UID: \"7936543c-6ef1-4211-98d4-6a40de9bad6c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:41.772573 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.772574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7936543c-6ef1-4211-98d4-6a40de9bad6c-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2\" (UID: \"7936543c-6ef1-4211-98d4-6a40de9bad6c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:41.772805 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.772701 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5vh6\" (UniqueName: \"kubernetes.io/projected/7936543c-6ef1-4211-98d4-6a40de9bad6c-kube-api-access-w5vh6\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2\" (UID: \"7936543c-6ef1-4211-98d4-6a40de9bad6c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:41.772912 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.772897 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7936543c-6ef1-4211-98d4-6a40de9bad6c-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2\" (UID: \"7936543c-6ef1-4211-98d4-6a40de9bad6c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:41.773010 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.772968 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7936543c-6ef1-4211-98d4-6a40de9bad6c-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2\" (UID: \"7936543c-6ef1-4211-98d4-6a40de9bad6c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:41.782402 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.782367 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5vh6\" (UniqueName: \"kubernetes.io/projected/7936543c-6ef1-4211-98d4-6a40de9bad6c-kube-api-access-w5vh6\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2\" (UID: \"7936543c-6ef1-4211-98d4-6a40de9bad6c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:41.905434 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:41.905356 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:42.029200 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:42.028989 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"]
Apr 24 21:33:42.031357 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:33:42.031332 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7936543c_6ef1_4211_98d4_6a40de9bad6c.slice/crio-c03c7227c3bdcbb546bdb9dda2df1efd7c5ffd31d7d404f808a722a6def940dc WatchSource:0}: Error finding container c03c7227c3bdcbb546bdb9dda2df1efd7c5ffd31d7d404f808a722a6def940dc: Status 404 returned error can't find the container with id c03c7227c3bdcbb546bdb9dda2df1efd7c5ffd31d7d404f808a722a6def940dc
Apr 24 21:33:42.704200 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:42.704164 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2" event={"ID":"7936543c-6ef1-4211-98d4-6a40de9bad6c","Type":"ContainerStarted","Data":"c03c7227c3bdcbb546bdb9dda2df1efd7c5ffd31d7d404f808a722a6def940dc"}
Apr 24 21:33:47.722019 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:47.721979 2573 generic.go:358] "Generic (PLEG): container finished" podID="7936543c-6ef1-4211-98d4-6a40de9bad6c" containerID="3cf0ec8b637c29e1388de47ee4502fbb4fe779cb1f252bacf1cfe2fd166238bc"
exitCode=0
Apr 24 21:33:47.722383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:47.722073 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2" event={"ID":"7936543c-6ef1-4211-98d4-6a40de9bad6c","Type":"ContainerDied","Data":"3cf0ec8b637c29e1388de47ee4502fbb4fe779cb1f252bacf1cfe2fd166238bc"}
Apr 24 21:33:49.730862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:49.730828 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2" event={"ID":"7936543c-6ef1-4211-98d4-6a40de9bad6c","Type":"ContainerStarted","Data":"cc6eea9d070e96da1d260e599a38b03e85fa6be41681cc3574190b2e8ed421c3"}
Apr 24 21:33:50.734878 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:50.734846 2573 generic.go:358] "Generic (PLEG): container finished" podID="7936543c-6ef1-4211-98d4-6a40de9bad6c" containerID="cc6eea9d070e96da1d260e599a38b03e85fa6be41681cc3574190b2e8ed421c3" exitCode=0
Apr 24 21:33:50.735312 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:50.734948 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2" event={"ID":"7936543c-6ef1-4211-98d4-6a40de9bad6c","Type":"ContainerDied","Data":"cc6eea9d070e96da1d260e599a38b03e85fa6be41681cc3574190b2e8ed421c3"}
Apr 24 21:33:57.763173 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:57.763129 2573 generic.go:358] "Generic (PLEG): container finished" podID="7936543c-6ef1-4211-98d4-6a40de9bad6c" containerID="3e7d20bb3ea0e5185cb37d89514193a6f7fbdb1a6e1f5a46353998eb004340a8" exitCode=0
Apr 24 21:33:57.763519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:57.763225 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2" event={"ID":"7936543c-6ef1-4211-98d4-6a40de9bad6c","Type":"ContainerDied","Data":"3e7d20bb3ea0e5185cb37d89514193a6f7fbdb1a6e1f5a46353998eb004340a8"}
Apr 24 21:33:58.889083 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:58.889059 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:58.931042 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:58.931005 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5vh6\" (UniqueName: \"kubernetes.io/projected/7936543c-6ef1-4211-98d4-6a40de9bad6c-kube-api-access-w5vh6\") pod \"7936543c-6ef1-4211-98d4-6a40de9bad6c\" (UID: \"7936543c-6ef1-4211-98d4-6a40de9bad6c\") "
Apr 24 21:33:58.931200 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:58.931054 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7936543c-6ef1-4211-98d4-6a40de9bad6c-bundle\") pod \"7936543c-6ef1-4211-98d4-6a40de9bad6c\" (UID: \"7936543c-6ef1-4211-98d4-6a40de9bad6c\") "
Apr 24 21:33:58.931200 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:58.931125 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7936543c-6ef1-4211-98d4-6a40de9bad6c-util\") pod \"7936543c-6ef1-4211-98d4-6a40de9bad6c\" (UID: \"7936543c-6ef1-4211-98d4-6a40de9bad6c\") "
Apr 24 21:33:58.931729 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:58.931700 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7936543c-6ef1-4211-98d4-6a40de9bad6c-bundle" (OuterVolumeSpecName: "bundle") pod "7936543c-6ef1-4211-98d4-6a40de9bad6c" (UID: "7936543c-6ef1-4211-98d4-6a40de9bad6c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:33:58.933276 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:58.933246 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7936543c-6ef1-4211-98d4-6a40de9bad6c-kube-api-access-w5vh6" (OuterVolumeSpecName: "kube-api-access-w5vh6") pod "7936543c-6ef1-4211-98d4-6a40de9bad6c" (UID: "7936543c-6ef1-4211-98d4-6a40de9bad6c"). InnerVolumeSpecName "kube-api-access-w5vh6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:33:58.936869 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:58.936845 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7936543c-6ef1-4211-98d4-6a40de9bad6c-util" (OuterVolumeSpecName: "util") pod "7936543c-6ef1-4211-98d4-6a40de9bad6c" (UID: "7936543c-6ef1-4211-98d4-6a40de9bad6c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:33:59.032105 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:59.032026 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7936543c-6ef1-4211-98d4-6a40de9bad6c-util\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:33:59.032105 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:59.032055 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w5vh6\" (UniqueName: \"kubernetes.io/projected/7936543c-6ef1-4211-98d4-6a40de9bad6c-kube-api-access-w5vh6\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:33:59.032105 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:59.032067 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7936543c-6ef1-4211-98d4-6a40de9bad6c-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:33:59.770139 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:59.770114 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2"
Apr 24 21:33:59.770293 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:59.770116 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czp4m2" event={"ID":"7936543c-6ef1-4211-98d4-6a40de9bad6c","Type":"ContainerDied","Data":"c03c7227c3bdcbb546bdb9dda2df1efd7c5ffd31d7d404f808a722a6def940dc"}
Apr 24 21:33:59.770293 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:33:59.770220 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c03c7227c3bdcbb546bdb9dda2df1efd7c5ffd31d7d404f808a722a6def940dc"
Apr 24 21:34:11.214591 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.214555 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-nwz5t"]
Apr 24 21:34:11.215061 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.215045 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7936543c-6ef1-4211-98d4-6a40de9bad6c" containerName="extract"
Apr 24 21:34:11.215102 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.215064 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7936543c-6ef1-4211-98d4-6a40de9bad6c" containerName="extract"
Apr 24 21:34:11.215102 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.215089 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7936543c-6ef1-4211-98d4-6a40de9bad6c" containerName="util"
Apr 24 21:34:11.215102 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.215097 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7936543c-6ef1-4211-98d4-6a40de9bad6c" containerName="util"
Apr 24 21:34:11.215205 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.215114 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7936543c-6ef1-4211-98d4-6a40de9bad6c" containerName="pull"
Apr 24 21:34:11.215205 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.215121 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7936543c-6ef1-4211-98d4-6a40de9bad6c" containerName="pull"
Apr 24 21:34:11.215265 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.215205 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7936543c-6ef1-4211-98d4-6a40de9bad6c" containerName="extract"
Apr 24 21:34:11.220207 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.220182 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-nwz5t"
Apr 24 21:34:11.223262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.223231 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 24 21:34:11.223366 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.223288 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 21:34:11.223511 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.223496 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 24 21:34:11.223706 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.223691 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-nm9wf\""
Apr 24 21:34:11.224000 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.223983 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 24 21:34:11.224834 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.224820 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 24 21:34:11.238605 ip-10-0-136-201 kubenswrapper[2573]: I0424
21:34:11.238578 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-nwz5t"] Apr 24 21:34:11.321659 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.321626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5dl7\" (UniqueName: \"kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-kube-api-access-k5dl7\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:11.321842 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.321669 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3b7f82a5-8d1a-4493-8ef1-60308b9073df-cabundle0\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:11.321842 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.321703 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:11.422161 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.422132 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5dl7\" (UniqueName: \"kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-kube-api-access-k5dl7\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:11.422339 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.422168 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3b7f82a5-8d1a-4493-8ef1-60308b9073df-cabundle0\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:11.422339 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.422206 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:11.422339 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:11.422322 2573 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:34:11.422339 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:11.422338 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:34:11.422535 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:11.422349 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-nwz5t: references non-existent secret key: ca.crt Apr 24 21:34:11.422535 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:11.422420 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates podName:3b7f82a5-8d1a-4493-8ef1-60308b9073df nodeName:}" failed. No retries permitted until 2026-04-24 21:34:11.922399627 +0000 UTC m=+437.202247386 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates") pod "keda-operator-ffbb595cb-nwz5t" (UID: "3b7f82a5-8d1a-4493-8ef1-60308b9073df") : references non-existent secret key: ca.crt Apr 24 21:34:11.422863 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.422843 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3b7f82a5-8d1a-4493-8ef1-60308b9073df-cabundle0\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:11.435913 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.435886 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5dl7\" (UniqueName: \"kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-kube-api-access-k5dl7\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:11.927138 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:11.927099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:11.927352 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:11.927243 2573 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:34:11.927352 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:11.927275 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:34:11.927352 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:11.927284 2573 projected.go:194] Error preparing data for projected 
volume certificates for pod openshift-keda/keda-operator-ffbb595cb-nwz5t: references non-existent secret key: ca.crt Apr 24 21:34:11.927352 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:11.927349 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates podName:3b7f82a5-8d1a-4493-8ef1-60308b9073df nodeName:}" failed. No retries permitted until 2026-04-24 21:34:12.927333019 +0000 UTC m=+438.207180778 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates") pod "keda-operator-ffbb595cb-nwz5t" (UID: "3b7f82a5-8d1a-4493-8ef1-60308b9073df") : references non-existent secret key: ca.crt Apr 24 21:34:12.938269 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:12.938241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:12.938716 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:12.938390 2573 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:34:12.938716 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:12.938405 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:34:12.938716 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:12.938414 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-nwz5t: references non-existent secret key: ca.crt Apr 24 21:34:12.938716 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:12.938467 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates podName:3b7f82a5-8d1a-4493-8ef1-60308b9073df nodeName:}" failed. No retries permitted until 2026-04-24 21:34:14.938450034 +0000 UTC m=+440.218297805 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates") pod "keda-operator-ffbb595cb-nwz5t" (UID: "3b7f82a5-8d1a-4493-8ef1-60308b9073df") : references non-existent secret key: ca.crt Apr 24 21:34:14.954184 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:14.954101 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:14.954530 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:14.954228 2573 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:34:14.954530 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:14.954243 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:34:14.954530 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:14.954251 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-nwz5t: references non-existent secret key: ca.crt Apr 24 21:34:14.954530 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:34:14.954310 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates podName:3b7f82a5-8d1a-4493-8ef1-60308b9073df nodeName:}" failed. No retries permitted until 2026-04-24 21:34:18.954293551 +0000 UTC m=+444.234141312 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates") pod "keda-operator-ffbb595cb-nwz5t" (UID: "3b7f82a5-8d1a-4493-8ef1-60308b9073df") : references non-existent secret key: ca.crt Apr 24 21:34:18.987448 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:18.987407 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:18.989948 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:18.989901 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3b7f82a5-8d1a-4493-8ef1-60308b9073df-certificates\") pod \"keda-operator-ffbb595cb-nwz5t\" (UID: \"3b7f82a5-8d1a-4493-8ef1-60308b9073df\") " pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:19.031716 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:19.031671 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:19.156547 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:19.156519 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-nwz5t"] Apr 24 21:34:19.158541 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:34:19.158508 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b7f82a5_8d1a_4493_8ef1_60308b9073df.slice/crio-14b2b1b52e68782b413171555b58005f67a26544c6fbb9112aff3bb54199a076 WatchSource:0}: Error finding container 14b2b1b52e68782b413171555b58005f67a26544c6fbb9112aff3bb54199a076: Status 404 returned error can't find the container with id 14b2b1b52e68782b413171555b58005f67a26544c6fbb9112aff3bb54199a076 Apr 24 21:34:19.834512 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:19.834471 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" event={"ID":"3b7f82a5-8d1a-4493-8ef1-60308b9073df","Type":"ContainerStarted","Data":"14b2b1b52e68782b413171555b58005f67a26544c6fbb9112aff3bb54199a076"} Apr 24 21:34:23.848555 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:23.848518 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" event={"ID":"3b7f82a5-8d1a-4493-8ef1-60308b9073df","Type":"ContainerStarted","Data":"275295ed22dd1b6e14e2a21ee21d405f3495b77884b76fe17344c117ccb72c4e"} Apr 24 21:34:23.849029 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:23.848615 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:34:23.867440 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:23.867384 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" podStartSLOduration=9.058735343 podStartE2EDuration="12.867369158s" 
podCreationTimestamp="2026-04-24 21:34:11 +0000 UTC" firstStartedPulling="2026-04-24 21:34:19.159770602 +0000 UTC m=+444.439618360" lastFinishedPulling="2026-04-24 21:34:22.968404417 +0000 UTC m=+448.248252175" observedRunningTime="2026-04-24 21:34:23.865517959 +0000 UTC m=+449.145365740" watchObservedRunningTime="2026-04-24 21:34:23.867369158 +0000 UTC m=+449.147216938" Apr 24 21:34:44.854175 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:44.854145 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-nwz5t" Apr 24 21:35:16.850877 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.850840 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84b6647887-5mqsv"] Apr 24 21:35:16.854324 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.854302 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-5mqsv" Apr 24 21:35:16.856814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.856782 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:35:16.857492 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.857472 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 21:35:16.857684 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.857477 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-fstsm\"" Apr 24 21:35:16.857828 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.857493 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:35:16.864150 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.864130 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve/kserve-controller-manager-84b6647887-5mqsv"] Apr 24 21:35:16.911008 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.910963 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-48bcc"] Apr 24 21:35:16.914417 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.914397 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-48bcc" Apr 24 21:35:16.917067 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.917043 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-hlqpc\"" Apr 24 21:35:16.918276 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.918253 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:35:16.921023 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.921002 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-48bcc"] Apr 24 21:35:16.987800 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.987765 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtbvd\" (UniqueName: \"kubernetes.io/projected/43e51c33-1433-4f75-95b7-aac90eba0279-kube-api-access-vtbvd\") pod \"seaweedfs-86cc847c5c-48bcc\" (UID: \"43e51c33-1433-4f75-95b7-aac90eba0279\") " pod="kserve/seaweedfs-86cc847c5c-48bcc" Apr 24 21:35:16.987800 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.987809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f7q4\" (UniqueName: \"kubernetes.io/projected/deeaf18b-d5a7-47c9-9983-ca0643a74a44-kube-api-access-9f7q4\") pod \"kserve-controller-manager-84b6647887-5mqsv\" (UID: \"deeaf18b-d5a7-47c9-9983-ca0643a74a44\") " pod="kserve/kserve-controller-manager-84b6647887-5mqsv" Apr 24 21:35:16.988166 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.987910 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deeaf18b-d5a7-47c9-9983-ca0643a74a44-cert\") pod \"kserve-controller-manager-84b6647887-5mqsv\" (UID: \"deeaf18b-d5a7-47c9-9983-ca0643a74a44\") " pod="kserve/kserve-controller-manager-84b6647887-5mqsv" Apr 24 21:35:16.988166 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:16.987971 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/43e51c33-1433-4f75-95b7-aac90eba0279-data\") pod \"seaweedfs-86cc847c5c-48bcc\" (UID: \"43e51c33-1433-4f75-95b7-aac90eba0279\") " pod="kserve/seaweedfs-86cc847c5c-48bcc" Apr 24 21:35:17.089185 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:17.089155 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtbvd\" (UniqueName: \"kubernetes.io/projected/43e51c33-1433-4f75-95b7-aac90eba0279-kube-api-access-vtbvd\") pod \"seaweedfs-86cc847c5c-48bcc\" (UID: \"43e51c33-1433-4f75-95b7-aac90eba0279\") " pod="kserve/seaweedfs-86cc847c5c-48bcc" Apr 24 21:35:17.089350 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:17.089197 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9f7q4\" (UniqueName: \"kubernetes.io/projected/deeaf18b-d5a7-47c9-9983-ca0643a74a44-kube-api-access-9f7q4\") pod \"kserve-controller-manager-84b6647887-5mqsv\" (UID: \"deeaf18b-d5a7-47c9-9983-ca0643a74a44\") " pod="kserve/kserve-controller-manager-84b6647887-5mqsv" Apr 24 21:35:17.089350 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:17.089252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deeaf18b-d5a7-47c9-9983-ca0643a74a44-cert\") pod \"kserve-controller-manager-84b6647887-5mqsv\" (UID: \"deeaf18b-d5a7-47c9-9983-ca0643a74a44\") " 
pod="kserve/kserve-controller-manager-84b6647887-5mqsv" Apr 24 21:35:17.089350 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:17.089277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/43e51c33-1433-4f75-95b7-aac90eba0279-data\") pod \"seaweedfs-86cc847c5c-48bcc\" (UID: \"43e51c33-1433-4f75-95b7-aac90eba0279\") " pod="kserve/seaweedfs-86cc847c5c-48bcc" Apr 24 21:35:17.089701 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:17.089669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/43e51c33-1433-4f75-95b7-aac90eba0279-data\") pod \"seaweedfs-86cc847c5c-48bcc\" (UID: \"43e51c33-1433-4f75-95b7-aac90eba0279\") " pod="kserve/seaweedfs-86cc847c5c-48bcc" Apr 24 21:35:17.091518 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:17.091491 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deeaf18b-d5a7-47c9-9983-ca0643a74a44-cert\") pod \"kserve-controller-manager-84b6647887-5mqsv\" (UID: \"deeaf18b-d5a7-47c9-9983-ca0643a74a44\") " pod="kserve/kserve-controller-manager-84b6647887-5mqsv" Apr 24 21:35:17.103121 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:17.103067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtbvd\" (UniqueName: \"kubernetes.io/projected/43e51c33-1433-4f75-95b7-aac90eba0279-kube-api-access-vtbvd\") pod \"seaweedfs-86cc847c5c-48bcc\" (UID: \"43e51c33-1433-4f75-95b7-aac90eba0279\") " pod="kserve/seaweedfs-86cc847c5c-48bcc" Apr 24 21:35:17.123045 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:17.123025 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f7q4\" (UniqueName: \"kubernetes.io/projected/deeaf18b-d5a7-47c9-9983-ca0643a74a44-kube-api-access-9f7q4\") pod \"kserve-controller-manager-84b6647887-5mqsv\" (UID: \"deeaf18b-d5a7-47c9-9983-ca0643a74a44\") 
" pod="kserve/kserve-controller-manager-84b6647887-5mqsv" Apr 24 21:35:17.168162 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:17.168128 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-5mqsv" Apr 24 21:35:17.227266 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:17.227168 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-48bcc" Apr 24 21:35:17.310157 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:17.309980 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-5mqsv"] Apr 24 21:35:17.312275 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:35:17.312192 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeeaf18b_d5a7_47c9_9983_ca0643a74a44.slice/crio-7bd168c6fde8468ec3515e6dab5d88e96f05f67f77db5468548ba5dd531f93ea WatchSource:0}: Error finding container 7bd168c6fde8468ec3515e6dab5d88e96f05f67f77db5468548ba5dd531f93ea: Status 404 returned error can't find the container with id 7bd168c6fde8468ec3515e6dab5d88e96f05f67f77db5468548ba5dd531f93ea Apr 24 21:35:17.368873 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:17.368846 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-48bcc"] Apr 24 21:35:17.369424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:35:17.369401 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43e51c33_1433_4f75_95b7_aac90eba0279.slice/crio-36a1f75d115299777f9de587fc7e4c66970803f870e7dabde2f1cdab54ea4811 WatchSource:0}: Error finding container 36a1f75d115299777f9de587fc7e4c66970803f870e7dabde2f1cdab54ea4811: Status 404 returned error can't find the container with id 36a1f75d115299777f9de587fc7e4c66970803f870e7dabde2f1cdab54ea4811 Apr 24 21:35:18.006113 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:35:18.006072 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-48bcc" event={"ID":"43e51c33-1433-4f75-95b7-aac90eba0279","Type":"ContainerStarted","Data":"36a1f75d115299777f9de587fc7e4c66970803f870e7dabde2f1cdab54ea4811"} Apr 24 21:35:18.008164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:18.008131 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-5mqsv" event={"ID":"deeaf18b-d5a7-47c9-9983-ca0643a74a44","Type":"ContainerStarted","Data":"7bd168c6fde8468ec3515e6dab5d88e96f05f67f77db5468548ba5dd531f93ea"} Apr 24 21:35:21.020446 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:21.020412 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-48bcc" event={"ID":"43e51c33-1433-4f75-95b7-aac90eba0279","Type":"ContainerStarted","Data":"1f0446fd8e128f3fb51dac7eb2f4bb38eefaad5dc82e7f48364b5e3f6db0b2dd"} Apr 24 21:35:21.020895 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:21.020533 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-48bcc" Apr 24 21:35:21.021844 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:21.021822 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-5mqsv" event={"ID":"deeaf18b-d5a7-47c9-9983-ca0643a74a44","Type":"ContainerStarted","Data":"caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f"} Apr 24 21:35:21.021943 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:21.021893 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84b6647887-5mqsv" Apr 24 21:35:21.036469 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:21.036421 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-48bcc" podStartSLOduration=1.5071811240000001 podStartE2EDuration="5.036404659s" 
podCreationTimestamp="2026-04-24 21:35:16 +0000 UTC" firstStartedPulling="2026-04-24 21:35:17.370671809 +0000 UTC m=+502.650519567" lastFinishedPulling="2026-04-24 21:35:20.899895341 +0000 UTC m=+506.179743102" observedRunningTime="2026-04-24 21:35:21.034886244 +0000 UTC m=+506.314734036" watchObservedRunningTime="2026-04-24 21:35:21.036404659 +0000 UTC m=+506.316252440" Apr 24 21:35:21.050647 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:21.050593 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84b6647887-5mqsv" podStartSLOduration=1.55905681 podStartE2EDuration="5.050575231s" podCreationTimestamp="2026-04-24 21:35:16 +0000 UTC" firstStartedPulling="2026-04-24 21:35:17.313801629 +0000 UTC m=+502.593649387" lastFinishedPulling="2026-04-24 21:35:20.805320047 +0000 UTC m=+506.085167808" observedRunningTime="2026-04-24 21:35:21.049464368 +0000 UTC m=+506.329312160" watchObservedRunningTime="2026-04-24 21:35:21.050575231 +0000 UTC m=+506.330423012" Apr 24 21:35:27.027692 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:27.027659 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-48bcc" Apr 24 21:35:51.523890 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.523802 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-5mqsv"] Apr 24 21:35:51.524453 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.524152 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-84b6647887-5mqsv" podUID="deeaf18b-d5a7-47c9-9983-ca0643a74a44" containerName="manager" containerID="cri-o://caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f" gracePeriod=10 Apr 24 21:35:51.529039 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.529014 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve/kserve-controller-manager-84b6647887-5mqsv" Apr 24 21:35:51.550854 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.550831 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84b6647887-nxnwd"] Apr 24 21:35:51.554563 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.554546 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-nxnwd" Apr 24 21:35:51.566744 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.566718 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-nxnwd"] Apr 24 21:35:51.580934 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.580893 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q25bz\" (UniqueName: \"kubernetes.io/projected/38ce435e-8954-419b-90ed-d616f45f2f59-kube-api-access-q25bz\") pod \"kserve-controller-manager-84b6647887-nxnwd\" (UID: \"38ce435e-8954-419b-90ed-d616f45f2f59\") " pod="kserve/kserve-controller-manager-84b6647887-nxnwd" Apr 24 21:35:51.581071 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.580959 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38ce435e-8954-419b-90ed-d616f45f2f59-cert\") pod \"kserve-controller-manager-84b6647887-nxnwd\" (UID: \"38ce435e-8954-419b-90ed-d616f45f2f59\") " pod="kserve/kserve-controller-manager-84b6647887-nxnwd" Apr 24 21:35:51.682580 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.682542 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q25bz\" (UniqueName: \"kubernetes.io/projected/38ce435e-8954-419b-90ed-d616f45f2f59-kube-api-access-q25bz\") pod \"kserve-controller-manager-84b6647887-nxnwd\" (UID: \"38ce435e-8954-419b-90ed-d616f45f2f59\") " pod="kserve/kserve-controller-manager-84b6647887-nxnwd" 
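Annotation: the "Observed pod startup duration" entries earlier in this log can be reproduced from their own fields. The arithmetic implies that podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes the image-pull window (lastFinishedPulling minus firstStartedPulling). A minimal sketch using the keda-operator-ffbb595cb-nwz5t values; the field semantics here are inferred from this log's numbers, not quoted from kubelet documentation.

```python
# Seconds within minute 21:34, copied from the keda-operator startup entry above.
created          = 11.0            # podCreationTimestamp 2026-04-24 21:34:11
first_pull       = 19.159770602    # firstStartedPulling
last_pull        = 22.968404417    # lastFinishedPulling
observed_running = 23.867369158    # watchObservedRunningTime

e2e = observed_running - created           # podStartE2EDuration = 12.867369158s
slo = e2e - (last_pull - first_pull)       # podStartSLOduration excludes pull time
print(f"E2E={e2e:.9f}s SLO={slo:.9f}s")
```

The same relation holds for the seaweedfs-86cc847c5c-48bcc entry (5.036404659s E2E, 1.507181124s SLO), which is why the SLO figure is always the smaller of the two when an image pull occurred.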
Apr 24 21:35:51.682755 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.682595 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38ce435e-8954-419b-90ed-d616f45f2f59-cert\") pod \"kserve-controller-manager-84b6647887-nxnwd\" (UID: \"38ce435e-8954-419b-90ed-d616f45f2f59\") " pod="kserve/kserve-controller-manager-84b6647887-nxnwd"
Apr 24 21:35:51.685029 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.685003 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38ce435e-8954-419b-90ed-d616f45f2f59-cert\") pod \"kserve-controller-manager-84b6647887-nxnwd\" (UID: \"38ce435e-8954-419b-90ed-d616f45f2f59\") " pod="kserve/kserve-controller-manager-84b6647887-nxnwd"
Apr 24 21:35:51.691124 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.690867 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q25bz\" (UniqueName: \"kubernetes.io/projected/38ce435e-8954-419b-90ed-d616f45f2f59-kube-api-access-q25bz\") pod \"kserve-controller-manager-84b6647887-nxnwd\" (UID: \"38ce435e-8954-419b-90ed-d616f45f2f59\") " pod="kserve/kserve-controller-manager-84b6647887-nxnwd"
Apr 24 21:35:51.763473 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.763444 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-5mqsv"
Apr 24 21:35:51.783092 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.783024 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f7q4\" (UniqueName: \"kubernetes.io/projected/deeaf18b-d5a7-47c9-9983-ca0643a74a44-kube-api-access-9f7q4\") pod \"deeaf18b-d5a7-47c9-9983-ca0643a74a44\" (UID: \"deeaf18b-d5a7-47c9-9983-ca0643a74a44\") "
Apr 24 21:35:51.783092 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.783056 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deeaf18b-d5a7-47c9-9983-ca0643a74a44-cert\") pod \"deeaf18b-d5a7-47c9-9983-ca0643a74a44\" (UID: \"deeaf18b-d5a7-47c9-9983-ca0643a74a44\") "
Apr 24 21:35:51.785394 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.785360 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deeaf18b-d5a7-47c9-9983-ca0643a74a44-cert" (OuterVolumeSpecName: "cert") pod "deeaf18b-d5a7-47c9-9983-ca0643a74a44" (UID: "deeaf18b-d5a7-47c9-9983-ca0643a74a44"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:35:51.785525 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.785404 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deeaf18b-d5a7-47c9-9983-ca0643a74a44-kube-api-access-9f7q4" (OuterVolumeSpecName: "kube-api-access-9f7q4") pod "deeaf18b-d5a7-47c9-9983-ca0643a74a44" (UID: "deeaf18b-d5a7-47c9-9983-ca0643a74a44"). InnerVolumeSpecName "kube-api-access-9f7q4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:35:51.883960 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.883897 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9f7q4\" (UniqueName: \"kubernetes.io/projected/deeaf18b-d5a7-47c9-9983-ca0643a74a44-kube-api-access-9f7q4\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:35:51.883960 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.883953 2573 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deeaf18b-d5a7-47c9-9983-ca0643a74a44-cert\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:35:51.906615 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:51.906591 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-nxnwd"
Apr 24 21:35:52.027305 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:52.027280 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-nxnwd"]
Apr 24 21:35:52.029958 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:35:52.029929 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38ce435e_8954_419b_90ed_d616f45f2f59.slice/crio-4c94ffcd00efb67d775b2002bc63b1636ee5c1887e24d19e13d065bc1d4b5b70 WatchSource:0}: Error finding container 4c94ffcd00efb67d775b2002bc63b1636ee5c1887e24d19e13d065bc1d4b5b70: Status 404 returned error can't find the container with id 4c94ffcd00efb67d775b2002bc63b1636ee5c1887e24d19e13d065bc1d4b5b70
Apr 24 21:35:52.124562 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:52.124532 2573 generic.go:358] "Generic (PLEG): container finished" podID="deeaf18b-d5a7-47c9-9983-ca0643a74a44" containerID="caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f" exitCode=0
Apr 24 21:35:52.124763 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:52.124601 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-5mqsv"
Apr 24 21:35:52.124763 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:52.124607 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-5mqsv" event={"ID":"deeaf18b-d5a7-47c9-9983-ca0643a74a44","Type":"ContainerDied","Data":"caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f"}
Apr 24 21:35:52.124763 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:52.124657 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-5mqsv" event={"ID":"deeaf18b-d5a7-47c9-9983-ca0643a74a44","Type":"ContainerDied","Data":"7bd168c6fde8468ec3515e6dab5d88e96f05f67f77db5468548ba5dd531f93ea"}
Apr 24 21:35:52.124763 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:52.124677 2573 scope.go:117] "RemoveContainer" containerID="caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f"
Apr 24 21:35:52.125701 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:52.125677 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-nxnwd" event={"ID":"38ce435e-8954-419b-90ed-d616f45f2f59","Type":"ContainerStarted","Data":"4c94ffcd00efb67d775b2002bc63b1636ee5c1887e24d19e13d065bc1d4b5b70"}
Apr 24 21:35:52.132490 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:52.132467 2573 scope.go:117] "RemoveContainer" containerID="caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f"
Apr 24 21:35:52.132745 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:35:52.132721 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f\": container with ID starting with caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f not found: ID does not exist" containerID="caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f"
Apr 24 21:35:52.132838 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:52.132756 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f"} err="failed to get container status \"caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f\": rpc error: code = NotFound desc = could not find container \"caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f\": container with ID starting with caf2bd9606d1863fd5bab26198a67e042960875c70a92f26c6a45b23847fc50f not found: ID does not exist"
Apr 24 21:35:52.145710 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:52.145689 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-5mqsv"]
Apr 24 21:35:52.149509 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:52.149489 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-5mqsv"]
Apr 24 21:35:53.131211 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:53.131173 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-nxnwd" event={"ID":"38ce435e-8954-419b-90ed-d616f45f2f59","Type":"ContainerStarted","Data":"332cf5252a7a102d3816174f1508d4be015e4f061bf0a5c7e2bcbac6785d80c6"}
Apr 24 21:35:53.131565 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:53.131228 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84b6647887-nxnwd"
Apr 24 21:35:53.149364 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:53.149317 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84b6647887-nxnwd" podStartSLOduration=1.8071918139999998 podStartE2EDuration="2.149303241s" podCreationTimestamp="2026-04-24 21:35:51 +0000 UTC" firstStartedPulling="2026-04-24 21:35:52.031219632 +0000 UTC m=+537.311067390" lastFinishedPulling="2026-04-24 21:35:52.373331058 +0000 UTC m=+537.653178817" observedRunningTime="2026-04-24 21:35:53.148012784 +0000 UTC m=+538.427860561" watchObservedRunningTime="2026-04-24 21:35:53.149303241 +0000 UTC m=+538.429151021"
Apr 24 21:35:53.372875 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:53.372840 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deeaf18b-d5a7-47c9-9983-ca0643a74a44" path="/var/lib/kubelet/pods/deeaf18b-d5a7-47c9-9983-ca0643a74a44/volumes"
Apr 24 21:36:24.140149 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:24.140117 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84b6647887-nxnwd"
Apr 24 21:36:25.169850 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.169811 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-f8dp4"]
Apr 24 21:36:25.170226 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.170165 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="deeaf18b-d5a7-47c9-9983-ca0643a74a44" containerName="manager"
Apr 24 21:36:25.170226 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.170178 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="deeaf18b-d5a7-47c9-9983-ca0643a74a44" containerName="manager"
Apr 24 21:36:25.170226 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.170227 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="deeaf18b-d5a7-47c9-9983-ca0643a74a44" containerName="manager"
Apr 24 21:36:25.173125 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.173107 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-f8dp4"
Apr 24 21:36:25.185870 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.185846 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-82crs\""
Apr 24 21:36:25.204261 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.204235 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 24 21:36:25.217273 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.217247 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-f8dp4"]
Apr 24 21:36:25.242211 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.242178 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-dtvqv"]
Apr 24 21:36:25.245851 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.245834 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-dtvqv"
Apr 24 21:36:25.248454 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.248433 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 24 21:36:25.248606 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.248589 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-wl4m8\""
Apr 24 21:36:25.255944 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.255902 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5-tls-certs\") pod \"model-serving-api-86f7b4b499-f8dp4\" (UID: \"b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5\") " pod="kserve/model-serving-api-86f7b4b499-f8dp4"
Apr 24 21:36:25.256026 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.255975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjmtr\" (UniqueName: \"kubernetes.io/projected/b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5-kube-api-access-kjmtr\") pod \"model-serving-api-86f7b4b499-f8dp4\" (UID: \"b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5\") " pod="kserve/model-serving-api-86f7b4b499-f8dp4"
Apr 24 21:36:25.266967 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.266946 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-dtvqv"]
Apr 24 21:36:25.357226 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.357190 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5-tls-certs\") pod \"model-serving-api-86f7b4b499-f8dp4\" (UID: \"b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5\") " pod="kserve/model-serving-api-86f7b4b499-f8dp4"
Apr 24 21:36:25.357407 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.357241 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7xf2\" (UniqueName: \"kubernetes.io/projected/d5f7c2e0-03d7-4431-9edd-91dc7f3bf016-kube-api-access-m7xf2\") pod \"odh-model-controller-696fc77849-dtvqv\" (UID: \"d5f7c2e0-03d7-4431-9edd-91dc7f3bf016\") " pod="kserve/odh-model-controller-696fc77849-dtvqv"
Apr 24 21:36:25.357407 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.357277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjmtr\" (UniqueName: \"kubernetes.io/projected/b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5-kube-api-access-kjmtr\") pod \"model-serving-api-86f7b4b499-f8dp4\" (UID: \"b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5\") " pod="kserve/model-serving-api-86f7b4b499-f8dp4"
Apr 24 21:36:25.357407 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.357312 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5f7c2e0-03d7-4431-9edd-91dc7f3bf016-cert\") pod \"odh-model-controller-696fc77849-dtvqv\" (UID: \"d5f7c2e0-03d7-4431-9edd-91dc7f3bf016\") " pod="kserve/odh-model-controller-696fc77849-dtvqv"
Apr 24 21:36:25.357407 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:36:25.357339 2573 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found
Apr 24 21:36:25.357407 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:36:25.357401 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5-tls-certs podName:b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5 nodeName:}" failed. No retries permitted until 2026-04-24 21:36:25.857382615 +0000 UTC m=+571.137230373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5-tls-certs") pod "model-serving-api-86f7b4b499-f8dp4" (UID: "b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5") : secret "model-serving-api-tls" not found
Apr 24 21:36:25.370642 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.370620 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjmtr\" (UniqueName: \"kubernetes.io/projected/b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5-kube-api-access-kjmtr\") pod \"model-serving-api-86f7b4b499-f8dp4\" (UID: \"b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5\") " pod="kserve/model-serving-api-86f7b4b499-f8dp4"
Apr 24 21:36:25.458125 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.458043 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7xf2\" (UniqueName: \"kubernetes.io/projected/d5f7c2e0-03d7-4431-9edd-91dc7f3bf016-kube-api-access-m7xf2\") pod \"odh-model-controller-696fc77849-dtvqv\" (UID: \"d5f7c2e0-03d7-4431-9edd-91dc7f3bf016\") " pod="kserve/odh-model-controller-696fc77849-dtvqv"
Apr 24 21:36:25.458125 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.458090 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5f7c2e0-03d7-4431-9edd-91dc7f3bf016-cert\") pod \"odh-model-controller-696fc77849-dtvqv\" (UID: \"d5f7c2e0-03d7-4431-9edd-91dc7f3bf016\") " pod="kserve/odh-model-controller-696fc77849-dtvqv"
Apr 24 21:36:25.460385 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.460358 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5f7c2e0-03d7-4431-9edd-91dc7f3bf016-cert\") pod \"odh-model-controller-696fc77849-dtvqv\" (UID: \"d5f7c2e0-03d7-4431-9edd-91dc7f3bf016\") " pod="kserve/odh-model-controller-696fc77849-dtvqv"
Apr 24 21:36:25.470295 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.470274 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7xf2\" (UniqueName: \"kubernetes.io/projected/d5f7c2e0-03d7-4431-9edd-91dc7f3bf016-kube-api-access-m7xf2\") pod \"odh-model-controller-696fc77849-dtvqv\" (UID: \"d5f7c2e0-03d7-4431-9edd-91dc7f3bf016\") " pod="kserve/odh-model-controller-696fc77849-dtvqv"
Apr 24 21:36:25.556511 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.556470 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-dtvqv"
Apr 24 21:36:25.686414 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.686387 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-dtvqv"]
Apr 24 21:36:25.689054 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:36:25.689024 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5f7c2e0_03d7_4431_9edd_91dc7f3bf016.slice/crio-22d42fc15ff1f4a70b83bf952b5fae10fad1dea0ee4794ae38b7481893aad3eb WatchSource:0}: Error finding container 22d42fc15ff1f4a70b83bf952b5fae10fad1dea0ee4794ae38b7481893aad3eb: Status 404 returned error can't find the container with id 22d42fc15ff1f4a70b83bf952b5fae10fad1dea0ee4794ae38b7481893aad3eb
Apr 24 21:36:25.862427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.862396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5-tls-certs\") pod \"model-serving-api-86f7b4b499-f8dp4\" (UID: \"b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5\") " pod="kserve/model-serving-api-86f7b4b499-f8dp4"
Apr 24 21:36:25.864758 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.864739 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5-tls-certs\") pod \"model-serving-api-86f7b4b499-f8dp4\" (UID: \"b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5\") " pod="kserve/model-serving-api-86f7b4b499-f8dp4"
Apr 24 21:36:26.083521 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:26.083486 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-f8dp4"
Apr 24 21:36:26.233746 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:26.233709 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-dtvqv" event={"ID":"d5f7c2e0-03d7-4431-9edd-91dc7f3bf016","Type":"ContainerStarted","Data":"22d42fc15ff1f4a70b83bf952b5fae10fad1dea0ee4794ae38b7481893aad3eb"}
Apr 24 21:36:26.270511 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:26.270469 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-f8dp4"]
Apr 24 21:36:26.272049 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:36:26.272009 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb042dc72_bdf2_4b0a_9f5a_c15cd5f6f1f5.slice/crio-0cd91af75c4ea147c05cef6f5e20be2d8fa4d356370f6e8575fcf3aa35f9aa83 WatchSource:0}: Error finding container 0cd91af75c4ea147c05cef6f5e20be2d8fa4d356370f6e8575fcf3aa35f9aa83: Status 404 returned error can't find the container with id 0cd91af75c4ea147c05cef6f5e20be2d8fa4d356370f6e8575fcf3aa35f9aa83
Apr 24 21:36:27.239705 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:27.239661 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-f8dp4" event={"ID":"b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5","Type":"ContainerStarted","Data":"0cd91af75c4ea147c05cef6f5e20be2d8fa4d356370f6e8575fcf3aa35f9aa83"}
Apr 24 21:36:29.248766 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:29.248727 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-dtvqv" event={"ID":"d5f7c2e0-03d7-4431-9edd-91dc7f3bf016","Type":"ContainerStarted","Data":"9556e3ad9bc0c2824a44229ff6e6cf8ded6c2e49f91cd2f334de20af291c565c"}
Apr 24 21:36:29.249260 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:29.248809 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-dtvqv"
Apr 24 21:36:29.250045 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:29.250023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-f8dp4" event={"ID":"b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5","Type":"ContainerStarted","Data":"d7d83cbc77327e785fc57407744f534718704861c4c91ddaa3459d74d4d9747d"}
Apr 24 21:36:29.250147 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:29.250127 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-f8dp4"
Apr 24 21:36:29.266846 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:29.266791 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-dtvqv" podStartSLOduration=1.385901384 podStartE2EDuration="4.266780379s" podCreationTimestamp="2026-04-24 21:36:25 +0000 UTC" firstStartedPulling="2026-04-24 21:36:25.690291953 +0000 UTC m=+570.970139715" lastFinishedPulling="2026-04-24 21:36:28.571170939 +0000 UTC m=+573.851018710" observedRunningTime="2026-04-24 21:36:29.264766558 +0000 UTC m=+574.544614336" watchObservedRunningTime="2026-04-24 21:36:29.266780379 +0000 UTC m=+574.546628158"
Apr 24 21:36:29.286173 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:29.286126 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-f8dp4" podStartSLOduration=1.9536847000000002 podStartE2EDuration="4.286112229s" podCreationTimestamp="2026-04-24 21:36:25 +0000 UTC" firstStartedPulling="2026-04-24 21:36:26.274337965 +0000 UTC m=+571.554185723" lastFinishedPulling="2026-04-24 21:36:28.606765491 +0000 UTC m=+573.886613252" observedRunningTime="2026-04-24 21:36:29.284767334 +0000 UTC m=+574.564615115" watchObservedRunningTime="2026-04-24 21:36:29.286112229 +0000 UTC m=+574.565960010"
Apr 24 21:36:30.041457 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.041417 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cddbf496b-qxs8r"]
Apr 24 21:36:30.044779 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.044755 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.053981 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.053957 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cddbf496b-qxs8r"]
Apr 24 21:36:30.202723 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.202683 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1a97060-50fd-41b0-abc2-a5a8a845b124-oauth-serving-cert\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.202723 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.202727 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1a97060-50fd-41b0-abc2-a5a8a845b124-service-ca\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.202946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.202748 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1a97060-50fd-41b0-abc2-a5a8a845b124-console-oauth-config\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.202946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.202806 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a97060-50fd-41b0-abc2-a5a8a845b124-console-serving-cert\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.202946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.202836 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1a97060-50fd-41b0-abc2-a5a8a845b124-trusted-ca-bundle\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.202946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.202862 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xngbd\" (UniqueName: \"kubernetes.io/projected/a1a97060-50fd-41b0-abc2-a5a8a845b124-kube-api-access-xngbd\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.203079 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.202953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1a97060-50fd-41b0-abc2-a5a8a845b124-console-config\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.303775 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.303687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1a97060-50fd-41b0-abc2-a5a8a845b124-console-config\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.303775 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.303751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1a97060-50fd-41b0-abc2-a5a8a845b124-oauth-serving-cert\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.303775 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.303772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1a97060-50fd-41b0-abc2-a5a8a845b124-service-ca\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.304416 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.303793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1a97060-50fd-41b0-abc2-a5a8a845b124-console-oauth-config\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.304416 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.303816 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a97060-50fd-41b0-abc2-a5a8a845b124-console-serving-cert\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.304416 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.303833 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1a97060-50fd-41b0-abc2-a5a8a845b124-trusted-ca-bundle\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.304416 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.303856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xngbd\" (UniqueName: \"kubernetes.io/projected/a1a97060-50fd-41b0-abc2-a5a8a845b124-kube-api-access-xngbd\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.304628 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.304530 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1a97060-50fd-41b0-abc2-a5a8a845b124-console-config\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.304715 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.304669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1a97060-50fd-41b0-abc2-a5a8a845b124-trusted-ca-bundle\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.304715 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.304670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1a97060-50fd-41b0-abc2-a5a8a845b124-oauth-serving-cert\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.305123 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.305101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1a97060-50fd-41b0-abc2-a5a8a845b124-service-ca\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.306936 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.306884 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1a97060-50fd-41b0-abc2-a5a8a845b124-console-oauth-config\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.307034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.306987 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a97060-50fd-41b0-abc2-a5a8a845b124-console-serving-cert\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.315330 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.315309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xngbd\" (UniqueName: \"kubernetes.io/projected/a1a97060-50fd-41b0-abc2-a5a8a845b124-kube-api-access-xngbd\") pod \"console-6cddbf496b-qxs8r\" (UID: \"a1a97060-50fd-41b0-abc2-a5a8a845b124\") " pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.354772 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.354742 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:30.481459 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:30.481430 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cddbf496b-qxs8r"]
Apr 24 21:36:30.483190 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:36:30.483164 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a97060_50fd_41b0_abc2_a5a8a845b124.slice/crio-7dec7fc3b5d06fc5da62b9543cb1b5c31e28721c675144e5c96bb330e3bb0ed5 WatchSource:0}: Error finding container 7dec7fc3b5d06fc5da62b9543cb1b5c31e28721c675144e5c96bb330e3bb0ed5: Status 404 returned error can't find the container with id 7dec7fc3b5d06fc5da62b9543cb1b5c31e28721c675144e5c96bb330e3bb0ed5
Apr 24 21:36:31.258247 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:31.258214 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cddbf496b-qxs8r" event={"ID":"a1a97060-50fd-41b0-abc2-a5a8a845b124","Type":"ContainerStarted","Data":"74f6f9e0a9022d06be696c151bbce5f6a9b461a969f96ec8d81154b53d5adc32"}
Apr 24 21:36:31.258247 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:31.258253 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cddbf496b-qxs8r" event={"ID":"a1a97060-50fd-41b0-abc2-a5a8a845b124","Type":"ContainerStarted","Data":"7dec7fc3b5d06fc5da62b9543cb1b5c31e28721c675144e5c96bb330e3bb0ed5"}
Apr 24 21:36:31.280164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:31.280118 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cddbf496b-qxs8r" podStartSLOduration=1.280104704 podStartE2EDuration="1.280104704s" podCreationTimestamp="2026-04-24 21:36:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:36:31.277664338 +0000 UTC m=+576.557512118" watchObservedRunningTime="2026-04-24 21:36:31.280104704 +0000 UTC m=+576.559952484"
Apr 24 21:36:40.255456 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:40.255425 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-dtvqv"
Apr 24 21:36:40.257340 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:40.257315 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-f8dp4"
Apr 24 21:36:40.355768 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:40.355727 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:40.355768 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:40.355772 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:40.360683 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:40.360659 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:41.295649 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:41.295621 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cddbf496b-qxs8r"
Apr 24 21:36:41.357581 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:41.357542 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75b584996b-2qpgz"]
Apr 24 21:36:55.267789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.267758 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 21:36:55.268246 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.267831 2573 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log" Apr 24 21:37:01.106890 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.106853 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz"] Apr 24 21:37:01.110499 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.110466 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:37:01.112845 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.112815 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-914c2-predictor-serving-cert\"" Apr 24 21:37:01.113744 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.113598 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-6ztvm\"" Apr 24 21:37:01.113744 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.113616 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-914c2-kube-rbac-proxy-sar-config\"" Apr 24 21:37:01.113744 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.113642 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:37:01.113744 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.113641 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:37:01.120705 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.120681 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz"] Apr 24 21:37:01.139491 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.139454 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-914c2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-success-200-isvc-914c2-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-914c2-predictor-7db85f97bf-5zqtz\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:37:01.139649 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.139507 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zdnp\" (UniqueName: \"kubernetes.io/projected/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-kube-api-access-6zdnp\") pod \"success-200-isvc-914c2-predictor-7db85f97bf-5zqtz\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:37:01.139649 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.139588 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-proxy-tls\") pod \"success-200-isvc-914c2-predictor-7db85f97bf-5zqtz\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:37:01.207056 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.207024 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292"] Apr 24 21:37:01.210867 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.210844 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:37:01.213666 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.213350 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-914c2-predictor-serving-cert\"" Apr 24 21:37:01.213666 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.213422 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-914c2-kube-rbac-proxy-sar-config\"" Apr 24 21:37:01.221940 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.221767 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292"] Apr 24 21:37:01.240398 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.240362 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-914c2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-success-200-isvc-914c2-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-914c2-predictor-7db85f97bf-5zqtz\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:37:01.240565 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.240404 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-914c2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9476e4c7-3607-4bc3-90dd-ed599afc66d8-error-404-isvc-914c2-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-914c2-predictor-78956bf458-s7292\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:37:01.240565 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.240440 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6zdnp\" (UniqueName: \"kubernetes.io/projected/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-kube-api-access-6zdnp\") pod \"success-200-isvc-914c2-predictor-7db85f97bf-5zqtz\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:37:01.240565 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.240493 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9476e4c7-3607-4bc3-90dd-ed599afc66d8-proxy-tls\") pod \"error-404-isvc-914c2-predictor-78956bf458-s7292\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:37:01.240735 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.240557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-proxy-tls\") pod \"success-200-isvc-914c2-predictor-7db85f97bf-5zqtz\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:37:01.240735 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.240598 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf9cq\" (UniqueName: \"kubernetes.io/projected/9476e4c7-3607-4bc3-90dd-ed599afc66d8-kube-api-access-xf9cq\") pod \"error-404-isvc-914c2-predictor-78956bf458-s7292\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:37:01.240735 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:37:01.240661 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-914c2-predictor-serving-cert: secret "success-200-isvc-914c2-predictor-serving-cert" not found Apr 24 21:37:01.240735 
ip-10-0-136-201 kubenswrapper[2573]: E0424 21:37:01.240724 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-proxy-tls podName:49d0a4d9-3576-4587-ab0b-e2df7f7b45c8 nodeName:}" failed. No retries permitted until 2026-04-24 21:37:01.740703685 +0000 UTC m=+607.020551451 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-proxy-tls") pod "success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" (UID: "49d0a4d9-3576-4587-ab0b-e2df7f7b45c8") : secret "success-200-isvc-914c2-predictor-serving-cert" not found Apr 24 21:37:01.241117 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.241094 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-914c2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-success-200-isvc-914c2-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-914c2-predictor-7db85f97bf-5zqtz\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:37:01.259881 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.259850 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zdnp\" (UniqueName: \"kubernetes.io/projected/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-kube-api-access-6zdnp\") pod \"success-200-isvc-914c2-predictor-7db85f97bf-5zqtz\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:37:01.341337 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.341294 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xf9cq\" (UniqueName: \"kubernetes.io/projected/9476e4c7-3607-4bc3-90dd-ed599afc66d8-kube-api-access-xf9cq\") pod 
\"error-404-isvc-914c2-predictor-78956bf458-s7292\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:37:01.341539 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.341382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-914c2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9476e4c7-3607-4bc3-90dd-ed599afc66d8-error-404-isvc-914c2-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-914c2-predictor-78956bf458-s7292\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:37:01.341539 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.341433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9476e4c7-3607-4bc3-90dd-ed599afc66d8-proxy-tls\") pod \"error-404-isvc-914c2-predictor-78956bf458-s7292\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:37:01.341646 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:37:01.341553 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-914c2-predictor-serving-cert: secret "error-404-isvc-914c2-predictor-serving-cert" not found Apr 24 21:37:01.341646 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:37:01.341616 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9476e4c7-3607-4bc3-90dd-ed599afc66d8-proxy-tls podName:9476e4c7-3607-4bc3-90dd-ed599afc66d8 nodeName:}" failed. No retries permitted until 2026-04-24 21:37:01.841602589 +0000 UTC m=+607.121450347 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9476e4c7-3607-4bc3-90dd-ed599afc66d8-proxy-tls") pod "error-404-isvc-914c2-predictor-78956bf458-s7292" (UID: "9476e4c7-3607-4bc3-90dd-ed599afc66d8") : secret "error-404-isvc-914c2-predictor-serving-cert" not found Apr 24 21:37:01.342211 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.342185 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-914c2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9476e4c7-3607-4bc3-90dd-ed599afc66d8-error-404-isvc-914c2-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-914c2-predictor-78956bf458-s7292\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:37:01.350816 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.350788 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf9cq\" (UniqueName: \"kubernetes.io/projected/9476e4c7-3607-4bc3-90dd-ed599afc66d8-kube-api-access-xf9cq\") pod \"error-404-isvc-914c2-predictor-78956bf458-s7292\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:37:01.745811 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.745761 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-proxy-tls\") pod \"success-200-isvc-914c2-predictor-7db85f97bf-5zqtz\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:37:01.748287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.748249 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-proxy-tls\") pod 
\"success-200-isvc-914c2-predictor-7db85f97bf-5zqtz\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:37:01.846319 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.846284 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9476e4c7-3607-4bc3-90dd-ed599afc66d8-proxy-tls\") pod \"error-404-isvc-914c2-predictor-78956bf458-s7292\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:37:01.848710 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:01.848679 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9476e4c7-3607-4bc3-90dd-ed599afc66d8-proxy-tls\") pod \"error-404-isvc-914c2-predictor-78956bf458-s7292\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:37:02.023098 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:02.023066 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:37:02.127714 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:02.127101 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:37:02.258392 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:02.255792 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292"] Apr 24 21:37:02.357285 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:02.357259 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz"] Apr 24 21:37:02.359691 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:37:02.359659 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49d0a4d9_3576_4587_ab0b_e2df7f7b45c8.slice/crio-cb17259bedb7213982338f4dbdea369628a465cbc93667e99c603b37224105b3 WatchSource:0}: Error finding container cb17259bedb7213982338f4dbdea369628a465cbc93667e99c603b37224105b3: Status 404 returned error can't find the container with id cb17259bedb7213982338f4dbdea369628a465cbc93667e99c603b37224105b3 Apr 24 21:37:02.367484 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:02.367017 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" event={"ID":"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8","Type":"ContainerStarted","Data":"cb17259bedb7213982338f4dbdea369628a465cbc93667e99c603b37224105b3"} Apr 24 21:37:02.369153 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:02.369121 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" event={"ID":"9476e4c7-3607-4bc3-90dd-ed599afc66d8","Type":"ContainerStarted","Data":"f84c761fde5e99238a3a8d2172bb75c138f8103ade292372638878749d154c13"} Apr 24 21:37:06.378466 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.378422 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-console/console-75b584996b-2qpgz" podUID="d3dcfe4f-f129-4a28-98dd-db338ee798cd" containerName="console" containerID="cri-o://3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528" gracePeriod=15 Apr 24 21:37:06.773522 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.769122 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75b584996b-2qpgz_d3dcfe4f-f129-4a28-98dd-db338ee798cd/console/0.log" Apr 24 21:37:06.773522 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.769202 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75b584996b-2qpgz" Apr 24 21:37:06.801875 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.801840 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-config\") pod \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " Apr 24 21:37:06.802060 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.801896 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-service-ca\") pod \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " Apr 24 21:37:06.802060 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.801947 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-serving-cert\") pod \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " Apr 24 21:37:06.802060 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.801984 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkgwc\" (UniqueName: 
\"kubernetes.io/projected/d3dcfe4f-f129-4a28-98dd-db338ee798cd-kube-api-access-mkgwc\") pod \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " Apr 24 21:37:06.802060 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.802014 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-oauth-config\") pod \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " Apr 24 21:37:06.802275 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.802077 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-oauth-serving-cert\") pod \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " Apr 24 21:37:06.802275 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.802102 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-trusted-ca-bundle\") pod \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\" (UID: \"d3dcfe4f-f129-4a28-98dd-db338ee798cd\") " Apr 24 21:37:06.803164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.803129 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d3dcfe4f-f129-4a28-98dd-db338ee798cd" (UID: "d3dcfe4f-f129-4a28-98dd-db338ee798cd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:37:06.804103 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.804074 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-service-ca" (OuterVolumeSpecName: "service-ca") pod "d3dcfe4f-f129-4a28-98dd-db338ee798cd" (UID: "d3dcfe4f-f129-4a28-98dd-db338ee798cd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:37:06.804248 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.804221 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d3dcfe4f-f129-4a28-98dd-db338ee798cd" (UID: "d3dcfe4f-f129-4a28-98dd-db338ee798cd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:37:06.804371 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.804353 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-config" (OuterVolumeSpecName: "console-config") pod "d3dcfe4f-f129-4a28-98dd-db338ee798cd" (UID: "d3dcfe4f-f129-4a28-98dd-db338ee798cd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:37:06.811985 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.811943 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d3dcfe4f-f129-4a28-98dd-db338ee798cd" (UID: "d3dcfe4f-f129-4a28-98dd-db338ee798cd"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:37:06.819848 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.819812 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d3dcfe4f-f129-4a28-98dd-db338ee798cd" (UID: "d3dcfe4f-f129-4a28-98dd-db338ee798cd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:37:06.825344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.825299 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3dcfe4f-f129-4a28-98dd-db338ee798cd-kube-api-access-mkgwc" (OuterVolumeSpecName: "kube-api-access-mkgwc") pod "d3dcfe4f-f129-4a28-98dd-db338ee798cd" (UID: "d3dcfe4f-f129-4a28-98dd-db338ee798cd"). InnerVolumeSpecName "kube-api-access-mkgwc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:37:06.903131 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.902935 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:37:06.903131 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.902967 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-service-ca\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:37:06.903131 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.902982 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-serving-cert\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:37:06.903131 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:37:06.902997 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mkgwc\" (UniqueName: \"kubernetes.io/projected/d3dcfe4f-f129-4a28-98dd-db338ee798cd-kube-api-access-mkgwc\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:37:06.903131 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.903013 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3dcfe4f-f129-4a28-98dd-db338ee798cd-console-oauth-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:37:06.903131 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.903027 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-oauth-serving-cert\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:37:06.903131 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:06.903041 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3dcfe4f-f129-4a28-98dd-db338ee798cd-trusted-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:37:07.415531 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:07.415497 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75b584996b-2qpgz_d3dcfe4f-f129-4a28-98dd-db338ee798cd/console/0.log"
Apr 24 21:37:07.416026 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:07.415554 2573 generic.go:358] "Generic (PLEG): container finished" podID="d3dcfe4f-f129-4a28-98dd-db338ee798cd" containerID="3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528" exitCode=2
Apr 24 21:37:07.416026 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:07.415613 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75b584996b-2qpgz" event={"ID":"d3dcfe4f-f129-4a28-98dd-db338ee798cd","Type":"ContainerDied","Data":"3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528"}
Apr 24 21:37:07.416026 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:07.415643 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75b584996b-2qpgz" event={"ID":"d3dcfe4f-f129-4a28-98dd-db338ee798cd","Type":"ContainerDied","Data":"776181c1010f7d77a0254fadcbc7c1ebc46ccf3b0eec304bc6764df5328a1be7"}
Apr 24 21:37:07.416026 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:07.415665 2573 scope.go:117] "RemoveContainer" containerID="3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528"
Apr 24 21:37:07.416026 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:07.415842 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75b584996b-2qpgz"
Apr 24 21:37:07.435277 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:07.435223 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75b584996b-2qpgz"]
Apr 24 21:37:07.438394 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:07.438346 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-75b584996b-2qpgz"]
Apr 24 21:37:09.341882 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:09.341731 2573 scope.go:117] "RemoveContainer" containerID="3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528"
Apr 24 21:37:09.342230 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:37:09.342154 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528\": container with ID starting with 3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528 not found: ID does not exist" containerID="3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528"
Apr 24 21:37:09.342230 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:09.342196 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528"} err="failed to get container status \"3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528\": rpc error: code = NotFound desc = could not find container \"3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528\": container with ID starting with 3156d1b8b59db1f8153d877b56580eecb489406c37e9bcad98c8012520314528 not found: ID does not exist"
Apr 24 21:37:09.372115 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:09.372076 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3dcfe4f-f129-4a28-98dd-db338ee798cd" path="/var/lib/kubelet/pods/d3dcfe4f-f129-4a28-98dd-db338ee798cd/volumes"
Apr 24 21:37:18.465720 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:18.465623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" event={"ID":"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8","Type":"ContainerStarted","Data":"c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a"}
Apr 24 21:37:18.466957 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:18.466906 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" event={"ID":"9476e4c7-3607-4bc3-90dd-ed599afc66d8","Type":"ContainerStarted","Data":"82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05"}
Apr 24 21:37:21.486770 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:21.486726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" event={"ID":"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8","Type":"ContainerStarted","Data":"1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff"}
Apr 24 21:37:21.487284 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:21.486864 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz"
Apr 24 21:37:21.488691 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:21.488660 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" event={"ID":"9476e4c7-3607-4bc3-90dd-ed599afc66d8","Type":"ContainerStarted","Data":"5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30"}
Apr 24 21:37:21.488814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:21.488791 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292"
Apr 24 21:37:21.506111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:21.506066 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" podStartSLOduration=1.939085057 podStartE2EDuration="20.506053365s" podCreationTimestamp="2026-04-24 21:37:01 +0000 UTC" firstStartedPulling="2026-04-24 21:37:02.361562958 +0000 UTC m=+607.641410716" lastFinishedPulling="2026-04-24 21:37:20.928531266 +0000 UTC m=+626.208379024" observedRunningTime="2026-04-24 21:37:21.504383838 +0000 UTC m=+626.784231617" watchObservedRunningTime="2026-04-24 21:37:21.506053365 +0000 UTC m=+626.785901145"
Apr 24 21:37:21.521638 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:21.521589 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" podStartSLOduration=1.8642923040000001 podStartE2EDuration="20.521575166s" podCreationTimestamp="2026-04-24 21:37:01 +0000 UTC" firstStartedPulling="2026-04-24 21:37:02.263607619 +0000 UTC m=+607.543455380" lastFinishedPulling="2026-04-24 21:37:20.920890481 +0000 UTC m=+626.200738242" observedRunningTime="2026-04-24 21:37:21.520374689 +0000 UTC m=+626.800222469" watchObservedRunningTime="2026-04-24 21:37:21.521575166 +0000 UTC m=+626.801423004"
Apr 24 21:37:22.491976 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:22.491911 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz"
Apr 24 21:37:22.491976 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:22.491972 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292"
Apr 24 21:37:22.493062 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:22.492985 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 21:37:22.493178 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:22.492985 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 24 21:37:23.495645 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:23.495598 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 21:37:23.496045 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:23.495598 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 24 21:37:28.500667 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:28.500636 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz"
Apr 24 21:37:28.501165 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:28.501114 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292"
Apr 24 21:37:28.501215 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:28.501192 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 24 21:37:28.501504 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:28.501479 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 21:37:38.501343 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:38.501302 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 24 21:37:38.501782 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:38.501438 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 21:37:48.501348 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:48.501306 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 24 21:37:48.501737 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:48.501459 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 21:37:58.501883 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:58.501847 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 24 21:37:58.502309 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:58.501844 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 21:38:08.502054 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:08.502022 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292"
Apr 24 21:38:08.502466 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:08.502077 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz"
Apr 24 21:38:31.135183 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.135148 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292"]
Apr 24 21:38:31.135776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.135510 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kserve-container" containerID="cri-o://82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05" gracePeriod=30
Apr 24 21:38:31.135776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.135591 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kube-rbac-proxy" containerID="cri-o://5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30" gracePeriod=30
Apr 24 21:38:31.180339 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.180308 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz"]
Apr 24 21:38:31.180660 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.180621 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kserve-container" containerID="cri-o://c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a" gracePeriod=30
Apr 24 21:38:31.180773 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.180679 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kube-rbac-proxy" containerID="cri-o://1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff" gracePeriod=30
Apr 24 21:38:31.234939 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.234884 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"]
Apr 24 21:38:31.235290 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.235277 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3dcfe4f-f129-4a28-98dd-db338ee798cd" containerName="console"
Apr 24 21:38:31.235336 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.235291 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dcfe4f-f129-4a28-98dd-db338ee798cd" containerName="console"
Apr 24 21:38:31.235381 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.235372 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3dcfe4f-f129-4a28-98dd-db338ee798cd" containerName="console"
Apr 24 21:38:31.238938 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.238906 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:31.241224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.241202 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-c12fa-kube-rbac-proxy-sar-config\""
Apr 24 21:38:31.241334 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.241206 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-c12fa-predictor-serving-cert\""
Apr 24 21:38:31.247511 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.247290 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"]
Apr 24 21:38:31.265534 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.265507 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-proxy-tls\") pod \"success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:31.265692 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.265556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4h57\" (UniqueName: \"kubernetes.io/projected/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-kube-api-access-w4h57\") pod \"success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:31.265692 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.265615 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-c12fa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-success-200-isvc-c12fa-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:31.317258 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.317232 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"]
Apr 24 21:38:31.320546 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.320531 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:31.322736 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.322710 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-c12fa-predictor-serving-cert\""
Apr 24 21:38:31.322863 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.322832 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-c12fa-kube-rbac-proxy-sar-config\""
Apr 24 21:38:31.331095 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.331075 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"]
Apr 24 21:38:31.366200 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.366167 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4h57\" (UniqueName: \"kubernetes.io/projected/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-kube-api-access-w4h57\") pod \"success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:31.366368 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.366225 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-c12fa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-success-200-isvc-c12fa-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:31.366368 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.366262 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-proxy-tls\") pod \"error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:31.366368 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.366289 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8vlf\" (UniqueName: \"kubernetes.io/projected/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-kube-api-access-z8vlf\") pod \"error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:31.366575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.366368 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-c12fa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-error-404-isvc-c12fa-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:31.366575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.366473 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-proxy-tls\") pod \"success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:31.366686 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:38:31.366582 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-serving-cert: secret "success-200-isvc-c12fa-predictor-serving-cert" not found
Apr 24 21:38:31.366686 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:38:31.366638 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-proxy-tls podName:c57b7cd6-ea5d-4873-860c-1506f1b5eab9 nodeName:}" failed. No retries permitted until 2026-04-24 21:38:31.866619102 +0000 UTC m=+697.146466875 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-proxy-tls") pod "success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" (UID: "c57b7cd6-ea5d-4873-860c-1506f1b5eab9") : secret "success-200-isvc-c12fa-predictor-serving-cert" not found
Apr 24 21:38:31.367246 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.367216 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-c12fa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-success-200-isvc-c12fa-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:31.374783 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.374761 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4h57\" (UniqueName: \"kubernetes.io/projected/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-kube-api-access-w4h57\") pod \"success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:31.466977 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.466869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-proxy-tls\") pod \"error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:31.466977 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.466904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8vlf\" (UniqueName: \"kubernetes.io/projected/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-kube-api-access-z8vlf\") pod \"error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:31.466977 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.466969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-c12fa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-error-404-isvc-c12fa-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:31.467257 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:38:31.467039 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-serving-cert: secret "error-404-isvc-c12fa-predictor-serving-cert" not found
Apr 24 21:38:31.467257 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:38:31.467111 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-proxy-tls podName:e6b72a15-e99f-4e4c-98c3-6e87f0b7458d nodeName:}" failed. No retries permitted until 2026-04-24 21:38:31.967091016 +0000 UTC m=+697.246938773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-proxy-tls") pod "error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" (UID: "e6b72a15-e99f-4e4c-98c3-6e87f0b7458d") : secret "error-404-isvc-c12fa-predictor-serving-cert" not found
Apr 24 21:38:31.467859 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.467837 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-c12fa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-error-404-isvc-c12fa-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:31.475141 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.475111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8vlf\" (UniqueName: \"kubernetes.io/projected/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-kube-api-access-z8vlf\") pod \"error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:31.725400 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.725317 2573 generic.go:358] "Generic (PLEG): container finished" podID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerID="1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff" exitCode=2
Apr 24 21:38:31.725400 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.725384 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" event={"ID":"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8","Type":"ContainerDied","Data":"1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff"}
Apr 24 21:38:31.726868 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.726846 2573 generic.go:358] "Generic (PLEG): container finished" podID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerID="5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30" exitCode=2
Apr 24 21:38:31.726990 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.726886 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" event={"ID":"9476e4c7-3607-4bc3-90dd-ed599afc66d8","Type":"ContainerDied","Data":"5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30"}
Apr 24 21:38:31.870123 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.870087 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-proxy-tls\") pod \"success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:31.872397 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.872367 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-proxy-tls\") pod \"success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:31.971051 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.971023 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-proxy-tls\") pod \"error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:31.973361 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:31.973336 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-proxy-tls\") pod \"error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:32.152823 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.152791 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:32.234317 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.234268 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:32.285009 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.284945 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"]
Apr 24 21:38:32.288985 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:38:32.288954 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc57b7cd6_ea5d_4873_860c_1506f1b5eab9.slice/crio-64c8bd8db611663dd528c3b7018e38710929088b4b37cd07b489575568f113f1 WatchSource:0}: Error finding container 64c8bd8db611663dd528c3b7018e38710929088b4b37cd07b489575568f113f1: Status 404 returned error can't find the container with id 64c8bd8db611663dd528c3b7018e38710929088b4b37cd07b489575568f113f1
Apr 24 21:38:32.291152 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.291119 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:38:32.369979 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.369955 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"]
Apr 24 21:38:32.372384 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:38:32.372350 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6b72a15_e99f_4e4c_98c3_6e87f0b7458d.slice/crio-e605dc96ad28ac8f0883badc848b9587b897e71cc2ed823682e7b9b72f117d5b WatchSource:0}: Error finding container e605dc96ad28ac8f0883badc848b9587b897e71cc2ed823682e7b9b72f117d5b: Status 404 returned error can't find the container with id e605dc96ad28ac8f0883badc848b9587b897e71cc2ed823682e7b9b72f117d5b
Apr 24 21:38:32.732430 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.732333 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" event={"ID":"c57b7cd6-ea5d-4873-860c-1506f1b5eab9","Type":"ContainerStarted","Data":"96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db"}
Apr 24 21:38:32.732430 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.732378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" event={"ID":"c57b7cd6-ea5d-4873-860c-1506f1b5eab9","Type":"ContainerStarted","Data":"9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46"}
Apr 24 21:38:32.732430 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.732393 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" event={"ID":"c57b7cd6-ea5d-4873-860c-1506f1b5eab9","Type":"ContainerStarted","Data":"64c8bd8db611663dd528c3b7018e38710929088b4b37cd07b489575568f113f1"}
Apr 24 21:38:32.732707 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.732522 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:32.732707 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.732552 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:38:32.734074 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.734046 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" event={"ID":"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d","Type":"ContainerStarted","Data":"a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d"}
Apr 24 21:38:32.734202 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.734079 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" event={"ID":"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d","Type":"ContainerStarted","Data":"acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19"}
Apr 24 21:38:32.734202 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.734094 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" event={"ID":"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d","Type":"ContainerStarted","Data":"e605dc96ad28ac8f0883badc848b9587b897e71cc2ed823682e7b9b72f117d5b"}
Apr 24 21:38:32.734202 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.734144 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 21:38:32.734353 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.734322 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:32.734353 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.734346 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:38:32.735184 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.735165 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 24 21:38:32.751650 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.751602 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" podStartSLOduration=1.751589467 podStartE2EDuration="1.751589467s" podCreationTimestamp="2026-04-24 21:38:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:38:32.749571652 +0000 UTC m=+698.029419433" watchObservedRunningTime="2026-04-24 21:38:32.751589467 +0000 UTC m=+698.031437249"
Apr 24 21:38:32.767358 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:32.767316 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" podStartSLOduration=1.767303608 podStartE2EDuration="1.767303608s" podCreationTimestamp="2026-04-24 21:38:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:38:32.765552281 +0000 UTC m=+698.045400073" watchObservedRunningTime="2026-04-24 21:38:32.767303608 +0000 UTC m=+698.047151388"
Apr 24 21:38:33.495720 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:33.495674 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.36:8643/healthz\": dial tcp 10.134.0.36:8643: connect: connection refused"
Apr 24 21:38:33.496202 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:33.495691 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused"
Apr 24 21:38:33.737620 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:33.737574 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 24 21:38:33.737620 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:33.737603 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 21:38:34.505999 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.505971 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:38:34.595518 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.595482 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf9cq\" (UniqueName: \"kubernetes.io/projected/9476e4c7-3607-4bc3-90dd-ed599afc66d8-kube-api-access-xf9cq\") pod \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " Apr 24 21:38:34.595692 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.595668 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9476e4c7-3607-4bc3-90dd-ed599afc66d8-proxy-tls\") pod \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " Apr 24 21:38:34.595773 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.595757 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-914c2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9476e4c7-3607-4bc3-90dd-ed599afc66d8-error-404-isvc-914c2-kube-rbac-proxy-sar-config\") pod \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\" (UID: \"9476e4c7-3607-4bc3-90dd-ed599afc66d8\") " Apr 24 21:38:34.596073 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.596051 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9476e4c7-3607-4bc3-90dd-ed599afc66d8-error-404-isvc-914c2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-914c2-kube-rbac-proxy-sar-config") pod "9476e4c7-3607-4bc3-90dd-ed599afc66d8" (UID: "9476e4c7-3607-4bc3-90dd-ed599afc66d8"). InnerVolumeSpecName "error-404-isvc-914c2-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:38:34.597676 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.597655 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9476e4c7-3607-4bc3-90dd-ed599afc66d8-kube-api-access-xf9cq" (OuterVolumeSpecName: "kube-api-access-xf9cq") pod "9476e4c7-3607-4bc3-90dd-ed599afc66d8" (UID: "9476e4c7-3607-4bc3-90dd-ed599afc66d8"). InnerVolumeSpecName "kube-api-access-xf9cq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:38:34.597739 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.597703 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9476e4c7-3607-4bc3-90dd-ed599afc66d8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9476e4c7-3607-4bc3-90dd-ed599afc66d8" (UID: "9476e4c7-3607-4bc3-90dd-ed599afc66d8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:38:34.629400 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.629378 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:38:34.696688 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.696659 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zdnp\" (UniqueName: \"kubernetes.io/projected/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-kube-api-access-6zdnp\") pod \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " Apr 24 21:38:34.696884 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.696699 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-proxy-tls\") pod \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " Apr 24 21:38:34.696884 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.696772 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-914c2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-success-200-isvc-914c2-kube-rbac-proxy-sar-config\") pod \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\" (UID: \"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8\") " Apr 24 21:38:34.697079 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.697031 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xf9cq\" (UniqueName: \"kubernetes.io/projected/9476e4c7-3607-4bc3-90dd-ed599afc66d8-kube-api-access-xf9cq\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:38:34.697079 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.697054 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9476e4c7-3607-4bc3-90dd-ed599afc66d8-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:38:34.697079 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.697069 
2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-914c2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9476e4c7-3607-4bc3-90dd-ed599afc66d8-error-404-isvc-914c2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:38:34.697231 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.697158 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-success-200-isvc-914c2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-914c2-kube-rbac-proxy-sar-config") pod "49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" (UID: "49d0a4d9-3576-4587-ab0b-e2df7f7b45c8"). InnerVolumeSpecName "success-200-isvc-914c2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:38:34.698782 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.698761 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" (UID: "49d0a4d9-3576-4587-ab0b-e2df7f7b45c8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:38:34.698852 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.698803 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-kube-api-access-6zdnp" (OuterVolumeSpecName: "kube-api-access-6zdnp") pod "49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" (UID: "49d0a4d9-3576-4587-ab0b-e2df7f7b45c8"). InnerVolumeSpecName "kube-api-access-6zdnp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:38:34.742140 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.742113 2573 generic.go:358] "Generic (PLEG): container finished" podID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerID="c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a" exitCode=0 Apr 24 21:38:34.742275 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.742184 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" Apr 24 21:38:34.742275 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.742204 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" event={"ID":"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8","Type":"ContainerDied","Data":"c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a"} Apr 24 21:38:34.742275 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.742252 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz" event={"ID":"49d0a4d9-3576-4587-ab0b-e2df7f7b45c8","Type":"ContainerDied","Data":"cb17259bedb7213982338f4dbdea369628a465cbc93667e99c603b37224105b3"} Apr 24 21:38:34.742275 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.742276 2573 scope.go:117] "RemoveContainer" containerID="1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff" Apr 24 21:38:34.743738 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.743659 2573 generic.go:358] "Generic (PLEG): container finished" podID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerID="82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05" exitCode=0 Apr 24 21:38:34.743738 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.743687 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" 
event={"ID":"9476e4c7-3607-4bc3-90dd-ed599afc66d8","Type":"ContainerDied","Data":"82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05"} Apr 24 21:38:34.743738 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.743715 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" event={"ID":"9476e4c7-3607-4bc3-90dd-ed599afc66d8","Type":"ContainerDied","Data":"f84c761fde5e99238a3a8d2172bb75c138f8103ade292372638878749d154c13"} Apr 24 21:38:34.743738 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.743731 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292" Apr 24 21:38:34.751834 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.751817 2573 scope.go:117] "RemoveContainer" containerID="c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a" Apr 24 21:38:34.759653 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.759635 2573 scope.go:117] "RemoveContainer" containerID="1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff" Apr 24 21:38:34.759897 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:38:34.759881 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff\": container with ID starting with 1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff not found: ID does not exist" containerID="1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff" Apr 24 21:38:34.760673 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.759904 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff"} err="failed to get container status \"1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff\": rpc error: code = 
NotFound desc = could not find container \"1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff\": container with ID starting with 1f005788c749f39e3ce2670a152d8647553e15ecd0c0e8b29c0ea430a653abff not found: ID does not exist" Apr 24 21:38:34.760723 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.760682 2573 scope.go:117] "RemoveContainer" containerID="c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a" Apr 24 21:38:34.761060 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:38:34.761034 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a\": container with ID starting with c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a not found: ID does not exist" containerID="c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a" Apr 24 21:38:34.761116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.761066 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a"} err="failed to get container status \"c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a\": rpc error: code = NotFound desc = could not find container \"c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a\": container with ID starting with c748c461b16781788184ca94793595d4f6d375e20081abf505c955492446e19a not found: ID does not exist" Apr 24 21:38:34.761116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.761082 2573 scope.go:117] "RemoveContainer" containerID="5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30" Apr 24 21:38:34.768833 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.768814 2573 scope.go:117] "RemoveContainer" containerID="82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05" Apr 24 21:38:34.769769 ip-10-0-136-201 kubenswrapper[2573]: I0424 
21:38:34.769752 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292"] Apr 24 21:38:34.773055 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.773035 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292"] Apr 24 21:38:34.776470 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.776455 2573 scope.go:117] "RemoveContainer" containerID="5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30" Apr 24 21:38:34.776698 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:38:34.776681 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30\": container with ID starting with 5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30 not found: ID does not exist" containerID="5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30" Apr 24 21:38:34.776745 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.776703 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30"} err="failed to get container status \"5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30\": rpc error: code = NotFound desc = could not find container \"5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30\": container with ID starting with 5ec0633056e122d2d9520179265c4948c1d11e1f45ca6c9eba5b060a587a9b30 not found: ID does not exist" Apr 24 21:38:34.776745 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.776720 2573 scope.go:117] "RemoveContainer" containerID="82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05" Apr 24 21:38:34.776960 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:38:34.776939 2573 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05\": container with ID starting with 82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05 not found: ID does not exist" containerID="82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05" Apr 24 21:38:34.777013 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.776972 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05"} err="failed to get container status \"82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05\": rpc error: code = NotFound desc = could not find container \"82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05\": container with ID starting with 82dc6747624ce52e95e8f8bf4716115335d19b36480427b0faa579d8868e5a05 not found: ID does not exist" Apr 24 21:38:34.782588 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.782567 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz"] Apr 24 21:38:34.787763 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.787739 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz"] Apr 24 21:38:34.797484 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.797464 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-914c2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-success-200-isvc-914c2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:38:34.797571 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.797487 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6zdnp\" (UniqueName: 
\"kubernetes.io/projected/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-kube-api-access-6zdnp\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:38:34.797571 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:34.797502 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:38:35.368768 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:35.368737 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" path="/var/lib/kubelet/pods/49d0a4d9-3576-4587-ab0b-e2df7f7b45c8/volumes" Apr 24 21:38:35.369168 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:35.369154 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" path="/var/lib/kubelet/pods/9476e4c7-3607-4bc3-90dd-ed599afc66d8/volumes" Apr 24 21:38:38.742079 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:38.742048 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" Apr 24 21:38:38.742450 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:38.742255 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" Apr 24 21:38:38.742655 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:38.742632 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:38:38.742729 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:38.742626 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 21:38:48.743188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:48.743100 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:38:48.743551 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:48.743101 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 21:38:58.743526 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:58.743479 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 21:38:58.743931 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:38:58.743479 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:39:08.743243 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:08.743196 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" 
podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:39:08.743734 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:08.743200 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 21:39:11.146095 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146057 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"] Apr 24 21:39:11.146652 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146636 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kserve-container" Apr 24 21:39:11.146699 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146656 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kserve-container" Apr 24 21:39:11.146699 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146683 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kube-rbac-proxy" Apr 24 21:39:11.146699 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146691 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kube-rbac-proxy" Apr 24 21:39:11.146789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146706 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kserve-container" Apr 24 21:39:11.146789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146715 2573 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kserve-container" Apr 24 21:39:11.146789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146733 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kube-rbac-proxy" Apr 24 21:39:11.146789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146741 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kube-rbac-proxy" Apr 24 21:39:11.146904 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146837 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kube-rbac-proxy" Apr 24 21:39:11.146904 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146852 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kserve-container" Apr 24 21:39:11.146904 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146866 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9476e4c7-3607-4bc3-90dd-ed599afc66d8" containerName="kserve-container" Apr 24 21:39:11.146904 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.146877 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="49d0a4d9-3576-4587-ab0b-e2df7f7b45c8" containerName="kube-rbac-proxy" Apr 24 21:39:11.151876 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.151853 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" Apr 24 21:39:11.154237 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.154210 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-7fc99-kube-rbac-proxy-sar-config\"" Apr 24 21:39:11.154368 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.154210 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-7fc99-predictor-serving-cert\"" Apr 24 21:39:11.157938 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.157899 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"] Apr 24 21:39:11.254954 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.254908 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"] Apr 24 21:39:11.258602 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.258577 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:11.261127 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.261097 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-7fc99-predictor-serving-cert\""
Apr 24 21:39:11.261127 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.261117 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-7fc99-kube-rbac-proxy-sar-config\""
Apr 24 21:39:11.267342 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.267317 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"]
Apr 24 21:39:11.312757 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.312723 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a382bf3-3608-40fd-ab84-8604e941a8c3-proxy-tls\") pod \"success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6\" (UID: \"6a382bf3-3608-40fd-ab84-8604e941a8c3\") " pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:11.312936 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.312779 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-7fc99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a382bf3-3608-40fd-ab84-8604e941a8c3-success-200-isvc-7fc99-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6\" (UID: \"6a382bf3-3608-40fd-ab84-8604e941a8c3\") " pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:11.312936 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.312804 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zlc4\" (UniqueName: \"kubernetes.io/projected/6a382bf3-3608-40fd-ab84-8604e941a8c3-kube-api-access-4zlc4\") pod \"success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6\" (UID: \"6a382bf3-3608-40fd-ab84-8604e941a8c3\") " pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:11.413512 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.413433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-7fc99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a382bf3-3608-40fd-ab84-8604e941a8c3-success-200-isvc-7fc99-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6\" (UID: \"6a382bf3-3608-40fd-ab84-8604e941a8c3\") " pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:11.413512 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.413478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zlc4\" (UniqueName: \"kubernetes.io/projected/6a382bf3-3608-40fd-ab84-8604e941a8c3-kube-api-access-4zlc4\") pod \"success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6\" (UID: \"6a382bf3-3608-40fd-ab84-8604e941a8c3\") " pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:11.413731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.413535 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-7fc99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-error-404-isvc-7fc99-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") " pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:11.413731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.413579 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2cws\" (UniqueName: \"kubernetes.io/projected/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-kube-api-access-g2cws\") pod \"error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") " pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:11.413731 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.413691 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a382bf3-3608-40fd-ab84-8604e941a8c3-proxy-tls\") pod \"success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6\" (UID: \"6a382bf3-3608-40fd-ab84-8604e941a8c3\") " pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:11.413871 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.413759 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-proxy-tls\") pod \"error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") " pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:11.414141 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.414116 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-7fc99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a382bf3-3608-40fd-ab84-8604e941a8c3-success-200-isvc-7fc99-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6\" (UID: \"6a382bf3-3608-40fd-ab84-8604e941a8c3\") " pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:11.416282 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.416264 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a382bf3-3608-40fd-ab84-8604e941a8c3-proxy-tls\") pod \"success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6\" (UID: \"6a382bf3-3608-40fd-ab84-8604e941a8c3\") " pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:11.422015 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.421995 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zlc4\" (UniqueName: \"kubernetes.io/projected/6a382bf3-3608-40fd-ab84-8604e941a8c3-kube-api-access-4zlc4\") pod \"success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6\" (UID: \"6a382bf3-3608-40fd-ab84-8604e941a8c3\") " pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:11.464996 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.464966 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:11.515003 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.514964 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-proxy-tls\") pod \"error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") " pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:11.515148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.515032 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-7fc99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-error-404-isvc-7fc99-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") " pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:11.515148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.515060 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2cws\" (UniqueName: \"kubernetes.io/projected/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-kube-api-access-g2cws\") pod \"error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") " pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:11.515403 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:39:11.515386 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-serving-cert: secret "error-404-isvc-7fc99-predictor-serving-cert" not found
Apr 24 21:39:11.515466 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:39:11.515449 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-proxy-tls podName:aa940d38-3c6c-4e99-8e9b-95aa2b971edb nodeName:}" failed. No retries permitted until 2026-04-24 21:39:12.015433022 +0000 UTC m=+737.295280780 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-proxy-tls") pod "error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" (UID: "aa940d38-3c6c-4e99-8e9b-95aa2b971edb") : secret "error-404-isvc-7fc99-predictor-serving-cert" not found
Apr 24 21:39:11.516260 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.516233 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-7fc99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-error-404-isvc-7fc99-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") " pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:11.524788 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.524738 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2cws\" (UniqueName: \"kubernetes.io/projected/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-kube-api-access-g2cws\") pod \"error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") " pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:11.591881 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.591852 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"]
Apr 24 21:39:11.593606 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:39:11.593575 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a382bf3_3608_40fd_ab84_8604e941a8c3.slice/crio-7ed1a9d6f7e537dce689669e60d2debb9f161945e1515fe6b011fb470898d6a7 WatchSource:0}: Error finding container 7ed1a9d6f7e537dce689669e60d2debb9f161945e1515fe6b011fb470898d6a7: Status 404 returned error can't find the container with id 7ed1a9d6f7e537dce689669e60d2debb9f161945e1515fe6b011fb470898d6a7
Apr 24 21:39:11.875499 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.875461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" event={"ID":"6a382bf3-3608-40fd-ab84-8604e941a8c3","Type":"ContainerStarted","Data":"950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986"}
Apr 24 21:39:11.875499 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.875504 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" event={"ID":"6a382bf3-3608-40fd-ab84-8604e941a8c3","Type":"ContainerStarted","Data":"4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80"}
Apr 24 21:39:11.875742 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.875517 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" event={"ID":"6a382bf3-3608-40fd-ab84-8604e941a8c3","Type":"ContainerStarted","Data":"7ed1a9d6f7e537dce689669e60d2debb9f161945e1515fe6b011fb470898d6a7"}
Apr 24 21:39:11.875742 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.875634 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:11.875836 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.875766 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:11.877070 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.877047 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 24 21:39:11.924645 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:11.924547 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" podStartSLOduration=0.924531674 podStartE2EDuration="924.531674ms" podCreationTimestamp="2026-04-24 21:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:39:11.92282334 +0000 UTC m=+737.202671133" watchObservedRunningTime="2026-04-24 21:39:11.924531674 +0000 UTC m=+737.204379453"
Apr 24 21:39:12.019430 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:12.019389 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-proxy-tls\") pod \"error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") " pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:12.021805 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:12.021784 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-proxy-tls\") pod \"error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") " pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:12.171421 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:12.171379 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:12.298548 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:12.298520 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"]
Apr 24 21:39:12.300230 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:39:12.300197 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa940d38_3c6c_4e99_8e9b_95aa2b971edb.slice/crio-11e6e0c24da90a2e004817d3e574b0d02295ba9d99a8a740ad013ee028aa643f WatchSource:0}: Error finding container 11e6e0c24da90a2e004817d3e574b0d02295ba9d99a8a740ad013ee028aa643f: Status 404 returned error can't find the container with id 11e6e0c24da90a2e004817d3e574b0d02295ba9d99a8a740ad013ee028aa643f
Apr 24 21:39:12.881453 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:12.881407 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" event={"ID":"aa940d38-3c6c-4e99-8e9b-95aa2b971edb","Type":"ContainerStarted","Data":"7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d"}
Apr 24 21:39:12.881453 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:12.881460 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" event={"ID":"aa940d38-3c6c-4e99-8e9b-95aa2b971edb","Type":"ContainerStarted","Data":"b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d"}
Apr 24 21:39:12.881710 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:12.881475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" event={"ID":"aa940d38-3c6c-4e99-8e9b-95aa2b971edb","Type":"ContainerStarted","Data":"11e6e0c24da90a2e004817d3e574b0d02295ba9d99a8a740ad013ee028aa643f"}
Apr 24 21:39:12.881710 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:12.881557 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:12.882007 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:12.881974 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 24 21:39:12.900216 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:12.900166 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" podStartSLOduration=1.900151871 podStartE2EDuration="1.900151871s" podCreationTimestamp="2026-04-24 21:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:39:12.899505738 +0000 UTC m=+738.179353518" watchObservedRunningTime="2026-04-24 21:39:12.900151871 +0000 UTC m=+738.179999650"
Apr 24 21:39:13.886207 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:13.886173 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:13.888012 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:13.887974 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 21:39:14.889787 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:14.889749 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 21:39:17.887908 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:17.887875 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:17.888447 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:17.888419 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 24 21:39:18.743951 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:18.743902 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"
Apr 24 21:39:18.744163 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:18.744145 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"
Apr 24 21:39:19.894117 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:19.894087 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:39:19.894614 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:19.894590 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 21:39:27.888551 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:27.888501 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 24 21:39:29.894642 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:29.894601 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 21:39:37.889129 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:37.889071 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 24 21:39:39.895043 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:39.895003 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 21:39:47.888859 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:47.888795 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 24 21:39:49.895138 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:49.895091 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 21:39:57.889105 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:57.889073 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:39:59.895503 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:39:59.895467 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:41:55.294984 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:41:55.294874 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 21:41:55.295522 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:41:55.295398 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 21:46:55.319904 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:55.319877 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 21:46:55.322013 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:55.321991 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 21:47:46.089109 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.089031 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"]
Apr 24 21:47:46.090106 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.090048 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kserve-container" containerID="cri-o://9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46" gracePeriod=30
Apr 24 21:47:46.090262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.090087 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kube-rbac-proxy" containerID="cri-o://96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db" gracePeriod=30
Apr 24 21:47:46.156248 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.156211 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"]
Apr 24 21:47:46.156519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.156493 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kserve-container" containerID="cri-o://acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19" gracePeriod=30
Apr 24 21:47:46.156591 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.156537 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kube-rbac-proxy" containerID="cri-o://a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d" gracePeriod=30
Apr 24 21:47:46.167729 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.167702 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"]
Apr 24 21:47:46.170488 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.170470 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"
Apr 24 21:47:46.172561 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.172540 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-2525b-predictor-serving-cert\""
Apr 24 21:47:46.172658 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.172621 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-2525b-kube-rbac-proxy-sar-config\""
Apr 24 21:47:46.191148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.191111 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"]
Apr 24 21:47:46.215823 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.215791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwvvg\" (UniqueName: \"kubernetes.io/projected/550706c4-48b4-456b-bd40-364ec1b3d86d-kube-api-access-kwvvg\") pod \"success-200-isvc-2525b-predictor-67d9995cb7-lcmhq\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"
Apr 24 21:47:46.215977 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.215900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/550706c4-48b4-456b-bd40-364ec1b3d86d-proxy-tls\") pod \"success-200-isvc-2525b-predictor-67d9995cb7-lcmhq\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"
Apr 24 21:47:46.216165 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.216020 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-2525b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/550706c4-48b4-456b-bd40-364ec1b3d86d-success-200-isvc-2525b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2525b-predictor-67d9995cb7-lcmhq\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"
Apr 24 21:47:46.243044 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.243017 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"]
Apr 24 21:47:46.245563 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.245548 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"
Apr 24 21:47:46.247747 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.247716 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-2525b-predictor-serving-cert\""
Apr 24 21:47:46.247971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.247949 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-2525b-kube-rbac-proxy-sar-config\""
Apr 24 21:47:46.257182 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.257157 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"]
Apr 24 21:47:46.317524 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.317492 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-2525b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-error-404-isvc-2525b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2525b-predictor-76f74d576d-kh277\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"
Apr 24 21:47:46.317674 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.317548 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwvvg\" (UniqueName: \"kubernetes.io/projected/550706c4-48b4-456b-bd40-364ec1b3d86d-kube-api-access-kwvvg\") pod \"success-200-isvc-2525b-predictor-67d9995cb7-lcmhq\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"
Apr 24 21:47:46.317674 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.317588 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkml8\" (UniqueName: \"kubernetes.io/projected/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-kube-api-access-hkml8\") pod \"error-404-isvc-2525b-predictor-76f74d576d-kh277\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"
Apr 24 21:47:46.317674 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.317611 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/550706c4-48b4-456b-bd40-364ec1b3d86d-proxy-tls\") pod \"success-200-isvc-2525b-predictor-67d9995cb7-lcmhq\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"
Apr 24 21:47:46.317838 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.317673 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-2525b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/550706c4-48b4-456b-bd40-364ec1b3d86d-success-200-isvc-2525b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2525b-predictor-67d9995cb7-lcmhq\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"
Apr 24 21:47:46.317838 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.317714 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-proxy-tls\") pod \"error-404-isvc-2525b-predictor-76f74d576d-kh277\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"
Apr 24 21:47:46.317838 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:46.317773 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-2525b-predictor-serving-cert: secret "success-200-isvc-2525b-predictor-serving-cert" not found
Apr 24 21:47:46.317838 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:46.317827 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/550706c4-48b4-456b-bd40-364ec1b3d86d-proxy-tls podName:550706c4-48b4-456b-bd40-364ec1b3d86d nodeName:}" failed. No retries permitted until 2026-04-24 21:47:46.817807791 +0000 UTC m=+1252.097655551 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/550706c4-48b4-456b-bd40-364ec1b3d86d-proxy-tls") pod "success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" (UID: "550706c4-48b4-456b-bd40-364ec1b3d86d") : secret "success-200-isvc-2525b-predictor-serving-cert" not found
Apr 24 21:47:46.318375 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.318350 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-2525b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/550706c4-48b4-456b-bd40-364ec1b3d86d-success-200-isvc-2525b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2525b-predictor-67d9995cb7-lcmhq\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"
Apr 24 21:47:46.326555 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.326531 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwvvg\" (UniqueName: \"kubernetes.io/projected/550706c4-48b4-456b-bd40-364ec1b3d86d-kube-api-access-kwvvg\") pod \"success-200-isvc-2525b-predictor-67d9995cb7-lcmhq\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"
Apr 24 21:47:46.419011 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.418905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-2525b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-error-404-isvc-2525b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2525b-predictor-76f74d576d-kh277\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"
Apr 24 21:47:46.419011 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.419001 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkml8\" (UniqueName: \"kubernetes.io/projected/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-kube-api-access-hkml8\") pod \"error-404-isvc-2525b-predictor-76f74d576d-kh277\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"
Apr 24 21:47:46.419250 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.419059 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-proxy-tls\") pod \"error-404-isvc-2525b-predictor-76f74d576d-kh277\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"
Apr 24 21:47:46.419250 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:46.419174 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-2525b-predictor-serving-cert: secret "error-404-isvc-2525b-predictor-serving-cert" not found
Apr 24 21:47:46.419250 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:46.419231 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-proxy-tls podName:1dfe4c57-e300-4bce-88f5-2f3d90e17cc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:47:46.919212668 +0000 UTC m=+1252.199060429 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-proxy-tls") pod "error-404-isvc-2525b-predictor-76f74d576d-kh277" (UID: "1dfe4c57-e300-4bce-88f5-2f3d90e17cc0") : secret "error-404-isvc-2525b-predictor-serving-cert" not found
Apr 24 21:47:46.419670 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.419649 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-2525b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-error-404-isvc-2525b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2525b-predictor-76f74d576d-kh277\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"
Apr 24 21:47:46.427796 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.427775 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkml8\" (UniqueName: \"kubernetes.io/projected/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-kube-api-access-hkml8\") pod \"error-404-isvc-2525b-predictor-76f74d576d-kh277\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"
Apr 24 21:47:46.642674 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.642638 2573 generic.go:358] "Generic (PLEG): container finished" podID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerID="a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d" exitCode=2
Apr 24 21:47:46.642674 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.642671 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" event={"ID":"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d","Type":"ContainerDied","Data":"a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d"}
Apr 24 21:47:46.644220 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.644200 2573 
generic.go:358] "Generic (PLEG): container finished" podID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerID="96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db" exitCode=2 Apr 24 21:47:46.644317 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.644236 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" event={"ID":"c57b7cd6-ea5d-4873-860c-1506f1b5eab9","Type":"ContainerDied","Data":"96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db"} Apr 24 21:47:46.822866 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.822830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/550706c4-48b4-456b-bd40-364ec1b3d86d-proxy-tls\") pod \"success-200-isvc-2525b-predictor-67d9995cb7-lcmhq\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" Apr 24 21:47:46.825339 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.825309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/550706c4-48b4-456b-bd40-364ec1b3d86d-proxy-tls\") pod \"success-200-isvc-2525b-predictor-67d9995cb7-lcmhq\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" Apr 24 21:47:46.923756 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.923721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-proxy-tls\") pod \"error-404-isvc-2525b-predictor-76f74d576d-kh277\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" Apr 24 21:47:46.926111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:46.926087 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-proxy-tls\") pod \"error-404-isvc-2525b-predictor-76f74d576d-kh277\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" Apr 24 21:47:47.081108 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.081013 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" Apr 24 21:47:47.159701 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.159663 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" Apr 24 21:47:47.214597 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.214511 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"] Apr 24 21:47:47.222306 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.221902 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:47:47.298762 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.298612 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"] Apr 24 21:47:47.300802 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:47:47.300774 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dfe4c57_e300_4bce_88f5_2f3d90e17cc0.slice/crio-bc07db888951388d49a4ec1bd7354cec888ef8d299015f6da7777584933d5d8c WatchSource:0}: Error finding container bc07db888951388d49a4ec1bd7354cec888ef8d299015f6da7777584933d5d8c: Status 404 returned error can't find the container with id bc07db888951388d49a4ec1bd7354cec888ef8d299015f6da7777584933d5d8c Apr 24 21:47:47.650586 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:47:47.650498 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" event={"ID":"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0","Type":"ContainerStarted","Data":"d7182e538620ee83c0d735dc9e1c27b12429a38b861d053e32369b731cddf2bf"} Apr 24 21:47:47.650586 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.650538 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" event={"ID":"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0","Type":"ContainerStarted","Data":"5ad4556e3e9d3f68efb89cc32b57f8fce569a523efdc621ba3103a2dbf9bad02"} Apr 24 21:47:47.650586 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.650550 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" event={"ID":"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0","Type":"ContainerStarted","Data":"bc07db888951388d49a4ec1bd7354cec888ef8d299015f6da7777584933d5d8c"} Apr 24 21:47:47.650930 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.650617 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" Apr 24 21:47:47.652054 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.652031 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" event={"ID":"550706c4-48b4-456b-bd40-364ec1b3d86d","Type":"ContainerStarted","Data":"cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f"} Apr 24 21:47:47.652153 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.652056 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" 
event={"ID":"550706c4-48b4-456b-bd40-364ec1b3d86d","Type":"ContainerStarted","Data":"799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c"} Apr 24 21:47:47.652153 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.652070 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" event={"ID":"550706c4-48b4-456b-bd40-364ec1b3d86d","Type":"ContainerStarted","Data":"bb0f4f00b5dc15e40b2bb57047183349ce13473e717362a8df81af9763c552c3"} Apr 24 21:47:47.652220 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.652153 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" Apr 24 21:47:47.671067 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.671012 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" podStartSLOduration=1.670994233 podStartE2EDuration="1.670994233s" podCreationTimestamp="2026-04-24 21:47:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:47.669133185 +0000 UTC m=+1252.948980963" watchObservedRunningTime="2026-04-24 21:47:47.670994233 +0000 UTC m=+1252.950842017" Apr 24 21:47:47.688209 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:47.688147 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" podStartSLOduration=1.688126895 podStartE2EDuration="1.688126895s" podCreationTimestamp="2026-04-24 21:47:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:47.68597734 +0000 UTC m=+1252.965825125" watchObservedRunningTime="2026-04-24 21:47:47.688126895 +0000 UTC m=+1252.967974678" Apr 24 
21:47:48.656179 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:48.656138 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" Apr 24 21:47:48.656592 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:48.656194 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" Apr 24 21:47:48.657171 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:48.657141 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 21:47:48.657278 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:48.657194 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 21:47:48.737902 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:48.737859 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": dial tcp 10.134.0.38:8643: connect: connection refused" Apr 24 21:47:48.738097 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:48.737858 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": 
dial tcp 10.134.0.39:8643: connect: connection refused" Apr 24 21:47:48.743201 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:48.743174 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 21:47:48.743321 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:48.743173 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:47:49.599056 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.599030 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" Apr 24 21:47:49.649456 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.649377 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-c12fa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-error-404-isvc-c12fa-kube-rbac-proxy-sar-config\") pod \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " Apr 24 21:47:49.649456 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.649430 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-proxy-tls\") pod \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " Apr 24 21:47:49.649736 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.649459 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-z8vlf\" (UniqueName: \"kubernetes.io/projected/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-kube-api-access-z8vlf\") pod \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\" (UID: \"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d\") " Apr 24 21:47:49.649802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.649740 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-error-404-isvc-c12fa-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-c12fa-kube-rbac-proxy-sar-config") pod "e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" (UID: "e6b72a15-e99f-4e4c-98c3-6e87f0b7458d"). InnerVolumeSpecName "error-404-isvc-c12fa-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:47:49.651537 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.651514 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" (UID: "e6b72a15-e99f-4e4c-98c3-6e87f0b7458d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:47:49.651626 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.651586 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-kube-api-access-z8vlf" (OuterVolumeSpecName: "kube-api-access-z8vlf") pod "e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" (UID: "e6b72a15-e99f-4e4c-98c3-6e87f0b7458d"). InnerVolumeSpecName "kube-api-access-z8vlf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:47:49.661524 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.661501 2573 generic.go:358] "Generic (PLEG): container finished" podID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerID="acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19" exitCode=0 Apr 24 21:47:49.661848 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.661577 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" Apr 24 21:47:49.661848 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.661598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" event={"ID":"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d","Type":"ContainerDied","Data":"acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19"} Apr 24 21:47:49.661848 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.661638 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h" event={"ID":"e6b72a15-e99f-4e4c-98c3-6e87f0b7458d","Type":"ContainerDied","Data":"e605dc96ad28ac8f0883badc848b9587b897e71cc2ed823682e7b9b72f117d5b"} Apr 24 21:47:49.661848 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.661662 2573 scope.go:117] "RemoveContainer" containerID="a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d" Apr 24 21:47:49.662051 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.661974 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 21:47:49.662478 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.662453 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 21:47:49.670911 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.670896 2573 scope.go:117] "RemoveContainer" containerID="acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19" Apr 24 21:47:49.678554 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.678535 2573 scope.go:117] "RemoveContainer" containerID="a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d" Apr 24 21:47:49.678792 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:49.678773 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d\": container with ID starting with a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d not found: ID does not exist" containerID="a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d" Apr 24 21:47:49.678844 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.678799 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d"} err="failed to get container status \"a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d\": rpc error: code = NotFound desc = could not find container \"a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d\": container with ID starting with a27b7335bf14970ff5d1cbd1ebb4398eb6599edbcd3498b58d4e7879b0ddd51d not found: ID does not exist" Apr 24 21:47:49.678844 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.678814 2573 scope.go:117] "RemoveContainer" containerID="acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19" Apr 24 21:47:49.679033 ip-10-0-136-201 kubenswrapper[2573]: E0424 
21:47:49.679015 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19\": container with ID starting with acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19 not found: ID does not exist" containerID="acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19" Apr 24 21:47:49.679103 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.679039 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19"} err="failed to get container status \"acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19\": rpc error: code = NotFound desc = could not find container \"acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19\": container with ID starting with acc32e511d0b7a89aa977486f51b833f3a3db5c4bb0b407115a559405bf65c19 not found: ID does not exist" Apr 24 21:47:49.685062 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.685038 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"] Apr 24 21:47:49.690440 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.690419 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h"] Apr 24 21:47:49.751218 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.751188 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-c12fa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-error-404-isvc-c12fa-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:47:49.751218 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.751215 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:47:49.751218 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.751226 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z8vlf\" (UniqueName: \"kubernetes.io/projected/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d-kube-api-access-z8vlf\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:47:49.917424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:49.917401 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" Apr 24 21:47:50.053370 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.053326 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4h57\" (UniqueName: \"kubernetes.io/projected/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-kube-api-access-w4h57\") pod \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " Apr 24 21:47:50.053559 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.053396 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-c12fa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-success-200-isvc-c12fa-kube-rbac-proxy-sar-config\") pod \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " Apr 24 21:47:50.053559 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.053537 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-proxy-tls\") pod \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\" (UID: \"c57b7cd6-ea5d-4873-860c-1506f1b5eab9\") " Apr 24 21:47:50.053769 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.053745 2573 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-success-200-isvc-c12fa-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-c12fa-kube-rbac-proxy-sar-config") pod "c57b7cd6-ea5d-4873-860c-1506f1b5eab9" (UID: "c57b7cd6-ea5d-4873-860c-1506f1b5eab9"). InnerVolumeSpecName "success-200-isvc-c12fa-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:47:50.055414 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.055391 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-kube-api-access-w4h57" (OuterVolumeSpecName: "kube-api-access-w4h57") pod "c57b7cd6-ea5d-4873-860c-1506f1b5eab9" (UID: "c57b7cd6-ea5d-4873-860c-1506f1b5eab9"). InnerVolumeSpecName "kube-api-access-w4h57". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:47:50.055414 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.055408 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c57b7cd6-ea5d-4873-860c-1506f1b5eab9" (UID: "c57b7cd6-ea5d-4873-860c-1506f1b5eab9"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:47:50.154254 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.154205 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-c12fa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-success-200-isvc-c12fa-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:47:50.154254 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.154246 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:47:50.154254 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.154261 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w4h57\" (UniqueName: \"kubernetes.io/projected/c57b7cd6-ea5d-4873-860c-1506f1b5eab9-kube-api-access-w4h57\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:47:50.670255 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.670173 2573 generic.go:358] "Generic (PLEG): container finished" podID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerID="9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46" exitCode=0 Apr 24 21:47:50.670255 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.670240 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" Apr 24 21:47:50.670255 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.670242 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" event={"ID":"c57b7cd6-ea5d-4873-860c-1506f1b5eab9","Type":"ContainerDied","Data":"9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46"} Apr 24 21:47:50.670774 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.670271 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb" event={"ID":"c57b7cd6-ea5d-4873-860c-1506f1b5eab9","Type":"ContainerDied","Data":"64c8bd8db611663dd528c3b7018e38710929088b4b37cd07b489575568f113f1"} Apr 24 21:47:50.670774 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.670289 2573 scope.go:117] "RemoveContainer" containerID="96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db" Apr 24 21:47:50.678817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.678800 2573 scope.go:117] "RemoveContainer" containerID="9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46" Apr 24 21:47:50.688409 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.688328 2573 scope.go:117] "RemoveContainer" containerID="96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db" Apr 24 21:47:50.688973 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:50.688947 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db\": container with ID starting with 96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db not found: ID does not exist" containerID="96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db" Apr 24 21:47:50.689084 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.688985 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db"} err="failed to get container status \"96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db\": rpc error: code = NotFound desc = could not find container \"96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db\": container with ID starting with 96de5b94a40770b832708d8e4d76bfa10ba65cb4fc3499681296e572067599db not found: ID does not exist"
Apr 24 21:47:50.689084 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.689010 2573 scope.go:117] "RemoveContainer" containerID="9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46"
Apr 24 21:47:50.689464 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:50.689418 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46\": container with ID starting with 9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46 not found: ID does not exist" containerID="9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46"
Apr 24 21:47:50.689464 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.689453 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46"} err="failed to get container status \"9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46\": rpc error: code = NotFound desc = could not find container \"9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46\": container with ID starting with 9fd2ecc7b452048733254cd2de3064f290f519d0a9933e8e862dbdab00fb8d46 not found: ID does not exist"
Apr 24 21:47:50.691360 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.691339 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"]
Apr 24 21:47:50.694697 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:50.694675 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb"]
Apr 24 21:47:51.368529 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:51.368493 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" path="/var/lib/kubelet/pods/c57b7cd6-ea5d-4873-860c-1506f1b5eab9/volumes"
Apr 24 21:47:51.368955 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:51.368941 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" path="/var/lib/kubelet/pods/e6b72a15-e99f-4e4c-98c3-6e87f0b7458d/volumes"
Apr 24 21:47:54.667029 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:54.666996 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"
Apr 24 21:47:54.667425 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:54.667259 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"
Apr 24 21:47:54.667512 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:54.667487 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 24 21:47:54.667953 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:54.667905 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 24 21:48:04.667531 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:04.667494 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 24 21:48:04.667947 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:04.667887 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 24 21:48:14.667533 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:14.667486 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 24 21:48:14.667962 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:14.667937 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 24 21:48:24.667516 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:24.667477 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 24 21:48:24.668013 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:24.667878 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 24 21:48:25.974337 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:25.974302 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"]
Apr 24 21:48:25.974787 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:25.974597 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kserve-container" containerID="cri-o://4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80" gracePeriod=30
Apr 24 21:48:25.974787 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:25.974637 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kube-rbac-proxy" containerID="cri-o://950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986" gracePeriod=30
Apr 24 21:48:26.017788 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.017758 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"]
Apr 24 21:48:26.018198 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.018183 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kube-rbac-proxy"
Apr 24 21:48:26.018252 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.018201 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kube-rbac-proxy"
Apr 24 21:48:26.018252 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.018219 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kserve-container"
Apr 24 21:48:26.018252 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.018224 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kserve-container"
Apr 24 21:48:26.018252 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.018233 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kube-rbac-proxy"
Apr 24 21:48:26.018252 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.018239 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kube-rbac-proxy"
Apr 24 21:48:26.018252 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.018250 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kserve-container"
Apr 24 21:48:26.018432 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.018256 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kserve-container"
Apr 24 21:48:26.018432 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.018352 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kube-rbac-proxy"
Apr 24 21:48:26.018432 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.018365 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kserve-container"
Apr 24 21:48:26.018432 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.018373 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c57b7cd6-ea5d-4873-860c-1506f1b5eab9" containerName="kserve-container"
Apr 24 21:48:26.018432 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.018380 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6b72a15-e99f-4e4c-98c3-6e87f0b7458d" containerName="kube-rbac-proxy"
Apr 24 21:48:26.021668 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.021652 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:26.024017 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.023980 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-b9878-predictor-serving-cert\""
Apr 24 21:48:26.024151 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.024030 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-b9878-kube-rbac-proxy-sar-config\""
Apr 24 21:48:26.030425 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.030331 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"]
Apr 24 21:48:26.056979 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.056950 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"]
Apr 24 21:48:26.057248 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.057208 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kserve-container" containerID="cri-o://b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d" gracePeriod=30
Apr 24 21:48:26.057390 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.057251 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kube-rbac-proxy" containerID="cri-o://7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d" gracePeriod=30
Apr 24 21:48:26.063651 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.063628 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8gwk\" (UniqueName: \"kubernetes.io/projected/95c6330c-680a-4894-ab45-4566cb30ef16-kube-api-access-w8gwk\") pod \"success-200-isvc-b9878-predictor-fb4f998b7-crfr7\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:26.063758 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.063691 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95c6330c-680a-4894-ab45-4566cb30ef16-proxy-tls\") pod \"success-200-isvc-b9878-predictor-fb4f998b7-crfr7\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:26.063815 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.063760 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-b9878-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95c6330c-680a-4894-ab45-4566cb30ef16-success-200-isvc-b9878-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b9878-predictor-fb4f998b7-crfr7\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:26.129447 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.126252 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"]
Apr 24 21:48:26.134902 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.134078 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:26.138033 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.137362 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"]
Apr 24 21:48:26.138327 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.138303 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-b9878-predictor-serving-cert\""
Apr 24 21:48:26.138425 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.138325 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-b9878-kube-rbac-proxy-sar-config\""
Apr 24 21:48:26.165150 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.165118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-b9878-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95c6330c-680a-4894-ab45-4566cb30ef16-success-200-isvc-b9878-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b9878-predictor-fb4f998b7-crfr7\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:26.165278 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.165183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-b9878-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-error-404-isvc-b9878-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b9878-predictor-65869744dd-gsdxb\" (UID: \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\") " pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:26.165278 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.165232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8gwk\" (UniqueName: \"kubernetes.io/projected/95c6330c-680a-4894-ab45-4566cb30ef16-kube-api-access-w8gwk\") pod \"success-200-isvc-b9878-predictor-fb4f998b7-crfr7\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:26.165427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.165307 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-proxy-tls\") pod \"error-404-isvc-b9878-predictor-65869744dd-gsdxb\" (UID: \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\") " pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:26.165427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.165373 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95c6330c-680a-4894-ab45-4566cb30ef16-proxy-tls\") pod \"success-200-isvc-b9878-predictor-fb4f998b7-crfr7\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:26.165427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.165405 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bcxw\" (UniqueName: \"kubernetes.io/projected/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-kube-api-access-4bcxw\") pod \"error-404-isvc-b9878-predictor-65869744dd-gsdxb\" (UID: \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\") " pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:26.165589 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:48:26.165505 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-b9878-predictor-serving-cert: secret "success-200-isvc-b9878-predictor-serving-cert" not found
Apr 24 21:48:26.165589 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:48:26.165578 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95c6330c-680a-4894-ab45-4566cb30ef16-proxy-tls podName:95c6330c-680a-4894-ab45-4566cb30ef16 nodeName:}" failed. No retries permitted until 2026-04-24 21:48:26.66555788 +0000 UTC m=+1291.945405645 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/95c6330c-680a-4894-ab45-4566cb30ef16-proxy-tls") pod "success-200-isvc-b9878-predictor-fb4f998b7-crfr7" (UID: "95c6330c-680a-4894-ab45-4566cb30ef16") : secret "success-200-isvc-b9878-predictor-serving-cert" not found
Apr 24 21:48:26.165741 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.165716 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-b9878-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95c6330c-680a-4894-ab45-4566cb30ef16-success-200-isvc-b9878-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b9878-predictor-fb4f998b7-crfr7\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:26.175814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.175795 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8gwk\" (UniqueName: \"kubernetes.io/projected/95c6330c-680a-4894-ab45-4566cb30ef16-kube-api-access-w8gwk\") pod \"success-200-isvc-b9878-predictor-fb4f998b7-crfr7\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:26.267046 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.267008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bcxw\" (UniqueName: \"kubernetes.io/projected/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-kube-api-access-4bcxw\") pod \"error-404-isvc-b9878-predictor-65869744dd-gsdxb\" (UID: \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\") " pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:26.267241 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.267104 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-b9878-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-error-404-isvc-b9878-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b9878-predictor-65869744dd-gsdxb\" (UID: \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\") " pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:26.267241 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.267178 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-proxy-tls\") pod \"error-404-isvc-b9878-predictor-65869744dd-gsdxb\" (UID: \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\") " pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:26.267727 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.267704 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-b9878-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-error-404-isvc-b9878-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b9878-predictor-65869744dd-gsdxb\" (UID: \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\") " pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:26.269666 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.269642 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-proxy-tls\") pod \"error-404-isvc-b9878-predictor-65869744dd-gsdxb\" (UID: \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\") " pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:26.275242 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.275217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bcxw\" (UniqueName: \"kubernetes.io/projected/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-kube-api-access-4bcxw\") pod \"error-404-isvc-b9878-predictor-65869744dd-gsdxb\" (UID: \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\") " pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:26.450793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.450755 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:26.585975 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.585941 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"]
Apr 24 21:48:26.589535 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:48:26.589504 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a2cafa0_f418_40ed_a22a_f8abd1a4016b.slice/crio-82be8e1f0a28d5fa13a5caf677ddd998a1408fb9c0ea0732df39a2c97acfdacd WatchSource:0}: Error finding container 82be8e1f0a28d5fa13a5caf677ddd998a1408fb9c0ea0732df39a2c97acfdacd: Status 404 returned error can't find the container with id 82be8e1f0a28d5fa13a5caf677ddd998a1408fb9c0ea0732df39a2c97acfdacd
Apr 24 21:48:26.670477 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.670453 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95c6330c-680a-4894-ab45-4566cb30ef16-proxy-tls\") pod \"success-200-isvc-b9878-predictor-fb4f998b7-crfr7\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:26.672613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.672590 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95c6330c-680a-4894-ab45-4566cb30ef16-proxy-tls\") pod \"success-200-isvc-b9878-predictor-fb4f998b7-crfr7\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:26.803571 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.803462 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" event={"ID":"9a2cafa0-f418-40ed-a22a-f8abd1a4016b","Type":"ContainerStarted","Data":"d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5"}
Apr 24 21:48:26.803571 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.803510 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" event={"ID":"9a2cafa0-f418-40ed-a22a-f8abd1a4016b","Type":"ContainerStarted","Data":"9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f"}
Apr 24 21:48:26.803571 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.803525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" event={"ID":"9a2cafa0-f418-40ed-a22a-f8abd1a4016b","Type":"ContainerStarted","Data":"82be8e1f0a28d5fa13a5caf677ddd998a1408fb9c0ea0732df39a2c97acfdacd"}
Apr 24 21:48:26.803884 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.803651 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:26.805136 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.805113 2573 generic.go:358] "Generic (PLEG): container finished" podID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerID="950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986" exitCode=2
Apr 24 21:48:26.805250 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.805169 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" event={"ID":"6a382bf3-3608-40fd-ab84-8604e941a8c3","Type":"ContainerDied","Data":"950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986"}
Apr 24 21:48:26.806624 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.806599 2573 generic.go:358] "Generic (PLEG): container finished" podID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerID="7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d" exitCode=2
Apr 24 21:48:26.806745 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.806673 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" event={"ID":"aa940d38-3c6c-4e99-8e9b-95aa2b971edb","Type":"ContainerDied","Data":"7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d"}
Apr 24 21:48:26.828722 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.828679 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" podStartSLOduration=0.828668113 podStartE2EDuration="828.668113ms" podCreationTimestamp="2026-04-24 21:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:48:26.82653185 +0000 UTC m=+1292.106379630" watchObservedRunningTime="2026-04-24 21:48:26.828668113 +0000 UTC m=+1292.108515971"
Apr 24 21:48:26.935752 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:26.935719 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:27.070270 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:27.070245 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"]
Apr 24 21:48:27.072084 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:48:27.072044 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95c6330c_680a_4894_ab45_4566cb30ef16.slice/crio-b0cea1e865f3220338586b49158b4a0e617990272d621693a3a4ed9ee5a276ac WatchSource:0}: Error finding container b0cea1e865f3220338586b49158b4a0e617990272d621693a3a4ed9ee5a276ac: Status 404 returned error can't find the container with id b0cea1e865f3220338586b49158b4a0e617990272d621693a3a4ed9ee5a276ac
Apr 24 21:48:27.812095 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:27.812056 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" event={"ID":"95c6330c-680a-4894-ab45-4566cb30ef16","Type":"ContainerStarted","Data":"c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b"}
Apr 24 21:48:27.812095 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:27.812098 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" event={"ID":"95c6330c-680a-4894-ab45-4566cb30ef16","Type":"ContainerStarted","Data":"14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745"}
Apr 24 21:48:27.812346 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:27.812112 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" event={"ID":"95c6330c-680a-4894-ab45-4566cb30ef16","Type":"ContainerStarted","Data":"b0cea1e865f3220338586b49158b4a0e617990272d621693a3a4ed9ee5a276ac"}
Apr 24 21:48:27.812346 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:27.812161 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:27.812445 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:27.812427 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"
Apr 24 21:48:27.813775 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:27.813753 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 24 21:48:27.832086 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:27.832046 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" podStartSLOduration=2.832033504 podStartE2EDuration="2.832033504s" podCreationTimestamp="2026-04-24 21:48:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:48:27.829968797 +0000 UTC m=+1293.109816577" watchObservedRunningTime="2026-04-24 21:48:27.832033504 +0000 UTC m=+1293.111881261"
Apr 24 21:48:27.883630 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:27.883577 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused"
Apr 24 21:48:27.888947 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:27.888881 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 24 21:48:28.815649 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:28.815608 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 24 21:48:28.816117 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:28.815721 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"
Apr 24 21:48:28.816845 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:28.816822 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 24 21:48:29.819139 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:29.819095 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 24 21:48:29.890841 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:29.890802 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.41:8643/healthz\": dial tcp 10.134.0.41:8643: connect: connection refused"
Apr 24 21:48:29.894666 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:29.894643 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 21:48:30.041413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.041356 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"
Apr 24 21:48:30.103508 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.103473 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zlc4\" (UniqueName: \"kubernetes.io/projected/6a382bf3-3608-40fd-ab84-8604e941a8c3-kube-api-access-4zlc4\") pod \"6a382bf3-3608-40fd-ab84-8604e941a8c3\" (UID: \"6a382bf3-3608-40fd-ab84-8604e941a8c3\") "
Apr 24 21:48:30.103508 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.103512 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a382bf3-3608-40fd-ab84-8604e941a8c3-proxy-tls\") pod \"6a382bf3-3608-40fd-ab84-8604e941a8c3\" (UID: \"6a382bf3-3608-40fd-ab84-8604e941a8c3\") "
Apr 24 21:48:30.103702 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.103542 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-7fc99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a382bf3-3608-40fd-ab84-8604e941a8c3-success-200-isvc-7fc99-kube-rbac-proxy-sar-config\") pod \"6a382bf3-3608-40fd-ab84-8604e941a8c3\" (UID: \"6a382bf3-3608-40fd-ab84-8604e941a8c3\") "
Apr 24 21:48:30.103966 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.103907 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a382bf3-3608-40fd-ab84-8604e941a8c3-success-200-isvc-7fc99-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-7fc99-kube-rbac-proxy-sar-config") pod "6a382bf3-3608-40fd-ab84-8604e941a8c3" (UID: "6a382bf3-3608-40fd-ab84-8604e941a8c3"). InnerVolumeSpecName "success-200-isvc-7fc99-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:48:30.105482 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.105458 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a382bf3-3608-40fd-ab84-8604e941a8c3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6a382bf3-3608-40fd-ab84-8604e941a8c3" (UID: "6a382bf3-3608-40fd-ab84-8604e941a8c3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:48:30.105604 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.105584 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a382bf3-3608-40fd-ab84-8604e941a8c3-kube-api-access-4zlc4" (OuterVolumeSpecName: "kube-api-access-4zlc4") pod "6a382bf3-3608-40fd-ab84-8604e941a8c3" (UID: "6a382bf3-3608-40fd-ab84-8604e941a8c3"). InnerVolumeSpecName "kube-api-access-4zlc4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:48:30.204488 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.204454 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4zlc4\" (UniqueName: \"kubernetes.io/projected/6a382bf3-3608-40fd-ab84-8604e941a8c3-kube-api-access-4zlc4\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:48:30.204488 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.204480 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a382bf3-3608-40fd-ab84-8604e941a8c3-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:48:30.204488 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.204492 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-7fc99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a382bf3-3608-40fd-ab84-8604e941a8c3-success-200-isvc-7fc99-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:48:30.506942 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.506890 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"
Apr 24 21:48:30.607502 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.607470 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-7fc99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-error-404-isvc-7fc99-kube-rbac-proxy-sar-config\") pod \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") "
Apr 24 21:48:30.607685 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.607531 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2cws\" (UniqueName: \"kubernetes.io/projected/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-kube-api-access-g2cws\") pod \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") "
Apr 24 21:48:30.607685 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.607598 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-proxy-tls\") pod \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\" (UID: \"aa940d38-3c6c-4e99-8e9b-95aa2b971edb\") "
Apr 24 21:48:30.607884 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.607851 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-error-404-isvc-7fc99-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-7fc99-kube-rbac-proxy-sar-config") pod "aa940d38-3c6c-4e99-8e9b-95aa2b971edb" (UID: "aa940d38-3c6c-4e99-8e9b-95aa2b971edb"). InnerVolumeSpecName "error-404-isvc-7fc99-kube-rbac-proxy-sar-config".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:48:30.609527 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.609505 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-kube-api-access-g2cws" (OuterVolumeSpecName: "kube-api-access-g2cws") pod "aa940d38-3c6c-4e99-8e9b-95aa2b971edb" (UID: "aa940d38-3c6c-4e99-8e9b-95aa2b971edb"). InnerVolumeSpecName "kube-api-access-g2cws". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:48:30.609602 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.609511 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "aa940d38-3c6c-4e99-8e9b-95aa2b971edb" (UID: "aa940d38-3c6c-4e99-8e9b-95aa2b971edb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:48:30.708589 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.708482 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g2cws\" (UniqueName: \"kubernetes.io/projected/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-kube-api-access-g2cws\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:48:30.708589 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.708529 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:48:30.708589 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.708544 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-7fc99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aa940d38-3c6c-4e99-8e9b-95aa2b971edb-error-404-isvc-7fc99-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 
21:48:30.823778 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.823745 2573 generic.go:358] "Generic (PLEG): container finished" podID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerID="4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80" exitCode=0 Apr 24 21:48:30.824215 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.823809 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" event={"ID":"6a382bf3-3608-40fd-ab84-8604e941a8c3","Type":"ContainerDied","Data":"4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80"} Apr 24 21:48:30.824215 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.823817 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" Apr 24 21:48:30.824215 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.823843 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6" event={"ID":"6a382bf3-3608-40fd-ab84-8604e941a8c3","Type":"ContainerDied","Data":"7ed1a9d6f7e537dce689669e60d2debb9f161945e1515fe6b011fb470898d6a7"} Apr 24 21:48:30.824215 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.823864 2573 scope.go:117] "RemoveContainer" containerID="950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986" Apr 24 21:48:30.825276 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.825246 2573 generic.go:358] "Generic (PLEG): container finished" podID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerID="b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d" exitCode=0 Apr 24 21:48:30.825391 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.825326 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" Apr 24 21:48:30.825391 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.825338 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" event={"ID":"aa940d38-3c6c-4e99-8e9b-95aa2b971edb","Type":"ContainerDied","Data":"b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d"} Apr 24 21:48:30.825391 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.825373 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n" event={"ID":"aa940d38-3c6c-4e99-8e9b-95aa2b971edb","Type":"ContainerDied","Data":"11e6e0c24da90a2e004817d3e574b0d02295ba9d99a8a740ad013ee028aa643f"} Apr 24 21:48:30.842244 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.842212 2573 scope.go:117] "RemoveContainer" containerID="4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80" Apr 24 21:48:30.852515 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.852488 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"] Apr 24 21:48:30.854789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.854765 2573 scope.go:117] "RemoveContainer" containerID="950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986" Apr 24 21:48:30.855166 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:48:30.855134 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986\": container with ID starting with 950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986 not found: ID does not exist" containerID="950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986" Apr 24 21:48:30.855233 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.855178 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986"} err="failed to get container status \"950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986\": rpc error: code = NotFound desc = could not find container \"950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986\": container with ID starting with 950c88b2b7c0470a51860159c29cfb6e8979d9764bb24fd84a8cd8f093954986 not found: ID does not exist" Apr 24 21:48:30.855233 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.855203 2573 scope.go:117] "RemoveContainer" containerID="4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80" Apr 24 21:48:30.855459 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.855439 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6"] Apr 24 21:48:30.855502 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:48:30.855445 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80\": container with ID starting with 4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80 not found: ID does not exist" containerID="4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80" Apr 24 21:48:30.855538 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.855493 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80"} err="failed to get container status \"4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80\": rpc error: code = NotFound desc = could not find container \"4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80\": container with ID starting with 
4bfc24437c4376c81731acbbdb85305515edf7f675f4bda189914a469a185a80 not found: ID does not exist" Apr 24 21:48:30.855538 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.855516 2573 scope.go:117] "RemoveContainer" containerID="7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d" Apr 24 21:48:30.863058 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.863038 2573 scope.go:117] "RemoveContainer" containerID="b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d" Apr 24 21:48:30.866642 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.866596 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"] Apr 24 21:48:30.868649 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.868624 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n"] Apr 24 21:48:30.872930 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.872898 2573 scope.go:117] "RemoveContainer" containerID="7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d" Apr 24 21:48:30.873294 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:48:30.873275 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d\": container with ID starting with 7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d not found: ID does not exist" containerID="7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d" Apr 24 21:48:30.873344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.873303 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d"} err="failed to get container status \"7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d\": rpc error: code = NotFound desc = could not find 
container \"7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d\": container with ID starting with 7899116f6ec0c947f89dad89825f023c270bba85988c7ca63850a14b414c3b4d not found: ID does not exist" Apr 24 21:48:30.873344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.873322 2573 scope.go:117] "RemoveContainer" containerID="b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d" Apr 24 21:48:30.873589 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:48:30.873569 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d\": container with ID starting with b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d not found: ID does not exist" containerID="b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d" Apr 24 21:48:30.873646 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:30.873594 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d"} err="failed to get container status \"b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d\": rpc error: code = NotFound desc = could not find container \"b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d\": container with ID starting with b074d16f40dccdfb5761a57e98e88af3da1583daef1475adf81af9bcc62ade2d not found: ID does not exist" Apr 24 21:48:31.368555 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:31.368524 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" path="/var/lib/kubelet/pods/6a382bf3-3608-40fd-ab84-8604e941a8c3/volumes" Apr 24 21:48:31.368986 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:31.368972 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" 
path="/var/lib/kubelet/pods/aa940d38-3c6c-4e99-8e9b-95aa2b971edb/volumes" Apr 24 21:48:33.820191 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:33.820164 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" Apr 24 21:48:33.820789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:33.820761 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 21:48:34.668064 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:34.668033 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" Apr 24 21:48:34.668791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:34.668775 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" Apr 24 21:48:34.823504 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:34.823471 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" Apr 24 21:48:34.824048 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:34.824018 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 21:48:43.820750 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:43.820699 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" 
podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 21:48:44.824041 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:44.824001 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 21:48:53.820720 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:53.820675 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 21:48:54.823969 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:54.823906 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 21:48:56.464094 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.464059 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"] Apr 24 21:48:56.464479 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.464410 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kserve-container" containerID="cri-o://799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c" gracePeriod=30 Apr 24 21:48:56.464479 ip-10-0-136-201 kubenswrapper[2573]: I0424 
21:48:56.464442 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kube-rbac-proxy" containerID="cri-o://cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f" gracePeriod=30 Apr 24 21:48:56.520751 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.520711 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq"] Apr 24 21:48:56.521366 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.521345 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kube-rbac-proxy" Apr 24 21:48:56.521366 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.521366 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kube-rbac-proxy" Apr 24 21:48:56.521509 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.521392 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kserve-container" Apr 24 21:48:56.521509 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.521401 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kserve-container" Apr 24 21:48:56.521509 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.521417 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kube-rbac-proxy" Apr 24 21:48:56.521509 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.521426 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kube-rbac-proxy" Apr 24 21:48:56.521509 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.521449 2573 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kserve-container" Apr 24 21:48:56.521509 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.521457 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kserve-container" Apr 24 21:48:56.521724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.521536 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kube-rbac-proxy" Apr 24 21:48:56.521724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.521550 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kserve-container" Apr 24 21:48:56.521724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.521564 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a382bf3-3608-40fd-ab84-8604e941a8c3" containerName="kserve-container" Apr 24 21:48:56.521724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.521576 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa940d38-3c6c-4e99-8e9b-95aa2b971edb" containerName="kube-rbac-proxy" Apr 24 21:48:56.525551 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.525526 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:56.528065 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.528043 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-1dabb-kube-rbac-proxy-sar-config\"" Apr 24 21:48:56.528349 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.528316 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-1dabb-predictor-serving-cert\"" Apr 24 21:48:56.535324 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.535301 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq"] Apr 24 21:48:56.543705 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.543682 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"] Apr 24 21:48:56.543962 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.543940 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kserve-container" containerID="cri-o://5ad4556e3e9d3f68efb89cc32b57f8fce569a523efdc621ba3103a2dbf9bad02" gracePeriod=30 Apr 24 21:48:56.544025 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.543999 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kube-rbac-proxy" containerID="cri-o://d7182e538620ee83c0d735dc9e1c27b12429a38b861d053e32369b731cddf2bf" gracePeriod=30 Apr 24 21:48:56.632635 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.632594 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx"] Apr 24 21:48:56.636722 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.636696 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:56.639303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.639281 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-1dabb-predictor-serving-cert\"" Apr 24 21:48:56.639413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.639392 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7drvt\" (UniqueName: \"kubernetes.io/projected/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-kube-api-access-7drvt\") pod \"success-200-isvc-1dabb-predictor-68d6866787-m6hgq\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:56.639484 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.639449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-1dabb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-success-200-isvc-1dabb-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1dabb-predictor-68d6866787-m6hgq\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:56.639535 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.639507 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-1dabb-kube-rbac-proxy-sar-config\"" Apr 24 21:48:56.639585 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.639540 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-proxy-tls\") pod \"success-200-isvc-1dabb-predictor-68d6866787-m6hgq\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:56.646839 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.646818 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx"] Apr 24 21:48:56.740880 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.740798 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ggpj\" (UniqueName: \"kubernetes.io/projected/ea3669b0-a0cd-418f-9778-36c1b7855246-kube-api-access-2ggpj\") pod \"error-404-isvc-1dabb-predictor-64d84476b8-kbgzx\" (UID: \"ea3669b0-a0cd-418f-9778-36c1b7855246\") " pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:56.741066 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.740907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7drvt\" (UniqueName: \"kubernetes.io/projected/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-kube-api-access-7drvt\") pod \"success-200-isvc-1dabb-predictor-68d6866787-m6hgq\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:56.741066 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.740976 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-1dabb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-success-200-isvc-1dabb-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1dabb-predictor-68d6866787-m6hgq\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 
21:48:56.741066 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.741008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-proxy-tls\") pod \"success-200-isvc-1dabb-predictor-68d6866787-m6hgq\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:56.741066 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.741051 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-1dabb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea3669b0-a0cd-418f-9778-36c1b7855246-error-404-isvc-1dabb-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1dabb-predictor-64d84476b8-kbgzx\" (UID: \"ea3669b0-a0cd-418f-9778-36c1b7855246\") " pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:56.741286 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:48:56.741145 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-serving-cert: secret "success-200-isvc-1dabb-predictor-serving-cert" not found Apr 24 21:48:56.741286 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.741180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea3669b0-a0cd-418f-9778-36c1b7855246-proxy-tls\") pod \"error-404-isvc-1dabb-predictor-64d84476b8-kbgzx\" (UID: \"ea3669b0-a0cd-418f-9778-36c1b7855246\") " pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:56.741286 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:48:56.741209 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-proxy-tls podName:eda051e5-7bbf-41a9-ad1b-c9d1e5368798 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:48:57.241190275 +0000 UTC m=+1322.521038038 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-proxy-tls") pod "success-200-isvc-1dabb-predictor-68d6866787-m6hgq" (UID: "eda051e5-7bbf-41a9-ad1b-c9d1e5368798") : secret "success-200-isvc-1dabb-predictor-serving-cert" not found Apr 24 21:48:56.741704 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.741680 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-1dabb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-success-200-isvc-1dabb-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1dabb-predictor-68d6866787-m6hgq\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:56.756509 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.756455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7drvt\" (UniqueName: \"kubernetes.io/projected/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-kube-api-access-7drvt\") pod \"success-200-isvc-1dabb-predictor-68d6866787-m6hgq\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:56.842361 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.842327 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ggpj\" (UniqueName: \"kubernetes.io/projected/ea3669b0-a0cd-418f-9778-36c1b7855246-kube-api-access-2ggpj\") pod \"error-404-isvc-1dabb-predictor-64d84476b8-kbgzx\" (UID: \"ea3669b0-a0cd-418f-9778-36c1b7855246\") " pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:56.842520 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.842409 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"error-404-isvc-1dabb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea3669b0-a0cd-418f-9778-36c1b7855246-error-404-isvc-1dabb-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1dabb-predictor-64d84476b8-kbgzx\" (UID: \"ea3669b0-a0cd-418f-9778-36c1b7855246\") " pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:56.842520 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.842440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea3669b0-a0cd-418f-9778-36c1b7855246-proxy-tls\") pod \"error-404-isvc-1dabb-predictor-64d84476b8-kbgzx\" (UID: \"ea3669b0-a0cd-418f-9778-36c1b7855246\") " pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:56.843283 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.843251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-1dabb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea3669b0-a0cd-418f-9778-36c1b7855246-error-404-isvc-1dabb-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1dabb-predictor-64d84476b8-kbgzx\" (UID: \"ea3669b0-a0cd-418f-9778-36c1b7855246\") " pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:56.844898 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.844876 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea3669b0-a0cd-418f-9778-36c1b7855246-proxy-tls\") pod \"error-404-isvc-1dabb-predictor-64d84476b8-kbgzx\" (UID: \"ea3669b0-a0cd-418f-9778-36c1b7855246\") " pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:56.850895 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.850867 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ggpj\" (UniqueName: 
\"kubernetes.io/projected/ea3669b0-a0cd-418f-9778-36c1b7855246-kube-api-access-2ggpj\") pod \"error-404-isvc-1dabb-predictor-64d84476b8-kbgzx\" (UID: \"ea3669b0-a0cd-418f-9778-36c1b7855246\") " pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:56.920372 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.920340 2573 generic.go:358] "Generic (PLEG): container finished" podID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerID="d7182e538620ee83c0d735dc9e1c27b12429a38b861d053e32369b731cddf2bf" exitCode=2 Apr 24 21:48:56.920538 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.920359 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" event={"ID":"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0","Type":"ContainerDied","Data":"d7182e538620ee83c0d735dc9e1c27b12429a38b861d053e32369b731cddf2bf"} Apr 24 21:48:56.921897 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.921876 2573 generic.go:358] "Generic (PLEG): container finished" podID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerID="cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f" exitCode=2 Apr 24 21:48:56.922007 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.921946 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" event={"ID":"550706c4-48b4-456b-bd40-364ec1b3d86d","Type":"ContainerDied","Data":"cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f"} Apr 24 21:48:56.949388 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:56.949364 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:57.245579 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.245546 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-proxy-tls\") pod \"success-200-isvc-1dabb-predictor-68d6866787-m6hgq\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:57.247943 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.247906 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-proxy-tls\") pod \"success-200-isvc-1dabb-predictor-68d6866787-m6hgq\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:57.296223 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.296191 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx"] Apr 24 21:48:57.299364 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:48:57.299339 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea3669b0_a0cd_418f_9778_36c1b7855246.slice/crio-59c862b99d7f33ae99fd4ba63f80ded169c83045be5223daccc662c11e51b88c WatchSource:0}: Error finding container 59c862b99d7f33ae99fd4ba63f80ded169c83045be5223daccc662c11e51b88c: Status 404 returned error can't find the container with id 59c862b99d7f33ae99fd4ba63f80ded169c83045be5223daccc662c11e51b88c Apr 24 21:48:57.440190 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.440157 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:57.573811 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.573783 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq"] Apr 24 21:48:57.576115 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:48:57.576087 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeda051e5_7bbf_41a9_ad1b_c9d1e5368798.slice/crio-da265a018bb05713c62f68c7b55819c398079c2c64b9c03cee0173a92d191e2b WatchSource:0}: Error finding container da265a018bb05713c62f68c7b55819c398079c2c64b9c03cee0173a92d191e2b: Status 404 returned error can't find the container with id da265a018bb05713c62f68c7b55819c398079c2c64b9c03cee0173a92d191e2b Apr 24 21:48:57.927187 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.927089 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" event={"ID":"eda051e5-7bbf-41a9-ad1b-c9d1e5368798","Type":"ContainerStarted","Data":"dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05"} Apr 24 21:48:57.927187 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.927141 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" event={"ID":"eda051e5-7bbf-41a9-ad1b-c9d1e5368798","Type":"ContainerStarted","Data":"57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061"} Apr 24 21:48:57.927187 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.927157 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" event={"ID":"eda051e5-7bbf-41a9-ad1b-c9d1e5368798","Type":"ContainerStarted","Data":"da265a018bb05713c62f68c7b55819c398079c2c64b9c03cee0173a92d191e2b"} Apr 24 21:48:57.927436 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.927240 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:57.928756 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.928729 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" event={"ID":"ea3669b0-a0cd-418f-9778-36c1b7855246","Type":"ContainerStarted","Data":"91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa"} Apr 24 21:48:57.928872 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.928761 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" event={"ID":"ea3669b0-a0cd-418f-9778-36c1b7855246","Type":"ContainerStarted","Data":"a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a"} Apr 24 21:48:57.928872 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.928775 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" event={"ID":"ea3669b0-a0cd-418f-9778-36c1b7855246","Type":"ContainerStarted","Data":"59c862b99d7f33ae99fd4ba63f80ded169c83045be5223daccc662c11e51b88c"} Apr 24 21:48:57.928872 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.928793 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:57.928872 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.928806 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:48:57.930326 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.930300 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" 
podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 21:48:57.949448 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.949403 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" podStartSLOduration=1.9493865989999999 podStartE2EDuration="1.949386599s" podCreationTimestamp="2026-04-24 21:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:48:57.947587077 +0000 UTC m=+1323.227434880" watchObservedRunningTime="2026-04-24 21:48:57.949386599 +0000 UTC m=+1323.229234379" Apr 24 21:48:57.972455 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:57.972416 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" podStartSLOduration=1.972402846 podStartE2EDuration="1.972402846s" podCreationTimestamp="2026-04-24 21:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:48:57.971145131 +0000 UTC m=+1323.250992904" watchObservedRunningTime="2026-04-24 21:48:57.972402846 +0000 UTC m=+1323.252250626" Apr 24 21:48:58.932789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:58.932742 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 21:48:58.933227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:58.932844 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:48:58.934215 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:58.934188 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 21:48:59.663077 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:59.663026 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.42:8643/healthz\": dial tcp 10.134.0.42:8643: connect: connection refused" Apr 24 21:48:59.663227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:59.663026 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.43:8643/healthz\": dial tcp 10.134.0.43:8643: connect: connection refused" Apr 24 21:48:59.938398 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:59.938322 2573 generic.go:358] "Generic (PLEG): container finished" podID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerID="5ad4556e3e9d3f68efb89cc32b57f8fce569a523efdc621ba3103a2dbf9bad02" exitCode=0 Apr 24 21:48:59.938748 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:59.938397 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" event={"ID":"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0","Type":"ContainerDied","Data":"5ad4556e3e9d3f68efb89cc32b57f8fce569a523efdc621ba3103a2dbf9bad02"} Apr 24 21:48:59.938813 ip-10-0-136-201 kubenswrapper[2573]: I0424 
21:48:59.938784 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 21:49:00.000467 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.000436 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" Apr 24 21:49:00.171194 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.171160 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-proxy-tls\") pod \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " Apr 24 21:49:00.171363 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.171219 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-2525b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-error-404-isvc-2525b-kube-rbac-proxy-sar-config\") pod \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " Apr 24 21:49:00.171363 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.171335 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkml8\" (UniqueName: \"kubernetes.io/projected/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-kube-api-access-hkml8\") pod \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\" (UID: \"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0\") " Apr 24 21:49:00.171600 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.171577 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-error-404-isvc-2525b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-2525b-kube-rbac-proxy-sar-config") pod "1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" (UID: "1dfe4c57-e300-4bce-88f5-2f3d90e17cc0"). InnerVolumeSpecName "error-404-isvc-2525b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:49:00.173334 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.173313 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-kube-api-access-hkml8" (OuterVolumeSpecName: "kube-api-access-hkml8") pod "1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" (UID: "1dfe4c57-e300-4bce-88f5-2f3d90e17cc0"). InnerVolumeSpecName "kube-api-access-hkml8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:49:00.173397 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.173365 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" (UID: "1dfe4c57-e300-4bce-88f5-2f3d90e17cc0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:49:00.272296 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.272263 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-2525b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-error-404-isvc-2525b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:49:00.272296 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.272294 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hkml8\" (UniqueName: \"kubernetes.io/projected/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-kube-api-access-hkml8\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:49:00.272502 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.272308 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:49:00.508521 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.508497 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" Apr 24 21:49:00.675281 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.675200 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwvvg\" (UniqueName: \"kubernetes.io/projected/550706c4-48b4-456b-bd40-364ec1b3d86d-kube-api-access-kwvvg\") pod \"550706c4-48b4-456b-bd40-364ec1b3d86d\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " Apr 24 21:49:00.675281 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.675269 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/550706c4-48b4-456b-bd40-364ec1b3d86d-proxy-tls\") pod \"550706c4-48b4-456b-bd40-364ec1b3d86d\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " Apr 24 21:49:00.675471 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.675331 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-2525b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/550706c4-48b4-456b-bd40-364ec1b3d86d-success-200-isvc-2525b-kube-rbac-proxy-sar-config\") pod \"550706c4-48b4-456b-bd40-364ec1b3d86d\" (UID: \"550706c4-48b4-456b-bd40-364ec1b3d86d\") " Apr 24 21:49:00.675707 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.675674 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/550706c4-48b4-456b-bd40-364ec1b3d86d-success-200-isvc-2525b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-2525b-kube-rbac-proxy-sar-config") pod "550706c4-48b4-456b-bd40-364ec1b3d86d" (UID: "550706c4-48b4-456b-bd40-364ec1b3d86d"). InnerVolumeSpecName "success-200-isvc-2525b-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:49:00.677258 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.677238 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/550706c4-48b4-456b-bd40-364ec1b3d86d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "550706c4-48b4-456b-bd40-364ec1b3d86d" (UID: "550706c4-48b4-456b-bd40-364ec1b3d86d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:49:00.677329 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.677252 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550706c4-48b4-456b-bd40-364ec1b3d86d-kube-api-access-kwvvg" (OuterVolumeSpecName: "kube-api-access-kwvvg") pod "550706c4-48b4-456b-bd40-364ec1b3d86d" (UID: "550706c4-48b4-456b-bd40-364ec1b3d86d"). InnerVolumeSpecName "kube-api-access-kwvvg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:49:00.776671 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.776628 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/550706c4-48b4-456b-bd40-364ec1b3d86d-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:49:00.776671 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.776667 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-2525b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/550706c4-48b4-456b-bd40-364ec1b3d86d-success-200-isvc-2525b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:49:00.776671 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.776678 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kwvvg\" (UniqueName: \"kubernetes.io/projected/550706c4-48b4-456b-bd40-364ec1b3d86d-kube-api-access-kwvvg\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 
21:49:00.942989 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.942877 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" event={"ID":"1dfe4c57-e300-4bce-88f5-2f3d90e17cc0","Type":"ContainerDied","Data":"bc07db888951388d49a4ec1bd7354cec888ef8d299015f6da7777584933d5d8c"} Apr 24 21:49:00.942989 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.942887 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277" Apr 24 21:49:00.942989 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.942950 2573 scope.go:117] "RemoveContainer" containerID="d7182e538620ee83c0d735dc9e1c27b12429a38b861d053e32369b731cddf2bf" Apr 24 21:49:00.944511 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.944485 2573 generic.go:358] "Generic (PLEG): container finished" podID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerID="799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c" exitCode=0 Apr 24 21:49:00.944610 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.944533 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" event={"ID":"550706c4-48b4-456b-bd40-364ec1b3d86d","Type":"ContainerDied","Data":"799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c"} Apr 24 21:49:00.944610 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.944551 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" Apr 24 21:49:00.944610 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.944558 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq" event={"ID":"550706c4-48b4-456b-bd40-364ec1b3d86d","Type":"ContainerDied","Data":"bb0f4f00b5dc15e40b2bb57047183349ce13473e717362a8df81af9763c552c3"} Apr 24 21:49:00.951599 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.951574 2573 scope.go:117] "RemoveContainer" containerID="5ad4556e3e9d3f68efb89cc32b57f8fce569a523efdc621ba3103a2dbf9bad02" Apr 24 21:49:00.959444 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.959429 2573 scope.go:117] "RemoveContainer" containerID="cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f" Apr 24 21:49:00.966788 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.966772 2573 scope.go:117] "RemoveContainer" containerID="799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c" Apr 24 21:49:00.974184 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.974159 2573 scope.go:117] "RemoveContainer" containerID="cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f" Apr 24 21:49:00.974540 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:49:00.974504 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f\": container with ID starting with cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f not found: ID does not exist" containerID="cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f" Apr 24 21:49:00.974616 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.974541 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f"} err="failed 
to get container status \"cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f\": rpc error: code = NotFound desc = could not find container \"cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f\": container with ID starting with cfa8a196fd0b3a67b5db1f4abc43040aac9e648c8f612d7279daeac64017875f not found: ID does not exist" Apr 24 21:49:00.974616 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.974566 2573 scope.go:117] "RemoveContainer" containerID="799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c" Apr 24 21:49:00.974899 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:49:00.974863 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c\": container with ID starting with 799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c not found: ID does not exist" containerID="799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c" Apr 24 21:49:00.975020 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.974904 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c"} err="failed to get container status \"799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c\": rpc error: code = NotFound desc = could not find container \"799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c\": container with ID starting with 799512c106f041b31482b97e854f3a070452a64bb4014d35b92286a6dc9c815c not found: ID does not exist" Apr 24 21:49:00.977408 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.977386 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"] Apr 24 21:49:00.983566 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:00.983537 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq"] Apr 24 21:49:01.009746 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:01.009717 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"] Apr 24 21:49:01.019103 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:01.019075 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277"] Apr 24 21:49:01.368814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:01.368781 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" path="/var/lib/kubelet/pods/1dfe4c57-e300-4bce-88f5-2f3d90e17cc0/volumes" Apr 24 21:49:01.369256 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:01.369242 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" path="/var/lib/kubelet/pods/550706c4-48b4-456b-bd40-364ec1b3d86d/volumes" Apr 24 21:49:03.821433 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:03.821391 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 21:49:03.936757 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:03.936729 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:49:03.937279 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:03.937249 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: 
connect: connection refused" Apr 24 21:49:04.824466 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:04.824422 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 21:49:04.943665 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:04.943632 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:49:04.944251 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:04.944216 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 21:49:13.821546 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:13.821463 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" Apr 24 21:49:13.937354 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:13.937316 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 21:49:14.824783 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:14.824752 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" Apr 24 21:49:14.944156 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:14.944113 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 21:49:23.937487 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:23.937441 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 21:49:24.944893 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:24.944849 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 21:49:33.937737 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:33.937697 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 21:49:34.945189 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:34.945147 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 21:49:36.158023 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.157987 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"] Apr 24 21:49:36.158488 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.158388 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kserve-container" containerID="cri-o://9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f" gracePeriod=30 Apr 24 21:49:36.158488 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.158427 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kube-rbac-proxy" containerID="cri-o://d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5" gracePeriod=30 Apr 24 21:49:36.218681 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.218637 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"] Apr 24 21:49:36.219039 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.218963 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kserve-container" containerID="cri-o://14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745" gracePeriod=30 Apr 24 21:49:36.219221 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.218991 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kube-rbac-proxy" containerID="cri-o://c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b" gracePeriod=30 Apr 24 21:49:36.235346 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235318 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms"] Apr 24 21:49:36.235704 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235691 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kserve-container" Apr 24 21:49:36.235704 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235705 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kserve-container" Apr 24 21:49:36.235808 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235717 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kube-rbac-proxy" Apr 24 21:49:36.235808 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235723 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kube-rbac-proxy" Apr 24 21:49:36.235808 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235738 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kserve-container" Apr 24 21:49:36.235808 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235743 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kserve-container" Apr 24 21:49:36.235808 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235754 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kube-rbac-proxy" Apr 24 21:49:36.235808 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235759 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kube-rbac-proxy" Apr 24 21:49:36.236078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235814 2573 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kube-rbac-proxy" Apr 24 21:49:36.236078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235827 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1dfe4c57-e300-4bce-88f5-2f3d90e17cc0" containerName="kserve-container" Apr 24 21:49:36.236078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235837 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kube-rbac-proxy" Apr 24 21:49:36.236078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.235846 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="550706c4-48b4-456b-bd40-364ec1b3d86d" containerName="kserve-container" Apr 24 21:49:36.240833 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.240796 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:36.242892 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.242872 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-1ec3f-predictor-serving-cert\"" Apr 24 21:49:36.243035 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.243004 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-1ec3f-kube-rbac-proxy-sar-config\"" Apr 24 21:49:36.247333 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.247311 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms"] Apr 24 21:49:36.328803 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.328773 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx"] Apr 24 21:49:36.332431 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.332407 2573 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:36.335632 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.335610 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-1ec3f-kube-rbac-proxy-sar-config\"" Apr 24 21:49:36.335749 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.335610 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-1ec3f-predictor-serving-cert\"" Apr 24 21:49:36.341632 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.341596 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx"] Apr 24 21:49:36.414002 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.409066 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59803781-86f5-4c54-91c2-18922795d983-proxy-tls\") pod \"success-200-isvc-1ec3f-predictor-677b5997f5-brmms\" (UID: \"59803781-86f5-4c54-91c2-18922795d983\") " pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:36.414002 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.409140 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4csml\" (UniqueName: \"kubernetes.io/projected/59803781-86f5-4c54-91c2-18922795d983-kube-api-access-4csml\") pod \"success-200-isvc-1ec3f-predictor-677b5997f5-brmms\" (UID: \"59803781-86f5-4c54-91c2-18922795d983\") " pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:36.414002 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.409243 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"success-200-isvc-1ec3f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59803781-86f5-4c54-91c2-18922795d983-success-200-isvc-1ec3f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1ec3f-predictor-677b5997f5-brmms\" (UID: \"59803781-86f5-4c54-91c2-18922795d983\") " pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:36.510138 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.510099 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-proxy-tls\") pod \"error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") " pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:36.510138 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.510145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-1ec3f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59803781-86f5-4c54-91c2-18922795d983-success-200-isvc-1ec3f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1ec3f-predictor-677b5997f5-brmms\" (UID: \"59803781-86f5-4c54-91c2-18922795d983\") " pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:36.510408 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.510256 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59803781-86f5-4c54-91c2-18922795d983-proxy-tls\") pod \"success-200-isvc-1ec3f-predictor-677b5997f5-brmms\" (UID: \"59803781-86f5-4c54-91c2-18922795d983\") " pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:36.510408 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.510325 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4csml\" 
(UniqueName: \"kubernetes.io/projected/59803781-86f5-4c54-91c2-18922795d983-kube-api-access-4csml\") pod \"success-200-isvc-1ec3f-predictor-677b5997f5-brmms\" (UID: \"59803781-86f5-4c54-91c2-18922795d983\") " pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:36.510408 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.510371 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvdqc\" (UniqueName: \"kubernetes.io/projected/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-kube-api-access-dvdqc\") pod \"error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") " pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:36.510570 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.510432 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-1ec3f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-error-404-isvc-1ec3f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") " pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:36.510866 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.510846 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-1ec3f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59803781-86f5-4c54-91c2-18922795d983-success-200-isvc-1ec3f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1ec3f-predictor-677b5997f5-brmms\" (UID: \"59803781-86f5-4c54-91c2-18922795d983\") " pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:36.512797 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.512772 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59803781-86f5-4c54-91c2-18922795d983-proxy-tls\") pod \"success-200-isvc-1ec3f-predictor-677b5997f5-brmms\" (UID: \"59803781-86f5-4c54-91c2-18922795d983\") " pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:36.518186 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.518162 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4csml\" (UniqueName: \"kubernetes.io/projected/59803781-86f5-4c54-91c2-18922795d983-kube-api-access-4csml\") pod \"success-200-isvc-1ec3f-predictor-677b5997f5-brmms\" (UID: \"59803781-86f5-4c54-91c2-18922795d983\") " pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:36.553179 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.553151 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:36.611087 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.611040 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvdqc\" (UniqueName: \"kubernetes.io/projected/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-kube-api-access-dvdqc\") pod \"error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") " pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:36.611281 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.611125 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-1ec3f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-error-404-isvc-1ec3f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") " pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:36.611281 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.611180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-proxy-tls\") pod \"error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") " pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:36.611790 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:49:36.611428 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-serving-cert: secret "error-404-isvc-1ec3f-predictor-serving-cert" not found Apr 24 21:49:36.611790 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:49:36.611510 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-proxy-tls podName:22ee0cfe-06f5-47f0-99f6-a77f434afb7c nodeName:}" failed. No retries permitted until 2026-04-24 21:49:37.111486045 +0000 UTC m=+1362.391333806 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-proxy-tls") pod "error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" (UID: "22ee0cfe-06f5-47f0-99f6-a77f434afb7c") : secret "error-404-isvc-1ec3f-predictor-serving-cert" not found Apr 24 21:49:36.612556 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.612501 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-1ec3f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-error-404-isvc-1ec3f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") " pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:36.619874 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.619849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvdqc\" (UniqueName: \"kubernetes.io/projected/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-kube-api-access-dvdqc\") pod \"error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") " pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:36.682684 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:36.682593 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms"] Apr 24 21:49:36.693703 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:49:36.693657 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59803781_86f5_4c54_91c2_18922795d983.slice/crio-0d3d4ef47b23648595aa5b47d55e474cc0074c0ac58d91fb1061d1db5a607cd2 WatchSource:0}: Error finding container 0d3d4ef47b23648595aa5b47d55e474cc0074c0ac58d91fb1061d1db5a607cd2: Status 404 returned error can't find the container with id 
0d3d4ef47b23648595aa5b47d55e474cc0074c0ac58d91fb1061d1db5a607cd2 Apr 24 21:49:37.074371 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.074334 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" event={"ID":"59803781-86f5-4c54-91c2-18922795d983","Type":"ContainerStarted","Data":"d4df6428c0b3af2192ec3bff6978d8956f21067bd3a192561bf67e220b84bf08"} Apr 24 21:49:37.074371 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.074377 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" event={"ID":"59803781-86f5-4c54-91c2-18922795d983","Type":"ContainerStarted","Data":"e92c1e72ed774b6f090042efe8008ff858a194ab31425464c0fbb80b47a5b092"} Apr 24 21:49:37.074607 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.074387 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" event={"ID":"59803781-86f5-4c54-91c2-18922795d983","Type":"ContainerStarted","Data":"0d3d4ef47b23648595aa5b47d55e474cc0074c0ac58d91fb1061d1db5a607cd2"} Apr 24 21:49:37.074607 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.074470 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:37.076077 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.076048 2573 generic.go:358] "Generic (PLEG): container finished" podID="95c6330c-680a-4894-ab45-4566cb30ef16" containerID="c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b" exitCode=2 Apr 24 21:49:37.076210 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.076106 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" 
event={"ID":"95c6330c-680a-4894-ab45-4566cb30ef16","Type":"ContainerDied","Data":"c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b"} Apr 24 21:49:37.077539 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.077516 2573 generic.go:358] "Generic (PLEG): container finished" podID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerID="d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5" exitCode=2 Apr 24 21:49:37.077638 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.077579 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" event={"ID":"9a2cafa0-f418-40ed-a22a-f8abd1a4016b","Type":"ContainerDied","Data":"d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5"} Apr 24 21:49:37.091023 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.090947 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" podStartSLOduration=1.090930534 podStartE2EDuration="1.090930534s" podCreationTimestamp="2026-04-24 21:49:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:49:37.090307669 +0000 UTC m=+1362.370155449" watchObservedRunningTime="2026-04-24 21:49:37.090930534 +0000 UTC m=+1362.370778309" Apr 24 21:49:37.115867 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.115836 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-proxy-tls\") pod \"error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") " pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:37.118227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.118207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-proxy-tls\") pod \"error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") " pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:37.246623 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.246579 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:37.391163 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:37.391118 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx"] Apr 24 21:49:37.395632 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:49:37.395592 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ee0cfe_06f5_47f0_99f6_a77f434afb7c.slice/crio-fcba0c8add211164d12bdee750feb6485d3c54dc0af15ed082fc80d37cb5f113 WatchSource:0}: Error finding container fcba0c8add211164d12bdee750feb6485d3c54dc0af15ed082fc80d37cb5f113: Status 404 returned error can't find the container with id fcba0c8add211164d12bdee750feb6485d3c54dc0af15ed082fc80d37cb5f113 Apr 24 21:49:38.086001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:38.085961 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" event={"ID":"22ee0cfe-06f5-47f0-99f6-a77f434afb7c","Type":"ContainerStarted","Data":"ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5"} Apr 24 21:49:38.086202 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:38.086008 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" 
event={"ID":"22ee0cfe-06f5-47f0-99f6-a77f434afb7c","Type":"ContainerStarted","Data":"b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98"} Apr 24 21:49:38.086202 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:38.086023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" event={"ID":"22ee0cfe-06f5-47f0-99f6-a77f434afb7c","Type":"ContainerStarted","Data":"fcba0c8add211164d12bdee750feb6485d3c54dc0af15ed082fc80d37cb5f113"} Apr 24 21:49:38.086202 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:38.086142 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:38.087886 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:38.087114 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:49:38.087886 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:38.087756 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:38.087886 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:38.087789 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:38.088350 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:38.088320 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 21:49:38.105251 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:38.105201 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" podStartSLOduration=2.105186797 podStartE2EDuration="2.105186797s" podCreationTimestamp="2026-04-24 21:49:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:49:38.104709231 +0000 UTC m=+1363.384557012" watchObservedRunningTime="2026-04-24 21:49:38.105186797 +0000 UTC m=+1363.385034579" Apr 24 21:49:38.815926 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:38.815874 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.45:8643/healthz\": dial tcp 10.134.0.45:8643: connect: connection refused" Apr 24 21:49:39.090814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:39.090717 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 21:49:39.090814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:39.090775 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:49:39.819388 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:39.819344 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" 
podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.44:8643/healthz\": dial tcp 10.134.0.44:8643: connect: connection refused" Apr 24 21:49:40.001194 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.001171 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" Apr 24 21:49:40.022356 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.022331 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" Apr 24 21:49:40.095689 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.095595 2573 generic.go:358] "Generic (PLEG): container finished" podID="95c6330c-680a-4894-ab45-4566cb30ef16" containerID="14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745" exitCode=0 Apr 24 21:49:40.095689 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.095682 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" Apr 24 21:49:40.095905 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.095680 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" event={"ID":"95c6330c-680a-4894-ab45-4566cb30ef16","Type":"ContainerDied","Data":"14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745"} Apr 24 21:49:40.095905 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.095732 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7" event={"ID":"95c6330c-680a-4894-ab45-4566cb30ef16","Type":"ContainerDied","Data":"b0cea1e865f3220338586b49158b4a0e617990272d621693a3a4ed9ee5a276ac"} Apr 24 21:49:40.095905 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.095754 2573 scope.go:117] "RemoveContainer" containerID="c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b" Apr 24 21:49:40.097097 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.097075 2573 generic.go:358] "Generic (PLEG): container finished" podID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerID="9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f" exitCode=0 Apr 24 21:49:40.097206 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.097137 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" Apr 24 21:49:40.097206 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.097185 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" event={"ID":"9a2cafa0-f418-40ed-a22a-f8abd1a4016b","Type":"ContainerDied","Data":"9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f"} Apr 24 21:49:40.097313 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.097233 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb" event={"ID":"9a2cafa0-f418-40ed-a22a-f8abd1a4016b","Type":"ContainerDied","Data":"82be8e1f0a28d5fa13a5caf677ddd998a1408fb9c0ea0732df39a2c97acfdacd"} Apr 24 21:49:40.097712 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.097684 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 21:49:40.104114 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.104100 2573 scope.go:117] "RemoveContainer" containerID="14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745" Apr 24 21:49:40.111392 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.111372 2573 scope.go:117] "RemoveContainer" containerID="c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b" Apr 24 21:49:40.111626 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:49:40.111608 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b\": container with ID starting with c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b not found: ID does not exist" 
containerID="c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b" Apr 24 21:49:40.111672 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.111633 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b"} err="failed to get container status \"c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b\": rpc error: code = NotFound desc = could not find container \"c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b\": container with ID starting with c08c3bce28a8bac1b16489572cbf9908c1fbab9244b9dd3361a0df1d55fdfb9b not found: ID does not exist" Apr 24 21:49:40.111672 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.111648 2573 scope.go:117] "RemoveContainer" containerID="14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745" Apr 24 21:49:40.111878 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:49:40.111861 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745\": container with ID starting with 14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745 not found: ID does not exist" containerID="14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745" Apr 24 21:49:40.111929 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.111882 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745"} err="failed to get container status \"14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745\": rpc error: code = NotFound desc = could not find container \"14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745\": container with ID starting with 14933f0b49f4fa6760284786dd9a32047cf83270387317bd5917c1b73a154745 not found: ID does not exist" Apr 24 
21:49:40.111929 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.111895 2573 scope.go:117] "RemoveContainer" containerID="d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5" Apr 24 21:49:40.120866 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.120848 2573 scope.go:117] "RemoveContainer" containerID="9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f" Apr 24 21:49:40.127468 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.127449 2573 scope.go:117] "RemoveContainer" containerID="d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5" Apr 24 21:49:40.127725 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:49:40.127707 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5\": container with ID starting with d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5 not found: ID does not exist" containerID="d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5" Apr 24 21:49:40.127787 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.127732 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5"} err="failed to get container status \"d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5\": rpc error: code = NotFound desc = could not find container \"d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5\": container with ID starting with d8f3471eb4775c8f4c00e4a9bc538eb2610f4f96ac692043d5e48c5cae1750b5 not found: ID does not exist" Apr 24 21:49:40.127787 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.127750 2573 scope.go:117] "RemoveContainer" containerID="9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f" Apr 24 21:49:40.127981 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:49:40.127964 2573 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f\": container with ID starting with 9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f not found: ID does not exist" containerID="9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f" Apr 24 21:49:40.128036 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.127986 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f"} err="failed to get container status \"9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f\": rpc error: code = NotFound desc = could not find container \"9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f\": container with ID starting with 9bc2706c2a2d90dfda79eb1fed47f06374a7ba6680866f613ad5344267454a7f not found: ID does not exist" Apr 24 21:49:40.146417 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.146399 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8gwk\" (UniqueName: \"kubernetes.io/projected/95c6330c-680a-4894-ab45-4566cb30ef16-kube-api-access-w8gwk\") pod \"95c6330c-680a-4894-ab45-4566cb30ef16\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " Apr 24 21:49:40.146513 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.146450 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95c6330c-680a-4894-ab45-4566cb30ef16-proxy-tls\") pod \"95c6330c-680a-4894-ab45-4566cb30ef16\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " Apr 24 21:49:40.146513 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.146495 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-b9878-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-error-404-isvc-b9878-kube-rbac-proxy-sar-config\") pod \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\" (UID: \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\") " Apr 24 21:49:40.146593 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.146519 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-b9878-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95c6330c-680a-4894-ab45-4566cb30ef16-success-200-isvc-b9878-kube-rbac-proxy-sar-config\") pod \"95c6330c-680a-4894-ab45-4566cb30ef16\" (UID: \"95c6330c-680a-4894-ab45-4566cb30ef16\") " Apr 24 21:49:40.146593 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.146567 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-proxy-tls\") pod \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\" (UID: \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\") " Apr 24 21:49:40.146690 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.146597 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bcxw\" (UniqueName: \"kubernetes.io/projected/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-kube-api-access-4bcxw\") pod \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\" (UID: \"9a2cafa0-f418-40ed-a22a-f8abd1a4016b\") " Apr 24 21:49:40.146865 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.146843 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-error-404-isvc-b9878-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-b9878-kube-rbac-proxy-sar-config") pod "9a2cafa0-f418-40ed-a22a-f8abd1a4016b" (UID: "9a2cafa0-f418-40ed-a22a-f8abd1a4016b"). InnerVolumeSpecName "error-404-isvc-b9878-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:49:40.146953 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.146886 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95c6330c-680a-4894-ab45-4566cb30ef16-success-200-isvc-b9878-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-b9878-kube-rbac-proxy-sar-config") pod "95c6330c-680a-4894-ab45-4566cb30ef16" (UID: "95c6330c-680a-4894-ab45-4566cb30ef16"). InnerVolumeSpecName "success-200-isvc-b9878-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:49:40.148535 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.148510 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c6330c-680a-4894-ab45-4566cb30ef16-kube-api-access-w8gwk" (OuterVolumeSpecName: "kube-api-access-w8gwk") pod "95c6330c-680a-4894-ab45-4566cb30ef16" (UID: "95c6330c-680a-4894-ab45-4566cb30ef16"). InnerVolumeSpecName "kube-api-access-w8gwk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:49:40.148652 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.148632 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9a2cafa0-f418-40ed-a22a-f8abd1a4016b" (UID: "9a2cafa0-f418-40ed-a22a-f8abd1a4016b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:49:40.148711 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.148637 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-kube-api-access-4bcxw" (OuterVolumeSpecName: "kube-api-access-4bcxw") pod "9a2cafa0-f418-40ed-a22a-f8abd1a4016b" (UID: "9a2cafa0-f418-40ed-a22a-f8abd1a4016b"). InnerVolumeSpecName "kube-api-access-4bcxw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:49:40.148711 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.148679 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c6330c-680a-4894-ab45-4566cb30ef16-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "95c6330c-680a-4894-ab45-4566cb30ef16" (UID: "95c6330c-680a-4894-ab45-4566cb30ef16"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:49:40.247494 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.247459 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:49:40.247494 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.247488 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4bcxw\" (UniqueName: \"kubernetes.io/projected/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-kube-api-access-4bcxw\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:49:40.247494 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.247500 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8gwk\" (UniqueName: \"kubernetes.io/projected/95c6330c-680a-4894-ab45-4566cb30ef16-kube-api-access-w8gwk\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:49:40.247763 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.247514 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95c6330c-680a-4894-ab45-4566cb30ef16-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:49:40.247763 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.247528 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-b9878-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/9a2cafa0-f418-40ed-a22a-f8abd1a4016b-error-404-isvc-b9878-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:49:40.247763 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.247542 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-b9878-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95c6330c-680a-4894-ab45-4566cb30ef16-success-200-isvc-b9878-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:49:40.421344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.421313 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"] Apr 24 21:49:40.423977 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.423952 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7"] Apr 24 21:49:40.434363 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.434339 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"] Apr 24 21:49:40.435983 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:40.435962 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb"] Apr 24 21:49:41.368579 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:41.368546 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" path="/var/lib/kubelet/pods/95c6330c-680a-4894-ab45-4566cb30ef16/volumes" Apr 24 21:49:41.368985 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:41.368971 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" path="/var/lib/kubelet/pods/9a2cafa0-f418-40ed-a22a-f8abd1a4016b/volumes" Apr 24 21:49:43.937373 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 21:49:43.937332 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 21:49:44.095259 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:44.095228 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:49:44.095801 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:44.095776 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:49:44.944241 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:44.944196 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 21:49:45.102237 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:45.102208 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:49:45.102645 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:45.102616 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 21:49:53.937760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:53.937729 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:49:54.096388 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:54.096348 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:49:54.944905 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:54.944872 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:49:55.103512 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:49:55.103475 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 21:50:04.095868 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:50:04.095827 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:50:05.103488 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:50:05.103447 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 21:50:14.095936 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:50:14.095882 2573 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 21:50:15.103140 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:50:15.103088 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 21:50:24.096716 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:50:24.096686 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:50:25.104073 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:50:25.104046 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" Apr 24 21:51:55.348465 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:51:55.348432 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log" Apr 24 21:51:55.350798 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:51:55.350774 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log" Apr 24 21:56:55.377271 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:55.377240 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log" Apr 24 21:56:55.379962 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:55.379944 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log" Apr 24 21:58:21.231236 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.231201 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq"] Apr 24 21:58:21.231738 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.231609 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kserve-container" containerID="cri-o://57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061" gracePeriod=30 Apr 24 21:58:21.231738 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.231638 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kube-rbac-proxy" containerID="cri-o://dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05" gracePeriod=30 Apr 24 21:58:21.309421 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.309389 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx"] Apr 24 21:58:21.309758 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.309724 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kserve-container" containerID="cri-o://a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a" gracePeriod=30 Apr 24 21:58:21.309896 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.309796 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kube-rbac-proxy" containerID="cri-o://91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa" gracePeriod=30 Apr 24 21:58:21.331439 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331413 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv"] Apr 24 21:58:21.331852 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331836 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kube-rbac-proxy" Apr 24 21:58:21.331901 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331854 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kube-rbac-proxy" Apr 24 21:58:21.331901 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331867 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kserve-container" Apr 24 21:58:21.331901 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331873 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kserve-container" Apr 24 21:58:21.331901 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331879 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kserve-container" Apr 24 21:58:21.331901 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331884 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kserve-container" Apr 24 21:58:21.331901 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331891 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kube-rbac-proxy" Apr 24 21:58:21.331901 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331896 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kube-rbac-proxy" Apr 24 21:58:21.332146 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331966 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kube-rbac-proxy" Apr 24 21:58:21.332146 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331977 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kube-rbac-proxy" Apr 24 21:58:21.332146 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331987 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="95c6330c-680a-4894-ab45-4566cb30ef16" containerName="kserve-container" Apr 24 21:58:21.332146 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.331995 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a2cafa0-f418-40ed-a22a-f8abd1a4016b" containerName="kserve-container" Apr 24 21:58:21.335525 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.335503 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:21.338032 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.338003 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a4170-kube-rbac-proxy-sar-config\"" Apr 24 21:58:21.338473 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.338299 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a4170-predictor-serving-cert\"" Apr 24 21:58:21.356583 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.356559 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv"] Apr 24 21:58:21.406497 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.406455 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bmdj\" (UniqueName: \"kubernetes.io/projected/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-kube-api-access-5bmdj\") pod \"success-200-isvc-a4170-predictor-c76b89db5-hv7sv\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") " pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:21.406681 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.406512 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-a4170-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-success-200-isvc-a4170-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a4170-predictor-c76b89db5-hv7sv\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") " pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:21.406681 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.406610 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-proxy-tls\") pod \"success-200-isvc-a4170-predictor-c76b89db5-hv7sv\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") " pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:21.428888 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.428858 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg"] Apr 24 21:58:21.432436 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.432417 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:21.438443 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.438423 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a4170-predictor-serving-cert\"" Apr 24 21:58:21.438558 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.438453 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a4170-kube-rbac-proxy-sar-config\"" Apr 24 21:58:21.453154 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.453131 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg"] Apr 24 21:58:21.507359 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.507335 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-proxy-tls\") pod \"error-404-isvc-a4170-predictor-59dc6578db-4k6pg\" (UID: \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\") " pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:21.507533 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.507374 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5bmdj\" (UniqueName: \"kubernetes.io/projected/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-kube-api-access-5bmdj\") pod \"success-200-isvc-a4170-predictor-c76b89db5-hv7sv\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") " pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:21.507533 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.507397 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-a4170-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-success-200-isvc-a4170-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a4170-predictor-c76b89db5-hv7sv\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") " pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:21.507533 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.507451 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-a4170-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-error-404-isvc-a4170-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a4170-predictor-59dc6578db-4k6pg\" (UID: \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\") " pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:21.507703 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.507574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-proxy-tls\") pod \"success-200-isvc-a4170-predictor-c76b89db5-hv7sv\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") " pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:21.507703 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.507615 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdnph\" (UniqueName: \"kubernetes.io/projected/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-kube-api-access-kdnph\") pod \"error-404-isvc-a4170-predictor-59dc6578db-4k6pg\" (UID: \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\") " pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:21.507810 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:58:21.507716 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-a4170-predictor-serving-cert: secret "success-200-isvc-a4170-predictor-serving-cert" not found Apr 24 21:58:21.507810 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:58:21.507779 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-proxy-tls podName:8757ba4b-dc3a-4ef0-9c03-f465879a9ad7 nodeName:}" failed. No retries permitted until 2026-04-24 21:58:22.007758274 +0000 UTC m=+1887.287606051 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-proxy-tls") pod "success-200-isvc-a4170-predictor-c76b89db5-hv7sv" (UID: "8757ba4b-dc3a-4ef0-9c03-f465879a9ad7") : secret "success-200-isvc-a4170-predictor-serving-cert" not found Apr 24 21:58:21.508179 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.508161 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-a4170-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-success-200-isvc-a4170-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a4170-predictor-c76b89db5-hv7sv\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") " pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:21.529094 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.529066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bmdj\" (UniqueName: \"kubernetes.io/projected/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-kube-api-access-5bmdj\") pod \"success-200-isvc-a4170-predictor-c76b89db5-hv7sv\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") " pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:21.608727 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.608695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-a4170-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-error-404-isvc-a4170-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a4170-predictor-59dc6578db-4k6pg\" (UID: \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\") " pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:21.608896 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.608777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdnph\" 
(UniqueName: \"kubernetes.io/projected/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-kube-api-access-kdnph\") pod \"error-404-isvc-a4170-predictor-59dc6578db-4k6pg\" (UID: \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\") " pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:21.608896 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.608827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-proxy-tls\") pod \"error-404-isvc-a4170-predictor-59dc6578db-4k6pg\" (UID: \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\") " pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:21.609454 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.609432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-a4170-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-error-404-isvc-a4170-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a4170-predictor-59dc6578db-4k6pg\" (UID: \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\") " pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:21.611320 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.611296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-proxy-tls\") pod \"error-404-isvc-a4170-predictor-59dc6578db-4k6pg\" (UID: \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\") " pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:21.621879 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.621850 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdnph\" (UniqueName: \"kubernetes.io/projected/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-kube-api-access-kdnph\") pod 
\"error-404-isvc-a4170-predictor-59dc6578db-4k6pg\" (UID: \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\") " pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:21.744136 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.744099 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:21.873195 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.873155 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg"] Apr 24 21:58:21.875495 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:58:21.875462 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cfe76c8_2cae_4af0_baca_9c7e7c62c3a2.slice/crio-3d889748a96b551764d8eca1e518ea4c357000db2203a17a551e204e0dc8fe18 WatchSource:0}: Error finding container 3d889748a96b551764d8eca1e518ea4c357000db2203a17a551e204e0dc8fe18: Status 404 returned error can't find the container with id 3d889748a96b551764d8eca1e518ea4c357000db2203a17a551e204e0dc8fe18 Apr 24 21:58:21.877154 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.877140 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:58:21.889526 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.889494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" event={"ID":"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2","Type":"ContainerStarted","Data":"3d889748a96b551764d8eca1e518ea4c357000db2203a17a551e204e0dc8fe18"} Apr 24 21:58:21.891245 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.891224 2573 generic.go:358] "Generic (PLEG): container finished" podID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerID="dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05" exitCode=2 Apr 
24 21:58:21.891344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.891292 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" event={"ID":"eda051e5-7bbf-41a9-ad1b-c9d1e5368798","Type":"ContainerDied","Data":"dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05"} Apr 24 21:58:21.892880 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.892858 2573 generic.go:358] "Generic (PLEG): container finished" podID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerID="91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa" exitCode=2 Apr 24 21:58:21.893003 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:21.892901 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" event={"ID":"ea3669b0-a0cd-418f-9778-36c1b7855246","Type":"ContainerDied","Data":"91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa"} Apr 24 21:58:22.012994 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.012968 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-proxy-tls\") pod \"success-200-isvc-a4170-predictor-c76b89db5-hv7sv\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") " pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:22.015136 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.015106 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-proxy-tls\") pod \"success-200-isvc-a4170-predictor-c76b89db5-hv7sv\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") " pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:22.248872 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.248774 2573 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:22.378314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.378228 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv"] Apr 24 21:58:22.381083 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:58:22.381045 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8757ba4b_dc3a_4ef0_9c03_f465879a9ad7.slice/crio-44dc951a9ba27b6e3dd93e1d6d0f3d53f8f39b3efee221780b9bd80ec87d99ad WatchSource:0}: Error finding container 44dc951a9ba27b6e3dd93e1d6d0f3d53f8f39b3efee221780b9bd80ec87d99ad: Status 404 returned error can't find the container with id 44dc951a9ba27b6e3dd93e1d6d0f3d53f8f39b3efee221780b9bd80ec87d99ad Apr 24 21:58:22.898280 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.898241 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" event={"ID":"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2","Type":"ContainerStarted","Data":"e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9"} Apr 24 21:58:22.898479 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.898287 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" event={"ID":"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2","Type":"ContainerStarted","Data":"56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c"} Apr 24 21:58:22.898479 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.898380 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:22.899750 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.899726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" event={"ID":"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7","Type":"ContainerStarted","Data":"735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92"} Apr 24 21:58:22.899863 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.899752 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" event={"ID":"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7","Type":"ContainerStarted","Data":"33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d"} Apr 24 21:58:22.899863 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.899763 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" event={"ID":"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7","Type":"ContainerStarted","Data":"44dc951a9ba27b6e3dd93e1d6d0f3d53f8f39b3efee221780b9bd80ec87d99ad"} Apr 24 21:58:22.899976 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.899876 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:22.922211 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.922163 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" podStartSLOduration=1.922147522 podStartE2EDuration="1.922147522s" podCreationTimestamp="2026-04-24 21:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:58:22.919884103 +0000 UTC m=+1888.199731897" watchObservedRunningTime="2026-04-24 21:58:22.922147522 +0000 UTC m=+1888.201995303" Apr 24 21:58:22.939533 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:22.939493 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" podStartSLOduration=1.939481698 podStartE2EDuration="1.939481698s" podCreationTimestamp="2026-04-24 21:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:58:22.938214069 +0000 UTC m=+1888.218061850" watchObservedRunningTime="2026-04-24 21:58:22.939481698 +0000 UTC m=+1888.219329530" Apr 24 21:58:23.903824 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:23.903793 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:23.903824 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:23.903828 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:23.904969 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:23.904940 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:58:23.905080 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:23.904966 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 21:58:23.933182 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:23.933156 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kube-rbac-proxy" probeResult="failure" 
output="Get \"https://10.134.0.47:8643/healthz\": dial tcp 10.134.0.47:8643: connect: connection refused" Apr 24 21:58:23.937427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:23.937404 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 21:58:24.683077 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.683056 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:58:24.735697 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.735672 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-proxy-tls\") pod \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " Apr 24 21:58:24.735857 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.735762 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-1dabb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-success-200-isvc-1dabb-kube-rbac-proxy-sar-config\") pod \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " Apr 24 21:58:24.735963 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.735940 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7drvt\" (UniqueName: \"kubernetes.io/projected/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-kube-api-access-7drvt\") pod \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\" (UID: \"eda051e5-7bbf-41a9-ad1b-c9d1e5368798\") " Apr 24 21:58:24.736089 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.736067 2573 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-success-200-isvc-1dabb-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-1dabb-kube-rbac-proxy-sar-config") pod "eda051e5-7bbf-41a9-ad1b-c9d1e5368798" (UID: "eda051e5-7bbf-41a9-ad1b-c9d1e5368798"). InnerVolumeSpecName "success-200-isvc-1dabb-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:58:24.736305 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.736249 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-1dabb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-success-200-isvc-1dabb-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:58:24.737862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.737834 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "eda051e5-7bbf-41a9-ad1b-c9d1e5368798" (UID: "eda051e5-7bbf-41a9-ad1b-c9d1e5368798"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:58:24.737862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.737839 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-kube-api-access-7drvt" (OuterVolumeSpecName: "kube-api-access-7drvt") pod "eda051e5-7bbf-41a9-ad1b-c9d1e5368798" (UID: "eda051e5-7bbf-41a9-ad1b-c9d1e5368798"). InnerVolumeSpecName "kube-api-access-7drvt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:58:24.751389 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.751371 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:58:24.836509 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.836478 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea3669b0-a0cd-418f-9778-36c1b7855246-proxy-tls\") pod \"ea3669b0-a0cd-418f-9778-36c1b7855246\" (UID: \"ea3669b0-a0cd-418f-9778-36c1b7855246\") " Apr 24 21:58:24.836692 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.836550 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ggpj\" (UniqueName: \"kubernetes.io/projected/ea3669b0-a0cd-418f-9778-36c1b7855246-kube-api-access-2ggpj\") pod \"ea3669b0-a0cd-418f-9778-36c1b7855246\" (UID: \"ea3669b0-a0cd-418f-9778-36c1b7855246\") " Apr 24 21:58:24.836692 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.836598 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-1dabb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea3669b0-a0cd-418f-9778-36c1b7855246-error-404-isvc-1dabb-kube-rbac-proxy-sar-config\") pod \"ea3669b0-a0cd-418f-9778-36c1b7855246\" (UID: \"ea3669b0-a0cd-418f-9778-36c1b7855246\") " Apr 24 21:58:24.836813 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.836739 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:58:24.836813 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.836750 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7drvt\" (UniqueName: \"kubernetes.io/projected/eda051e5-7bbf-41a9-ad1b-c9d1e5368798-kube-api-access-7drvt\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:58:24.837071 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.837034 2573 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea3669b0-a0cd-418f-9778-36c1b7855246-error-404-isvc-1dabb-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-1dabb-kube-rbac-proxy-sar-config") pod "ea3669b0-a0cd-418f-9778-36c1b7855246" (UID: "ea3669b0-a0cd-418f-9778-36c1b7855246"). InnerVolumeSpecName "error-404-isvc-1dabb-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:58:24.838567 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.838544 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3669b0-a0cd-418f-9778-36c1b7855246-kube-api-access-2ggpj" (OuterVolumeSpecName: "kube-api-access-2ggpj") pod "ea3669b0-a0cd-418f-9778-36c1b7855246" (UID: "ea3669b0-a0cd-418f-9778-36c1b7855246"). InnerVolumeSpecName "kube-api-access-2ggpj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:58:24.838662 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.838584 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3669b0-a0cd-418f-9778-36c1b7855246-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ea3669b0-a0cd-418f-9778-36c1b7855246" (UID: "ea3669b0-a0cd-418f-9778-36c1b7855246"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:58:24.909099 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.909060 2573 generic.go:358] "Generic (PLEG): container finished" podID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerID="57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061" exitCode=0 Apr 24 21:58:24.909539 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.909141 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" event={"ID":"eda051e5-7bbf-41a9-ad1b-c9d1e5368798","Type":"ContainerDied","Data":"57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061"} Apr 24 21:58:24.909539 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.909170 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" event={"ID":"eda051e5-7bbf-41a9-ad1b-c9d1e5368798","Type":"ContainerDied","Data":"da265a018bb05713c62f68c7b55819c398079c2c64b9c03cee0173a92d191e2b"} Apr 24 21:58:24.909539 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.909181 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq" Apr 24 21:58:24.909539 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.909189 2573 scope.go:117] "RemoveContainer" containerID="dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05" Apr 24 21:58:24.910783 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.910757 2573 generic.go:358] "Generic (PLEG): container finished" podID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerID="a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a" exitCode=0 Apr 24 21:58:24.910909 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.910854 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" event={"ID":"ea3669b0-a0cd-418f-9778-36c1b7855246","Type":"ContainerDied","Data":"a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a"} Apr 24 21:58:24.910909 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.910897 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" Apr 24 21:58:24.911096 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.910910 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx" event={"ID":"ea3669b0-a0cd-418f-9778-36c1b7855246","Type":"ContainerDied","Data":"59c862b99d7f33ae99fd4ba63f80ded169c83045be5223daccc662c11e51b88c"} Apr 24 21:58:24.911635 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.911602 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:58:24.911753 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.911721 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 21:58:24.917908 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.917883 2573 scope.go:117] "RemoveContainer" containerID="57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061" Apr 24 21:58:24.925778 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.925760 2573 scope.go:117] "RemoveContainer" containerID="dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05" Apr 24 21:58:24.926071 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:58:24.926051 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05\": container with ID starting with dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05 not found: ID 
does not exist" containerID="dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05" Apr 24 21:58:24.926148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.926078 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05"} err="failed to get container status \"dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05\": rpc error: code = NotFound desc = could not find container \"dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05\": container with ID starting with dddea1daded3110cd27afb44966b7616d8274f6dc0d44e4f59d0b44832bb4b05 not found: ID does not exist" Apr 24 21:58:24.926148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.926094 2573 scope.go:117] "RemoveContainer" containerID="57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061" Apr 24 21:58:24.926384 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:58:24.926363 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061\": container with ID starting with 57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061 not found: ID does not exist" containerID="57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061" Apr 24 21:58:24.926453 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.926390 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061"} err="failed to get container status \"57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061\": rpc error: code = NotFound desc = could not find container \"57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061\": container with ID starting with 57c6da30729f3b3f58baec90111717758f23c2c1205fb4b0fdb356e007588061 not found: ID does not 
exist" Apr 24 21:58:24.926453 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.926405 2573 scope.go:117] "RemoveContainer" containerID="91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa" Apr 24 21:58:24.933565 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.933551 2573 scope.go:117] "RemoveContainer" containerID="a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a" Apr 24 21:58:24.938139 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.938119 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq"] Apr 24 21:58:24.938330 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.938307 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-1dabb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea3669b0-a0cd-418f-9778-36c1b7855246-error-404-isvc-1dabb-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:58:24.938330 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.938330 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea3669b0-a0cd-418f-9778-36c1b7855246-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:58:24.938503 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.938340 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ggpj\" (UniqueName: \"kubernetes.io/projected/ea3669b0-a0cd-418f-9778-36c1b7855246-kube-api-access-2ggpj\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:58:24.941884 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.941865 2573 scope.go:117] "RemoveContainer" containerID="91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa" Apr 24 21:58:24.942197 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:58:24.942179 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa\": container with ID starting with 91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa not found: ID does not exist" containerID="91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa" Apr 24 21:58:24.942271 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.942204 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa"} err="failed to get container status \"91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa\": rpc error: code = NotFound desc = could not find container \"91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa\": container with ID starting with 91d13117852ae7da586c52b7c7cc577134b23a8d52e31d3173174beb11a5ecaa not found: ID does not exist" Apr 24 21:58:24.942271 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.942221 2573 scope.go:117] "RemoveContainer" containerID="a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a" Apr 24 21:58:24.942472 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:58:24.942456 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a\": container with ID starting with a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a not found: ID does not exist" containerID="a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a" Apr 24 21:58:24.942517 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.942477 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a"} err="failed to get container status \"a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a\": rpc error: code = NotFound desc = could not find 
container \"a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a\": container with ID starting with a238c5531958e18130383d9acd712d7dad53d8c804445152ac34292d7cce140a not found: ID does not exist" Apr 24 21:58:24.943558 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.943540 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq"] Apr 24 21:58:24.956582 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.956559 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx"] Apr 24 21:58:24.959170 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:24.959150 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx"] Apr 24 21:58:25.369817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:25.369782 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" path="/var/lib/kubelet/pods/ea3669b0-a0cd-418f-9778-36c1b7855246/volumes" Apr 24 21:58:25.370250 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:25.370235 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" path="/var/lib/kubelet/pods/eda051e5-7bbf-41a9-ad1b-c9d1e5368798/volumes" Apr 24 21:58:29.915365 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:29.915336 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:58:29.915935 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:29.915392 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:58:29.915935 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:29.915793 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 21:58:29.916079 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:29.916024 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:58:39.916024 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:39.915959 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 21:58:39.916024 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:39.916019 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:58:49.915789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:49.915745 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 21:58:49.916291 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:49.916023 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" 
podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:58:51.114250 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.114214 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms"] Apr 24 21:58:51.114648 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.114461 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kserve-container" containerID="cri-o://e92c1e72ed774b6f090042efe8008ff858a194ab31425464c0fbb80b47a5b092" gracePeriod=30 Apr 24 21:58:51.114648 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.114486 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kube-rbac-proxy" containerID="cri-o://d4df6428c0b3af2192ec3bff6978d8956f21067bd3a192561bf67e220b84bf08" gracePeriod=30 Apr 24 21:58:51.192645 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.192612 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"] Apr 24 21:58:51.193250 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.193214 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kserve-container" Apr 24 21:58:51.193250 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.193237 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kserve-container" Apr 24 21:58:51.193455 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.193293 2573 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kube-rbac-proxy"
Apr 24 21:58:51.193455 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.193303 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kube-rbac-proxy"
Apr 24 21:58:51.193455 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.193319 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kserve-container"
Apr 24 21:58:51.193455 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.193328 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kserve-container"
Apr 24 21:58:51.193455 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.193342 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kube-rbac-proxy"
Apr 24 21:58:51.193455 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.193350 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kube-rbac-proxy"
Apr 24 21:58:51.193455 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.193427 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kube-rbac-proxy"
Apr 24 21:58:51.193455 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.193443 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kube-rbac-proxy"
Apr 24 21:58:51.193455 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.193454 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="eda051e5-7bbf-41a9-ad1b-c9d1e5368798" containerName="kserve-container"
Apr 24 21:58:51.193911 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.193466 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea3669b0-a0cd-418f-9778-36c1b7855246" containerName="kserve-container"
Apr 24 21:58:51.198681 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.198657 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:51.201234 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.201208 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f4deb-predictor-serving-cert\""
Apr 24 21:58:51.204029 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.204009 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f4deb-kube-rbac-proxy-sar-config\""
Apr 24 21:58:51.211569 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.211546 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"]
Apr 24 21:58:51.225240 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.225064 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx"]
Apr 24 21:58:51.225474 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.225431 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kserve-container" containerID="cri-o://b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98" gracePeriod=30
Apr 24 21:58:51.225575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.225471 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kube-rbac-proxy" containerID="cri-o://ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5" gracePeriod=30
Apr 24 21:58:51.331791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.331758 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"]
Apr 24 21:58:51.335443 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.335421 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:51.337953 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.337912 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f4deb-predictor-serving-cert\""
Apr 24 21:58:51.338068 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.338015 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f4deb-kube-rbac-proxy-sar-config\""
Apr 24 21:58:51.349140 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.349116 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"]
Apr 24 21:58:51.372192 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.372127 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-f4deb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ab9fee25-cf0b-4243-9b8d-702558af3735-success-200-isvc-f4deb-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:51.372192 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.372158 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ab9fee25-cf0b-4243-9b8d-702558af3735-proxy-tls\") pod \"success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:51.372192 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.372178 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr5m7\" (UniqueName: \"kubernetes.io/projected/ab9fee25-cf0b-4243-9b8d-702558af3735-kube-api-access-vr5m7\") pod \"success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:51.473408 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.473369 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-f4deb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ab9fee25-cf0b-4243-9b8d-702558af3735-success-200-isvc-f4deb-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:51.473408 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.473408 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ab9fee25-cf0b-4243-9b8d-702558af3735-proxy-tls\") pod \"success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:51.473673 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.473433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr5m7\" (UniqueName: \"kubernetes.io/projected/ab9fee25-cf0b-4243-9b8d-702558af3735-kube-api-access-vr5m7\") pod \"success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:51.473673 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.473522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-proxy-tls\") pod \"error-404-isvc-f4deb-predictor-c578cbf66-dkmz6\" (UID: \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\") " pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:51.473673 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:58:51.473540 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-serving-cert: secret "success-200-isvc-f4deb-predictor-serving-cert" not found
Apr 24 21:58:51.473673 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:58:51.473605 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab9fee25-cf0b-4243-9b8d-702558af3735-proxy-tls podName:ab9fee25-cf0b-4243-9b8d-702558af3735 nodeName:}" failed. No retries permitted until 2026-04-24 21:58:51.973586462 +0000 UTC m=+1917.253434232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ab9fee25-cf0b-4243-9b8d-702558af3735-proxy-tls") pod "success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" (UID: "ab9fee25-cf0b-4243-9b8d-702558af3735") : secret "success-200-isvc-f4deb-predictor-serving-cert" not found
Apr 24 21:58:51.473896 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.473721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnrrz\" (UniqueName: \"kubernetes.io/projected/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-kube-api-access-fnrrz\") pod \"error-404-isvc-f4deb-predictor-c578cbf66-dkmz6\" (UID: \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\") " pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:51.473896 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.473776 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-f4deb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-error-404-isvc-f4deb-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f4deb-predictor-c578cbf66-dkmz6\" (UID: \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\") " pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:51.474185 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.474161 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-f4deb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ab9fee25-cf0b-4243-9b8d-702558af3735-success-200-isvc-f4deb-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:51.490570 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.490537 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr5m7\" (UniqueName: \"kubernetes.io/projected/ab9fee25-cf0b-4243-9b8d-702558af3735-kube-api-access-vr5m7\") pod \"success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:51.574858 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.574818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-f4deb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-error-404-isvc-f4deb-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f4deb-predictor-c578cbf66-dkmz6\" (UID: \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\") " pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:51.575048 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.574981 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-proxy-tls\") pod \"error-404-isvc-f4deb-predictor-c578cbf66-dkmz6\" (UID: \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\") " pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:51.575048 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.575041 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnrrz\" (UniqueName: \"kubernetes.io/projected/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-kube-api-access-fnrrz\") pod \"error-404-isvc-f4deb-predictor-c578cbf66-dkmz6\" (UID: \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\") " pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:51.575617 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.575589 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-f4deb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-error-404-isvc-f4deb-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f4deb-predictor-c578cbf66-dkmz6\" (UID: \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\") " pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:51.578540 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.578519 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-proxy-tls\") pod \"error-404-isvc-f4deb-predictor-c578cbf66-dkmz6\" (UID: \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\") " pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:51.589653 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.587905 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnrrz\" (UniqueName: \"kubernetes.io/projected/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-kube-api-access-fnrrz\") pod \"error-404-isvc-f4deb-predictor-c578cbf66-dkmz6\" (UID: \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\") " pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:51.652829 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.652741 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:51.977572 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.977485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ab9fee25-cf0b-4243-9b8d-702558af3735-proxy-tls\") pod \"success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:51.980244 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.980202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ab9fee25-cf0b-4243-9b8d-702558af3735-proxy-tls\") pod \"success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:51.990075 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:51.990051 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"]
Apr 24 21:58:51.992754 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:58:51.992725 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ad1d5e7_a76a_4378_b25d_eaeeea0f959f.slice/crio-3093ce63d0f35fd09e61a70fb3f19b3f54f3360dd4d291c90ab0a03e5f152b7f WatchSource:0}: Error finding container 3093ce63d0f35fd09e61a70fb3f19b3f54f3360dd4d291c90ab0a03e5f152b7f: Status 404 returned error can't find the container with id 3093ce63d0f35fd09e61a70fb3f19b3f54f3360dd4d291c90ab0a03e5f152b7f
Apr 24 21:58:52.023352 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:52.023314 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" event={"ID":"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f","Type":"ContainerStarted","Data":"3093ce63d0f35fd09e61a70fb3f19b3f54f3360dd4d291c90ab0a03e5f152b7f"}
Apr 24 21:58:52.025246 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:52.025219 2573 generic.go:358] "Generic (PLEG): container finished" podID="59803781-86f5-4c54-91c2-18922795d983" containerID="d4df6428c0b3af2192ec3bff6978d8956f21067bd3a192561bf67e220b84bf08" exitCode=2
Apr 24 21:58:52.025359 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:52.025287 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" event={"ID":"59803781-86f5-4c54-91c2-18922795d983","Type":"ContainerDied","Data":"d4df6428c0b3af2192ec3bff6978d8956f21067bd3a192561bf67e220b84bf08"}
Apr 24 21:58:52.027067 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:52.027048 2573 generic.go:358] "Generic (PLEG): container finished" podID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerID="ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5" exitCode=2
Apr 24 21:58:52.027174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:52.027080 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" event={"ID":"22ee0cfe-06f5-47f0-99f6-a77f434afb7c","Type":"ContainerDied","Data":"ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5"}
Apr 24 21:58:52.110773 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:52.110749 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:52.246412 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:52.246387 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"]
Apr 24 21:58:52.249095 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:58:52.249049 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab9fee25_cf0b_4243_9b8d_702558af3735.slice/crio-7af88b818194d6f92acf677d2faffc4f9e311980426cf2ca427e6bae49490e3f WatchSource:0}: Error finding container 7af88b818194d6f92acf677d2faffc4f9e311980426cf2ca427e6bae49490e3f: Status 404 returned error can't find the container with id 7af88b818194d6f92acf677d2faffc4f9e311980426cf2ca427e6bae49490e3f
Apr 24 21:58:53.032092 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:53.032054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" event={"ID":"ab9fee25-cf0b-4243-9b8d-702558af3735","Type":"ContainerStarted","Data":"2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b"}
Apr 24 21:58:53.032092 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:53.032091 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" event={"ID":"ab9fee25-cf0b-4243-9b8d-702558af3735","Type":"ContainerStarted","Data":"f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8"}
Apr 24 21:58:53.032092 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:53.032102 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" event={"ID":"ab9fee25-cf0b-4243-9b8d-702558af3735","Type":"ContainerStarted","Data":"7af88b818194d6f92acf677d2faffc4f9e311980426cf2ca427e6bae49490e3f"}
Apr 24 21:58:53.032379 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:53.032203 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:53.033722 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:53.033698 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" event={"ID":"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f","Type":"ContainerStarted","Data":"9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3"}
Apr 24 21:58:53.033833 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:53.033726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" event={"ID":"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f","Type":"ContainerStarted","Data":"0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9"}
Apr 24 21:58:53.033872 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:53.033853 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:53.033872 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:53.033865 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:58:53.035161 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:53.035139 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 24 21:58:53.058523 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:53.058478 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" podStartSLOduration=2.058465773 podStartE2EDuration="2.058465773s" podCreationTimestamp="2026-04-24 21:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:58:53.056080007 +0000 UTC m=+1918.335927803" watchObservedRunningTime="2026-04-24 21:58:53.058465773 +0000 UTC m=+1918.338313553"
Apr 24 21:58:53.082846 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:53.082788 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" podStartSLOduration=2.082772193 podStartE2EDuration="2.082772193s" podCreationTimestamp="2026-04-24 21:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:58:53.079802834 +0000 UTC m=+1918.359650617" watchObservedRunningTime="2026-04-24 21:58:53.082772193 +0000 UTC m=+1918.362619976"
Apr 24 21:58:54.038616 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:54.038585 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:58:54.039125 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:54.038572 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 24 21:58:54.039797 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:54.039771 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Apr 24 21:58:54.091550 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:54.091510 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.48:8643/healthz\": dial tcp 10.134.0.48:8643: connect: connection refused"
Apr 24 21:58:54.095932 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:54.095895 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 24 21:58:54.785692 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:54.785660 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx"
Apr 24 21:58:54.904500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:54.904463 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvdqc\" (UniqueName: \"kubernetes.io/projected/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-kube-api-access-dvdqc\") pod \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") "
Apr 24 21:58:54.904634 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:54.904527 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-1ec3f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-error-404-isvc-1ec3f-kube-rbac-proxy-sar-config\") pod \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") "
Apr 24 21:58:54.904634 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:54.904558 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-proxy-tls\") pod \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\" (UID: \"22ee0cfe-06f5-47f0-99f6-a77f434afb7c\") "
Apr 24 21:58:54.904944 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:54.904899 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-error-404-isvc-1ec3f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-1ec3f-kube-rbac-proxy-sar-config") pod "22ee0cfe-06f5-47f0-99f6-a77f434afb7c" (UID: "22ee0cfe-06f5-47f0-99f6-a77f434afb7c"). InnerVolumeSpecName "error-404-isvc-1ec3f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:58:54.906560 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:54.906538 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "22ee0cfe-06f5-47f0-99f6-a77f434afb7c" (UID: "22ee0cfe-06f5-47f0-99f6-a77f434afb7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:58:54.906621 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:54.906590 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-kube-api-access-dvdqc" (OuterVolumeSpecName: "kube-api-access-dvdqc") pod "22ee0cfe-06f5-47f0-99f6-a77f434afb7c" (UID: "22ee0cfe-06f5-47f0-99f6-a77f434afb7c"). InnerVolumeSpecName "kube-api-access-dvdqc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:58:55.005715 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.005684 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dvdqc\" (UniqueName: \"kubernetes.io/projected/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-kube-api-access-dvdqc\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:58:55.005715 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.005715 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-1ec3f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-error-404-isvc-1ec3f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:58:55.005991 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.005728 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22ee0cfe-06f5-47f0-99f6-a77f434afb7c-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:58:55.043308 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.043274 2573 generic.go:358] "Generic (PLEG): container finished" podID="59803781-86f5-4c54-91c2-18922795d983" containerID="e92c1e72ed774b6f090042efe8008ff858a194ab31425464c0fbb80b47a5b092" exitCode=0
Apr 24 21:58:55.043650 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.043322 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" event={"ID":"59803781-86f5-4c54-91c2-18922795d983","Type":"ContainerDied","Data":"e92c1e72ed774b6f090042efe8008ff858a194ab31425464c0fbb80b47a5b092"}
Apr 24 21:58:55.044812 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.044791 2573 generic.go:358] "Generic (PLEG): container finished" podID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerID="b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98" exitCode=0
Apr 24 21:58:55.044946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.044863 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" event={"ID":"22ee0cfe-06f5-47f0-99f6-a77f434afb7c","Type":"ContainerDied","Data":"b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98"}
Apr 24 21:58:55.044946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.044874 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx"
Apr 24 21:58:55.044946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.044899 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx" event={"ID":"22ee0cfe-06f5-47f0-99f6-a77f434afb7c","Type":"ContainerDied","Data":"fcba0c8add211164d12bdee750feb6485d3c54dc0af15ed082fc80d37cb5f113"}
Apr 24 21:58:55.045098 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.044946 2573 scope.go:117] "RemoveContainer" containerID="ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5"
Apr 24 21:58:55.045634 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.045609 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Apr 24 21:58:55.053841 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.053824 2573 scope.go:117] "RemoveContainer" containerID="b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98"
Apr 24 21:58:55.061329 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.061311 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms"
Apr 24 21:58:55.062538 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.062524 2573 scope.go:117] "RemoveContainer" containerID="ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5"
Apr 24 21:58:55.062790 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:58:55.062772 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5\": container with ID starting with ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5 not found: ID does not exist" containerID="ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5"
Apr 24 21:58:55.062836 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.062799 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5"} err="failed to get container status \"ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5\": rpc error: code = NotFound desc = could not find container \"ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5\": container with ID starting with ddf74cd4f7881d546004c7512472f0b7ea700f2202ce25100cc2f8abe6a691b5 not found: ID does not exist"
Apr 24 21:58:55.062836 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.062817 2573 scope.go:117] "RemoveContainer" containerID="b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98"
Apr 24 21:58:55.063104 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:58:55.063086 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98\": container with ID starting with b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98 not found: ID does not exist" containerID="b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98"
Apr 24 21:58:55.063169 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.063115 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98"} err="failed to get container status \"b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98\": rpc error: code = NotFound desc = could not find container \"b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98\": container with ID starting with b155453d7693cf288e4d6cc1fcc07fe41669dc63688ec0142d2b9f94a8ba5e98 not found: ID does not exist"
Apr 24 21:58:55.079265 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.079239 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx"]
Apr 24 21:58:55.085113 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.085092 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx"]
Apr 24 21:58:55.206994 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.206898 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-1ec3f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59803781-86f5-4c54-91c2-18922795d983-success-200-isvc-1ec3f-kube-rbac-proxy-sar-config\") pod \"59803781-86f5-4c54-91c2-18922795d983\" (UID: \"59803781-86f5-4c54-91c2-18922795d983\") "
Apr 24 21:58:55.207207 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.206994 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4csml\" (UniqueName: \"kubernetes.io/projected/59803781-86f5-4c54-91c2-18922795d983-kube-api-access-4csml\") pod \"59803781-86f5-4c54-91c2-18922795d983\" (UID: \"59803781-86f5-4c54-91c2-18922795d983\") "
Apr 24 21:58:55.207207 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.207041 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59803781-86f5-4c54-91c2-18922795d983-proxy-tls\") pod \"59803781-86f5-4c54-91c2-18922795d983\" (UID: \"59803781-86f5-4c54-91c2-18922795d983\") "
Apr 24 21:58:55.207331 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.207284 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59803781-86f5-4c54-91c2-18922795d983-success-200-isvc-1ec3f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-1ec3f-kube-rbac-proxy-sar-config") pod "59803781-86f5-4c54-91c2-18922795d983" (UID: "59803781-86f5-4c54-91c2-18922795d983"). InnerVolumeSpecName "success-200-isvc-1ec3f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:58:55.207411 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.207393 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-1ec3f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59803781-86f5-4c54-91c2-18922795d983-success-200-isvc-1ec3f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:58:55.209174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.209148 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59803781-86f5-4c54-91c2-18922795d983-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "59803781-86f5-4c54-91c2-18922795d983" (UID: "59803781-86f5-4c54-91c2-18922795d983"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:58:55.209265 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.209180 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59803781-86f5-4c54-91c2-18922795d983-kube-api-access-4csml" (OuterVolumeSpecName: "kube-api-access-4csml") pod "59803781-86f5-4c54-91c2-18922795d983" (UID: "59803781-86f5-4c54-91c2-18922795d983"). InnerVolumeSpecName "kube-api-access-4csml". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:58:55.308317 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.308286 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4csml\" (UniqueName: \"kubernetes.io/projected/59803781-86f5-4c54-91c2-18922795d983-kube-api-access-4csml\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:58:55.308317 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.308312 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59803781-86f5-4c54-91c2-18922795d983-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:58:55.369105 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:55.369073 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" path="/var/lib/kubelet/pods/22ee0cfe-06f5-47f0-99f6-a77f434afb7c/volumes"
Apr 24 21:58:56.050143 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:56.050106 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" event={"ID":"59803781-86f5-4c54-91c2-18922795d983","Type":"ContainerDied","Data":"0d3d4ef47b23648595aa5b47d55e474cc0074c0ac58d91fb1061d1db5a607cd2"}
Apr 24 21:58:56.050143 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:56.050126 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms" Apr 24 21:58:56.050613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:56.050156 2573 scope.go:117] "RemoveContainer" containerID="d4df6428c0b3af2192ec3bff6978d8956f21067bd3a192561bf67e220b84bf08" Apr 24 21:58:56.058109 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:56.058092 2573 scope.go:117] "RemoveContainer" containerID="e92c1e72ed774b6f090042efe8008ff858a194ab31425464c0fbb80b47a5b092" Apr 24 21:58:56.072732 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:56.072707 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms"] Apr 24 21:58:56.076839 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:56.076816 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms"] Apr 24 21:58:57.368265 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:57.368185 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59803781-86f5-4c54-91c2-18922795d983" path="/var/lib/kubelet/pods/59803781-86f5-4c54-91c2-18922795d983/volumes" Apr 24 21:58:59.042138 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:59.042105 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" Apr 24 21:58:59.042653 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:59.042630 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 24 21:58:59.916354 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:59.916318 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 21:58:59.916552 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:58:59.916317 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 21:59:00.049602 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:00.049570 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" Apr 24 21:59:00.050123 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:00.050096 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 21:59:09.042808 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:09.042764 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 24 21:59:09.917069 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:09.917040 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" Apr 24 21:59:09.917249 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:09.917091 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" Apr 24 21:59:10.050108 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:10.050057 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 21:59:19.043195 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:19.043153 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 24 21:59:20.051006 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:20.050967 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 21:59:29.043226 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:29.043185 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 24 21:59:30.050741 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:30.050699 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 21:59:31.569403 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.569369 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv"] Apr 24 21:59:31.569912 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.569636 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kserve-container" containerID="cri-o://33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d" gracePeriod=30 Apr 24 21:59:31.569912 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.569682 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kube-rbac-proxy" containerID="cri-o://735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92" gracePeriod=30 Apr 24 21:59:31.632025 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.631989 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv"] Apr 24 21:59:31.632380 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.632369 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kube-rbac-proxy" Apr 24 21:59:31.632426 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.632382 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kube-rbac-proxy" Apr 24 21:59:31.632426 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.632399 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kube-rbac-proxy" Apr 24 21:59:31.632426 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.632405 2573 
state_mem.go:107] "Deleted CPUSet assignment" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kube-rbac-proxy" Apr 24 21:59:31.632426 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.632416 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kserve-container" Apr 24 21:59:31.632426 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.632421 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kserve-container" Apr 24 21:59:31.632584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.632431 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kserve-container" Apr 24 21:59:31.632584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.632437 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kserve-container" Apr 24 21:59:31.632584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.632497 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kserve-container" Apr 24 21:59:31.632584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.632505 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kube-rbac-proxy" Apr 24 21:59:31.632584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.632513 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="22ee0cfe-06f5-47f0-99f6-a77f434afb7c" containerName="kserve-container" Apr 24 21:59:31.632584 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.632521 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="59803781-86f5-4c54-91c2-18922795d983" containerName="kube-rbac-proxy" Apr 24 21:59:31.635618 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.635595 
2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 21:59:31.642892 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.642838 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-600f1-predictor-serving-cert\"" Apr 24 21:59:31.647320 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.647299 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-600f1-kube-rbac-proxy-sar-config\"" Apr 24 21:59:31.661908 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.661885 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv"] Apr 24 21:59:31.730425 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.730380 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84n6q\" (UniqueName: \"kubernetes.io/projected/26fbe127-e1af-4d43-a766-a61d05cee245-kube-api-access-84n6q\") pod \"success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") " pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 21:59:31.730669 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.730477 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26fbe127-e1af-4d43-a766-a61d05cee245-proxy-tls\") pod \"success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") " pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 21:59:31.730669 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.730557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"success-200-isvc-600f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26fbe127-e1af-4d43-a766-a61d05cee245-success-200-isvc-600f1-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") " pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 21:59:31.780785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.780750 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg"] Apr 24 21:59:31.781174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.781121 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kserve-container" containerID="cri-o://56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c" gracePeriod=30 Apr 24 21:59:31.781388 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.781165 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kube-rbac-proxy" containerID="cri-o://e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9" gracePeriod=30 Apr 24 21:59:31.831796 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.831717 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-600f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26fbe127-e1af-4d43-a766-a61d05cee245-success-200-isvc-600f1-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") " pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 21:59:31.831796 ip-10-0-136-201 kubenswrapper[2573]: I0424 
21:59:31.831785 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84n6q\" (UniqueName: \"kubernetes.io/projected/26fbe127-e1af-4d43-a766-a61d05cee245-kube-api-access-84n6q\") pod \"success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") " pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 21:59:31.832028 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.831828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26fbe127-e1af-4d43-a766-a61d05cee245-proxy-tls\") pod \"success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") " pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 21:59:31.832028 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:59:31.831959 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-600f1-predictor-serving-cert: secret "success-200-isvc-600f1-predictor-serving-cert" not found Apr 24 21:59:31.832149 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:59:31.832040 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26fbe127-e1af-4d43-a766-a61d05cee245-proxy-tls podName:26fbe127-e1af-4d43-a766-a61d05cee245 nodeName:}" failed. No retries permitted until 2026-04-24 21:59:32.332020451 +0000 UTC m=+1957.611868214 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/26fbe127-e1af-4d43-a766-a61d05cee245-proxy-tls") pod "success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" (UID: "26fbe127-e1af-4d43-a766-a61d05cee245") : secret "success-200-isvc-600f1-predictor-serving-cert" not found Apr 24 21:59:31.832469 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.832432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-600f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26fbe127-e1af-4d43-a766-a61d05cee245-success-200-isvc-600f1-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") " pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 21:59:31.842248 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.842227 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj"] Apr 24 21:59:31.845885 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.845866 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" Apr 24 21:59:31.849525 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.849499 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-600f1-predictor-serving-cert\"" Apr 24 21:59:31.849628 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.849528 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-600f1-kube-rbac-proxy-sar-config\"" Apr 24 21:59:31.869592 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.869566 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj"] Apr 24 21:59:31.871728 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.871705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84n6q\" (UniqueName: \"kubernetes.io/projected/26fbe127-e1af-4d43-a766-a61d05cee245-kube-api-access-84n6q\") pod \"success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") " pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 21:59:31.932545 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.932511 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19adae02-71e4-493f-b151-31853b1d445d-proxy-tls\") pod \"error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj\" (UID: \"19adae02-71e4-493f-b151-31853b1d445d\") " pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" Apr 24 21:59:31.932697 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.932555 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-600f1-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/19adae02-71e4-493f-b151-31853b1d445d-error-404-isvc-600f1-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj\" (UID: \"19adae02-71e4-493f-b151-31853b1d445d\") " pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" Apr 24 21:59:31.932697 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:31.932577 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncslc\" (UniqueName: \"kubernetes.io/projected/19adae02-71e4-493f-b151-31853b1d445d-kube-api-access-ncslc\") pod \"error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj\" (UID: \"19adae02-71e4-493f-b151-31853b1d445d\") " pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" Apr 24 21:59:32.033947 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.033901 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-600f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19adae02-71e4-493f-b151-31853b1d445d-error-404-isvc-600f1-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj\" (UID: \"19adae02-71e4-493f-b151-31853b1d445d\") " pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" Apr 24 21:59:32.034127 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.033957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncslc\" (UniqueName: \"kubernetes.io/projected/19adae02-71e4-493f-b151-31853b1d445d-kube-api-access-ncslc\") pod \"error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj\" (UID: \"19adae02-71e4-493f-b151-31853b1d445d\") " pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" Apr 24 21:59:32.034127 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.034083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/19adae02-71e4-493f-b151-31853b1d445d-proxy-tls\") pod \"error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj\" (UID: \"19adae02-71e4-493f-b151-31853b1d445d\") " pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" Apr 24 21:59:32.034590 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.034565 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-600f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19adae02-71e4-493f-b151-31853b1d445d-error-404-isvc-600f1-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj\" (UID: \"19adae02-71e4-493f-b151-31853b1d445d\") " pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" Apr 24 21:59:32.036384 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.036354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19adae02-71e4-493f-b151-31853b1d445d-proxy-tls\") pod \"error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj\" (UID: \"19adae02-71e4-493f-b151-31853b1d445d\") " pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" Apr 24 21:59:32.043344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.043316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncslc\" (UniqueName: \"kubernetes.io/projected/19adae02-71e4-493f-b151-31853b1d445d-kube-api-access-ncslc\") pod \"error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj\" (UID: \"19adae02-71e4-493f-b151-31853b1d445d\") " pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" Apr 24 21:59:32.158675 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.158574 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" Apr 24 21:59:32.191226 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.191190 2573 generic.go:358] "Generic (PLEG): container finished" podID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerID="e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9" exitCode=2 Apr 24 21:59:32.191459 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.191265 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" event={"ID":"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2","Type":"ContainerDied","Data":"e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9"} Apr 24 21:59:32.192971 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.192940 2573 generic.go:358] "Generic (PLEG): container finished" podID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerID="735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92" exitCode=2 Apr 24 21:59:32.193098 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.192950 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" event={"ID":"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7","Type":"ContainerDied","Data":"735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92"} Apr 24 21:59:32.293572 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.293545 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj"] Apr 24 21:59:32.295541 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:59:32.295499 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19adae02_71e4_493f_b151_31853b1d445d.slice/crio-70a9cd6d23943226fa174db053801156b1d7ef58281503c664981be8ceb68d60 WatchSource:0}: Error finding container 
70a9cd6d23943226fa174db053801156b1d7ef58281503c664981be8ceb68d60: Status 404 returned error can't find the container with id 70a9cd6d23943226fa174db053801156b1d7ef58281503c664981be8ceb68d60 Apr 24 21:59:32.336628 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.336600 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26fbe127-e1af-4d43-a766-a61d05cee245-proxy-tls\") pod \"success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") " pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 21:59:32.338837 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.338817 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26fbe127-e1af-4d43-a766-a61d05cee245-proxy-tls\") pod \"success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") " pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 21:59:32.546086 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.546051 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 21:59:32.676792 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:32.676762 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv"] Apr 24 21:59:32.680532 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:59:32.680475 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26fbe127_e1af_4d43_a766_a61d05cee245.slice/crio-be23aeb14dccf48c618f6ca8c745cfc8d7b5af622235491955d23a21ede247a5 WatchSource:0}: Error finding container be23aeb14dccf48c618f6ca8c745cfc8d7b5af622235491955d23a21ede247a5: Status 404 returned error can't find the container with id be23aeb14dccf48c618f6ca8c745cfc8d7b5af622235491955d23a21ede247a5 Apr 24 21:59:33.197989 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.197952 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" event={"ID":"26fbe127-e1af-4d43-a766-a61d05cee245","Type":"ContainerStarted","Data":"0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73"} Apr 24 21:59:33.198178 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.197995 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" event={"ID":"26fbe127-e1af-4d43-a766-a61d05cee245","Type":"ContainerStarted","Data":"f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f"} Apr 24 21:59:33.198178 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.198012 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" event={"ID":"26fbe127-e1af-4d43-a766-a61d05cee245","Type":"ContainerStarted","Data":"be23aeb14dccf48c618f6ca8c745cfc8d7b5af622235491955d23a21ede247a5"} Apr 24 21:59:33.198318 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.198176 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv"
Apr 24 21:59:33.198318 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.198199 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv"
Apr 24 21:59:33.199685 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.199643 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 21:59:33.199992 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.199696 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" event={"ID":"19adae02-71e4-493f-b151-31853b1d445d","Type":"ContainerStarted","Data":"c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa"}
Apr 24 21:59:33.199992 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.199729 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" event={"ID":"19adae02-71e4-493f-b151-31853b1d445d","Type":"ContainerStarted","Data":"d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b"}
Apr 24 21:59:33.199992 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.199743 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" event={"ID":"19adae02-71e4-493f-b151-31853b1d445d","Type":"ContainerStarted","Data":"70a9cd6d23943226fa174db053801156b1d7ef58281503c664981be8ceb68d60"}
Apr 24 21:59:33.199992 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.199840 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj"
Apr 24 21:59:33.199992 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.199860 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj"
Apr 24 21:59:33.201032 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.201008 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 21:59:33.219084 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.219037 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" podStartSLOduration=2.219022266 podStartE2EDuration="2.219022266s" podCreationTimestamp="2026-04-24 21:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:59:33.21737326 +0000 UTC m=+1958.497221039" watchObservedRunningTime="2026-04-24 21:59:33.219022266 +0000 UTC m=+1958.498870108"
Apr 24 21:59:33.239299 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:33.239255 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" podStartSLOduration=2.239239301 podStartE2EDuration="2.239239301s" podCreationTimestamp="2026-04-24 21:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:59:33.2381137 +0000 UTC m=+1958.517961473" watchObservedRunningTime="2026-04-24 21:59:33.239239301 +0000 UTC m=+1958.519087080"
Apr 24 21:59:34.203373 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:34.203327 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 21:59:34.203790 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:34.203462 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 21:59:34.911891 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:34.911849 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused"
Apr 24 21:59:34.912084 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:34.911858 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.51:8643/healthz\": dial tcp 10.134.0.51:8643: connect: connection refused"
Apr 24 21:59:35.441451 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.441425 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg"
Apr 24 21:59:35.562027 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.561945 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-proxy-tls\") pod \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\" (UID: \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\") "
Apr 24 21:59:35.562027 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.561990 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdnph\" (UniqueName: \"kubernetes.io/projected/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-kube-api-access-kdnph\") pod \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\" (UID: \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\") "
Apr 24 21:59:35.562027 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.562027 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-a4170-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-error-404-isvc-a4170-kube-rbac-proxy-sar-config\") pod \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\" (UID: \"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2\") "
Apr 24 21:59:35.562461 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.562437 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-error-404-isvc-a4170-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-a4170-kube-rbac-proxy-sar-config") pod "8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" (UID: "8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2"). InnerVolumeSpecName "error-404-isvc-a4170-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:59:35.564117 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.564094 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-kube-api-access-kdnph" (OuterVolumeSpecName: "kube-api-access-kdnph") pod "8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" (UID: "8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2"). InnerVolumeSpecName "kube-api-access-kdnph". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:59:35.564218 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.564113 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" (UID: "8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:59:35.663338 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.663307 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:59:35.663338 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.663337 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kdnph\" (UniqueName: \"kubernetes.io/projected/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-kube-api-access-kdnph\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:59:35.663509 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.663353 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-a4170-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2-error-404-isvc-a4170-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:59:35.707492 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.707467 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv"
Apr 24 21:59:35.865605 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.865513 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bmdj\" (UniqueName: \"kubernetes.io/projected/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-kube-api-access-5bmdj\") pod \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") "
Apr 24 21:59:35.865605 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.865558 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-a4170-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-success-200-isvc-a4170-kube-rbac-proxy-sar-config\") pod \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") "
Apr 24 21:59:35.865605 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.865601 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-proxy-tls\") pod \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\" (UID: \"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7\") "
Apr 24 21:59:35.866005 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.865976 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-success-200-isvc-a4170-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-a4170-kube-rbac-proxy-sar-config") pod "8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" (UID: "8757ba4b-dc3a-4ef0-9c03-f465879a9ad7"). InnerVolumeSpecName "success-200-isvc-a4170-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:59:35.867629 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.867603 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" (UID: "8757ba4b-dc3a-4ef0-9c03-f465879a9ad7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:59:35.867740 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.867653 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-kube-api-access-5bmdj" (OuterVolumeSpecName: "kube-api-access-5bmdj") pod "8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" (UID: "8757ba4b-dc3a-4ef0-9c03-f465879a9ad7"). InnerVolumeSpecName "kube-api-access-5bmdj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:59:35.966771 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.966734 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5bmdj\" (UniqueName: \"kubernetes.io/projected/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-kube-api-access-5bmdj\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:59:35.966771 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.966765 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-a4170-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-success-200-isvc-a4170-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:59:35.966771 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:35.966776 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:59:36.211676 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.211591 2573 generic.go:358] "Generic (PLEG): container finished" podID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerID="56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c" exitCode=0
Apr 24 21:59:36.211828 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.211682 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg"
Apr 24 21:59:36.211828 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.211678 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" event={"ID":"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2","Type":"ContainerDied","Data":"56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c"}
Apr 24 21:59:36.211935 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.211829 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg" event={"ID":"8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2","Type":"ContainerDied","Data":"3d889748a96b551764d8eca1e518ea4c357000db2203a17a551e204e0dc8fe18"}
Apr 24 21:59:36.211935 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.211850 2573 scope.go:117] "RemoveContainer" containerID="e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9"
Apr 24 21:59:36.213062 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.213041 2573 generic.go:358] "Generic (PLEG): container finished" podID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerID="33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d" exitCode=0
Apr 24 21:59:36.213150 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.213094 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" event={"ID":"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7","Type":"ContainerDied","Data":"33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d"}
Apr 24 21:59:36.213150 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.213112 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv"
Apr 24 21:59:36.213150 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.213118 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv" event={"ID":"8757ba4b-dc3a-4ef0-9c03-f465879a9ad7","Type":"ContainerDied","Data":"44dc951a9ba27b6e3dd93e1d6d0f3d53f8f39b3efee221780b9bd80ec87d99ad"}
Apr 24 21:59:36.222468 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.222446 2573 scope.go:117] "RemoveContainer" containerID="56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c"
Apr 24 21:59:36.229961 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.229941 2573 scope.go:117] "RemoveContainer" containerID="e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9"
Apr 24 21:59:36.230195 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:59:36.230176 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9\": container with ID starting with e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9 not found: ID does not exist" containerID="e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9"
Apr 24 21:59:36.230268 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.230202 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9"} err="failed to get container status \"e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9\": rpc error: code = NotFound desc = could not find container \"e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9\": container with ID starting with e3afaa25a93c4f7cfb858d8444256d32cc1ca82218b0352e9cb0cd310fa41cf9 not found: ID does not exist"
Apr 24 21:59:36.230268 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.230219 2573 scope.go:117] "RemoveContainer" containerID="56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c"
Apr 24 21:59:36.230459 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:59:36.230441 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c\": container with ID starting with 56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c not found: ID does not exist" containerID="56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c"
Apr 24 21:59:36.230497 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.230466 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c"} err="failed to get container status \"56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c\": rpc error: code = NotFound desc = could not find container \"56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c\": container with ID starting with 56b8c36f03106fb830a64c9cc6ba22eec36c70d13becf9260c4b45b16aea9a7c not found: ID does not exist"
Apr 24 21:59:36.230497 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.230480 2573 scope.go:117] "RemoveContainer" containerID="735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92"
Apr 24 21:59:36.237865 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.237843 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg"]
Apr 24 21:59:36.238333 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.238316 2573 scope.go:117] "RemoveContainer" containerID="33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d"
Apr 24 21:59:36.244483 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.244463 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg"]
Apr 24 21:59:36.248716 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.248673 2573 scope.go:117] "RemoveContainer" containerID="735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92"
Apr 24 21:59:36.249244 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:59:36.249219 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92\": container with ID starting with 735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92 not found: ID does not exist" containerID="735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92"
Apr 24 21:59:36.249331 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.249259 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92"} err="failed to get container status \"735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92\": rpc error: code = NotFound desc = could not find container \"735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92\": container with ID starting with 735fe9a31a748e98c660c62ad8b6aa8f184ff2bfca39fee577eb6ca6b4651d92 not found: ID does not exist"
Apr 24 21:59:36.249331 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.249278 2573 scope.go:117] "RemoveContainer" containerID="33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d"
Apr 24 21:59:36.249515 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:59:36.249500 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d\": container with ID starting with 33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d not found: ID does not exist" containerID="33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d"
Apr 24 21:59:36.249576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.249520 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d"} err="failed to get container status \"33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d\": rpc error: code = NotFound desc = could not find container \"33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d\": container with ID starting with 33e3412433aa2ab27e99f644a9cea773df71dd7b817d2aad753a5bba5808293d not found: ID does not exist"
Apr 24 21:59:36.255462 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.255439 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv"]
Apr 24 21:59:36.259019 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:36.258998 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv"]
Apr 24 21:59:37.368604 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:37.368570 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" path="/var/lib/kubelet/pods/8757ba4b-dc3a-4ef0-9c03-f465879a9ad7/volumes"
Apr 24 21:59:37.369031 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:37.369016 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" path="/var/lib/kubelet/pods/8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2/volumes"
Apr 24 21:59:39.043083 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:39.043052 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"
Apr 24 21:59:39.207911 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:39.207884 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv"
Apr 24 21:59:39.208385 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:39.208342 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 21:59:39.208563 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:39.208544 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj"
Apr 24 21:59:39.209044 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:39.209017 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 21:59:40.051163 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:40.051130 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"
Apr 24 21:59:49.208605 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:49.208518 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 21:59:49.208983 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:49.208936 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 21:59:59.209008 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:59.208968 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 21:59:59.209399 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:59:59.208969 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 22:00:09.209167 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:00:09.209125 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 22:00:09.209762 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:00:09.209130 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 22:00:19.209086 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:00:19.209053 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv"
Apr 24 22:00:19.209801 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:00:19.209778 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj"
Apr 24 22:01:55.405625 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:01:55.405593 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 22:01:55.408516 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:01:55.408490 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 22:06:55.430772 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:55.430742 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 22:06:55.435811 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:55.435792 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 22:08:46.354826 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:46.354728 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv"]
Apr 24 22:08:46.355461 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:46.355183 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kserve-container" containerID="cri-o://f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f" gracePeriod=30
Apr 24 22:08:46.355461 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:46.355219 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kube-rbac-proxy" containerID="cri-o://0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73" gracePeriod=30
Apr 24 22:08:46.408185 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:46.408149 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj"]
Apr 24 22:08:46.408443 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:46.408417 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kserve-container" containerID="cri-o://d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b" gracePeriod=30
Apr 24 22:08:46.408535 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:46.408452 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kube-rbac-proxy" containerID="cri-o://c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa" gracePeriod=30
Apr 24 22:08:47.116146 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:47.116110 2573 generic.go:358] "Generic (PLEG): container finished" podID="19adae02-71e4-493f-b151-31853b1d445d" containerID="c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa" exitCode=2
Apr 24 22:08:47.116340 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:47.116180 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" event={"ID":"19adae02-71e4-493f-b151-31853b1d445d","Type":"ContainerDied","Data":"c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa"}
Apr 24 22:08:47.117497 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:47.117476 2573 generic.go:358] "Generic (PLEG): container finished" podID="26fbe127-e1af-4d43-a766-a61d05cee245" containerID="0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73" exitCode=2
Apr 24 22:08:47.117603 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:47.117537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" event={"ID":"26fbe127-e1af-4d43-a766-a61d05cee245","Type":"ContainerDied","Data":"0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73"}
Apr 24 22:08:49.203808 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.203764 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.54:8643/healthz\": dial tcp 10.134.0.54:8643: connect: connection refused"
Apr 24 22:08:49.204241 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.203778 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.55:8643/healthz\": dial tcp 10.134.0.55:8643: connect: connection refused"
Apr 24 22:08:49.209134 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.209106 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 22:08:49.209196 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.209120 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 22:08:49.604773 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.604750 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv"
Apr 24 22:08:49.666745 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.666723 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj"
Apr 24 22:08:49.703335 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.703304 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84n6q\" (UniqueName: \"kubernetes.io/projected/26fbe127-e1af-4d43-a766-a61d05cee245-kube-api-access-84n6q\") pod \"26fbe127-e1af-4d43-a766-a61d05cee245\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") "
Apr 24 22:08:49.703502 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.703370 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26fbe127-e1af-4d43-a766-a61d05cee245-proxy-tls\") pod \"26fbe127-e1af-4d43-a766-a61d05cee245\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") "
Apr 24 22:08:49.703502 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.703402 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-600f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26fbe127-e1af-4d43-a766-a61d05cee245-success-200-isvc-600f1-kube-rbac-proxy-sar-config\") pod \"26fbe127-e1af-4d43-a766-a61d05cee245\" (UID: \"26fbe127-e1af-4d43-a766-a61d05cee245\") "
Apr 24 22:08:49.703859 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.703831 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26fbe127-e1af-4d43-a766-a61d05cee245-success-200-isvc-600f1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-600f1-kube-rbac-proxy-sar-config") pod "26fbe127-e1af-4d43-a766-a61d05cee245" (UID: "26fbe127-e1af-4d43-a766-a61d05cee245"). InnerVolumeSpecName "success-200-isvc-600f1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:08:49.705441 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.705414 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26fbe127-e1af-4d43-a766-a61d05cee245-kube-api-access-84n6q" (OuterVolumeSpecName: "kube-api-access-84n6q") pod "26fbe127-e1af-4d43-a766-a61d05cee245" (UID: "26fbe127-e1af-4d43-a766-a61d05cee245"). InnerVolumeSpecName "kube-api-access-84n6q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:08:49.705441 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.705415 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26fbe127-e1af-4d43-a766-a61d05cee245-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "26fbe127-e1af-4d43-a766-a61d05cee245" (UID: "26fbe127-e1af-4d43-a766-a61d05cee245"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:08:49.804659 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.804623 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-600f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19adae02-71e4-493f-b151-31853b1d445d-error-404-isvc-600f1-kube-rbac-proxy-sar-config\") pod \"19adae02-71e4-493f-b151-31853b1d445d\" (UID: \"19adae02-71e4-493f-b151-31853b1d445d\") "
Apr 24 22:08:49.804822 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.804682 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncslc\" (UniqueName: \"kubernetes.io/projected/19adae02-71e4-493f-b151-31853b1d445d-kube-api-access-ncslc\") pod \"19adae02-71e4-493f-b151-31853b1d445d\" (UID: \"19adae02-71e4-493f-b151-31853b1d445d\") "
Apr 24 22:08:49.804897 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.804877 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19adae02-71e4-493f-b151-31853b1d445d-proxy-tls\") pod \"19adae02-71e4-493f-b151-31853b1d445d\" (UID: \"19adae02-71e4-493f-b151-31853b1d445d\") "
Apr 24 22:08:49.805033 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.805008 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19adae02-71e4-493f-b151-31853b1d445d-error-404-isvc-600f1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-600f1-kube-rbac-proxy-sar-config") pod "19adae02-71e4-493f-b151-31853b1d445d" (UID: "19adae02-71e4-493f-b151-31853b1d445d"). InnerVolumeSpecName "error-404-isvc-600f1-kube-rbac-proxy-sar-config".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:08:49.805188 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.805171 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-600f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19adae02-71e4-493f-b151-31853b1d445d-error-404-isvc-600f1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:08:49.805256 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.805190 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-84n6q\" (UniqueName: \"kubernetes.io/projected/26fbe127-e1af-4d43-a766-a61d05cee245-kube-api-access-84n6q\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:08:49.805256 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.805201 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26fbe127-e1af-4d43-a766-a61d05cee245-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:08:49.805256 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.805211 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-600f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26fbe127-e1af-4d43-a766-a61d05cee245-success-200-isvc-600f1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:08:49.806710 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.806687 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19adae02-71e4-493f-b151-31853b1d445d-kube-api-access-ncslc" (OuterVolumeSpecName: "kube-api-access-ncslc") pod "19adae02-71e4-493f-b151-31853b1d445d" (UID: "19adae02-71e4-493f-b151-31853b1d445d"). InnerVolumeSpecName "kube-api-access-ncslc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:08:49.806769 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.806750 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19adae02-71e4-493f-b151-31853b1d445d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "19adae02-71e4-493f-b151-31853b1d445d" (UID: "19adae02-71e4-493f-b151-31853b1d445d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:08:49.905864 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.905818 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19adae02-71e4-493f-b151-31853b1d445d-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:08:49.905864 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:49.905859 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ncslc\" (UniqueName: \"kubernetes.io/projected/19adae02-71e4-493f-b151-31853b1d445d-kube-api-access-ncslc\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:08:50.129452 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.129359 2573 generic.go:358] "Generic (PLEG): container finished" podID="19adae02-71e4-493f-b151-31853b1d445d" containerID="d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b" exitCode=0 Apr 24 22:08:50.129452 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.129438 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" event={"ID":"19adae02-71e4-493f-b151-31853b1d445d","Type":"ContainerDied","Data":"d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b"} Apr 24 22:08:50.129663 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.129467 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" Apr 24 22:08:50.129663 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.129480 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj" event={"ID":"19adae02-71e4-493f-b151-31853b1d445d","Type":"ContainerDied","Data":"70a9cd6d23943226fa174db053801156b1d7ef58281503c664981be8ceb68d60"} Apr 24 22:08:50.129663 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.129498 2573 scope.go:117] "RemoveContainer" containerID="c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa" Apr 24 22:08:50.131116 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.131095 2573 generic.go:358] "Generic (PLEG): container finished" podID="26fbe127-e1af-4d43-a766-a61d05cee245" containerID="f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f" exitCode=0 Apr 24 22:08:50.131270 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.131178 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" event={"ID":"26fbe127-e1af-4d43-a766-a61d05cee245","Type":"ContainerDied","Data":"f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f"} Apr 24 22:08:50.131270 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.131195 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" Apr 24 22:08:50.131270 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.131211 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv" event={"ID":"26fbe127-e1af-4d43-a766-a61d05cee245","Type":"ContainerDied","Data":"be23aeb14dccf48c618f6ca8c745cfc8d7b5af622235491955d23a21ede247a5"} Apr 24 22:08:50.139069 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.139051 2573 scope.go:117] "RemoveContainer" containerID="d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b" Apr 24 22:08:50.146998 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.146978 2573 scope.go:117] "RemoveContainer" containerID="c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa" Apr 24 22:08:50.147255 ip-10-0-136-201 kubenswrapper[2573]: E0424 22:08:50.147238 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa\": container with ID starting with c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa not found: ID does not exist" containerID="c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa" Apr 24 22:08:50.147310 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.147263 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa"} err="failed to get container status \"c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa\": rpc error: code = NotFound desc = could not find container \"c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa\": container with ID starting with c8f95035b9cfc8b04916cef5ddf486f4b7f2b5219b13c1b078ae37507927cbaa not found: ID does not exist" Apr 24 22:08:50.147310 ip-10-0-136-201 
kubenswrapper[2573]: I0424 22:08:50.147279 2573 scope.go:117] "RemoveContainer" containerID="d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b" Apr 24 22:08:50.147498 ip-10-0-136-201 kubenswrapper[2573]: E0424 22:08:50.147479 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b\": container with ID starting with d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b not found: ID does not exist" containerID="d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b" Apr 24 22:08:50.147532 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.147505 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b"} err="failed to get container status \"d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b\": rpc error: code = NotFound desc = could not find container \"d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b\": container with ID starting with d30efb8f481946360f1c7953e2382cca106b541338a88605d4e99c0bb448879b not found: ID does not exist" Apr 24 22:08:50.147532 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.147520 2573 scope.go:117] "RemoveContainer" containerID="0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73" Apr 24 22:08:50.156211 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.156188 2573 scope.go:117] "RemoveContainer" containerID="f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f" Apr 24 22:08:50.156639 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.156625 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv"] Apr 24 22:08:50.158504 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.158486 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv"] Apr 24 22:08:50.163559 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.163541 2573 scope.go:117] "RemoveContainer" containerID="0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73" Apr 24 22:08:50.163819 ip-10-0-136-201 kubenswrapper[2573]: E0424 22:08:50.163801 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73\": container with ID starting with 0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73 not found: ID does not exist" containerID="0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73" Apr 24 22:08:50.163872 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.163834 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73"} err="failed to get container status \"0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73\": rpc error: code = NotFound desc = could not find container \"0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73\": container with ID starting with 0a5a3901c954b050ffcc231973410868e285982ef60d5aa5470c2848bd8dee73 not found: ID does not exist" Apr 24 22:08:50.163872 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.163853 2573 scope.go:117] "RemoveContainer" containerID="f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f" Apr 24 22:08:50.164117 ip-10-0-136-201 kubenswrapper[2573]: E0424 22:08:50.164104 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f\": container with ID starting with f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f not found: ID does not exist" 
containerID="f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f" Apr 24 22:08:50.164161 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.164121 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f"} err="failed to get container status \"f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f\": rpc error: code = NotFound desc = could not find container \"f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f\": container with ID starting with f1a48a17f634aed8fb3c98708160c704e2555878f4c7e8cb31f7663cd85b672f not found: ID does not exist" Apr 24 22:08:50.168874 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.168841 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj"] Apr 24 22:08:50.172963 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:50.172944 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj"] Apr 24 22:08:51.368668 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:51.368638 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19adae02-71e4-493f-b151-31853b1d445d" path="/var/lib/kubelet/pods/19adae02-71e4-493f-b151-31853b1d445d/volumes" Apr 24 22:08:51.369093 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:08:51.369080 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" path="/var/lib/kubelet/pods/26fbe127-e1af-4d43-a766-a61d05cee245/volumes" Apr 24 22:11:55.454889 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:11:55.454776 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log" Apr 24 22:11:55.461074 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:11:55.461050 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log" Apr 24 22:16:10.673778 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:10.673745 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"] Apr 24 22:16:10.674302 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:10.674055 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kserve-container" containerID="cri-o://f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8" gracePeriod=30 Apr 24 22:16:10.674302 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:10.674063 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kube-rbac-proxy" containerID="cri-o://2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b" gracePeriod=30 Apr 24 22:16:10.733761 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:10.733727 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"] Apr 24 22:16:10.734123 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:10.734092 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kserve-container" containerID="cri-o://0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9" gracePeriod=30 Apr 24 22:16:10.734358 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:10.734288 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kube-rbac-proxy" containerID="cri-o://9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3" gracePeriod=30 Apr 24 22:16:11.605214 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:11.605176 2573 generic.go:358] "Generic (PLEG): container finished" podID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerID="2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b" exitCode=2 Apr 24 22:16:11.605402 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:11.605243 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" event={"ID":"ab9fee25-cf0b-4243-9b8d-702558af3735","Type":"ContainerDied","Data":"2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b"} Apr 24 22:16:11.606755 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:11.606732 2573 generic.go:358] "Generic (PLEG): container finished" podID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerID="9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3" exitCode=2 Apr 24 22:16:11.606876 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:11.606804 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" event={"ID":"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f","Type":"ContainerDied","Data":"9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3"} Apr 24 22:16:13.895113 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:13.895094 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" Apr 24 22:16:13.999068 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:13.999040 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-proxy-tls\") pod \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\" (UID: \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\") " Apr 24 22:16:13.999248 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:13.999102 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-f4deb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-error-404-isvc-f4deb-kube-rbac-proxy-sar-config\") pod \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\" (UID: \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\") " Apr 24 22:16:13.999248 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:13.999209 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnrrz\" (UniqueName: \"kubernetes.io/projected/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-kube-api-access-fnrrz\") pod \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\" (UID: \"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f\") " Apr 24 22:16:13.999436 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:13.999410 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-error-404-isvc-f4deb-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-f4deb-kube-rbac-proxy-sar-config") pod "6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" (UID: "6ad1d5e7-a76a-4378-b25d-eaeeea0f959f"). InnerVolumeSpecName "error-404-isvc-f4deb-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:16:14.001251 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.001226 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" (UID: "6ad1d5e7-a76a-4378-b25d-eaeeea0f959f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:16:14.001454 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.001433 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-kube-api-access-fnrrz" (OuterVolumeSpecName: "kube-api-access-fnrrz") pod "6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" (UID: "6ad1d5e7-a76a-4378-b25d-eaeeea0f959f"). InnerVolumeSpecName "kube-api-access-fnrrz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:16:14.010362 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.010343 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" Apr 24 22:16:14.100686 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.100652 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ab9fee25-cf0b-4243-9b8d-702558af3735-proxy-tls\") pod \"ab9fee25-cf0b-4243-9b8d-702558af3735\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " Apr 24 22:16:14.100845 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.100700 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-f4deb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ab9fee25-cf0b-4243-9b8d-702558af3735-success-200-isvc-f4deb-kube-rbac-proxy-sar-config\") pod \"ab9fee25-cf0b-4243-9b8d-702558af3735\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " Apr 24 22:16:14.100845 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.100736 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr5m7\" (UniqueName: \"kubernetes.io/projected/ab9fee25-cf0b-4243-9b8d-702558af3735-kube-api-access-vr5m7\") pod \"ab9fee25-cf0b-4243-9b8d-702558af3735\" (UID: \"ab9fee25-cf0b-4243-9b8d-702558af3735\") " Apr 24 22:16:14.100952 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.100898 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:16:14.100952 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.100935 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-f4deb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-error-404-isvc-f4deb-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:16:14.101040 
ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.100953 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fnrrz\" (UniqueName: \"kubernetes.io/projected/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f-kube-api-access-fnrrz\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:16:14.101107 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.101084 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab9fee25-cf0b-4243-9b8d-702558af3735-success-200-isvc-f4deb-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-f4deb-kube-rbac-proxy-sar-config") pod "ab9fee25-cf0b-4243-9b8d-702558af3735" (UID: "ab9fee25-cf0b-4243-9b8d-702558af3735"). InnerVolumeSpecName "success-200-isvc-f4deb-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:16:14.102711 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.102692 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9fee25-cf0b-4243-9b8d-702558af3735-kube-api-access-vr5m7" (OuterVolumeSpecName: "kube-api-access-vr5m7") pod "ab9fee25-cf0b-4243-9b8d-702558af3735" (UID: "ab9fee25-cf0b-4243-9b8d-702558af3735"). InnerVolumeSpecName "kube-api-access-vr5m7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:16:14.102762 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.102720 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9fee25-cf0b-4243-9b8d-702558af3735-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ab9fee25-cf0b-4243-9b8d-702558af3735" (UID: "ab9fee25-cf0b-4243-9b8d-702558af3735"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:16:14.202136 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.202066 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ab9fee25-cf0b-4243-9b8d-702558af3735-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:16:14.202136 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.202089 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-f4deb-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ab9fee25-cf0b-4243-9b8d-702558af3735-success-200-isvc-f4deb-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:16:14.202136 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.202100 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vr5m7\" (UniqueName: \"kubernetes.io/projected/ab9fee25-cf0b-4243-9b8d-702558af3735-kube-api-access-vr5m7\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:16:14.617975 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.617937 2573 generic.go:358] "Generic (PLEG): container finished" podID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerID="f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8" exitCode=0 Apr 24 22:16:14.618182 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.618016 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" Apr 24 22:16:14.618182 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.618012 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" event={"ID":"ab9fee25-cf0b-4243-9b8d-702558af3735","Type":"ContainerDied","Data":"f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8"} Apr 24 22:16:14.618182 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.618138 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9" event={"ID":"ab9fee25-cf0b-4243-9b8d-702558af3735","Type":"ContainerDied","Data":"7af88b818194d6f92acf677d2faffc4f9e311980426cf2ca427e6bae49490e3f"} Apr 24 22:16:14.618182 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.618163 2573 scope.go:117] "RemoveContainer" containerID="2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b" Apr 24 22:16:14.619490 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.619470 2573 generic.go:358] "Generic (PLEG): container finished" podID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerID="0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9" exitCode=0 Apr 24 22:16:14.619627 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.619564 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" Apr 24 22:16:14.619627 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.619562 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" event={"ID":"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f","Type":"ContainerDied","Data":"0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9"} Apr 24 22:16:14.619753 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.619622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6" event={"ID":"6ad1d5e7-a76a-4378-b25d-eaeeea0f959f","Type":"ContainerDied","Data":"3093ce63d0f35fd09e61a70fb3f19b3f54f3360dd4d291c90ab0a03e5f152b7f"} Apr 24 22:16:14.627897 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.627881 2573 scope.go:117] "RemoveContainer" containerID="f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8" Apr 24 22:16:14.635469 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.635451 2573 scope.go:117] "RemoveContainer" containerID="2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b" Apr 24 22:16:14.635697 ip-10-0-136-201 kubenswrapper[2573]: E0424 22:16:14.635679 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b\": container with ID starting with 2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b not found: ID does not exist" containerID="2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b" Apr 24 22:16:14.635743 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.635706 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b"} err="failed to get container status 
\"2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b\": rpc error: code = NotFound desc = could not find container \"2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b\": container with ID starting with 2693487e68154552d22d501c7c49370df324f3031d747d4ae3dbe7e292b7160b not found: ID does not exist" Apr 24 22:16:14.635743 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.635722 2573 scope.go:117] "RemoveContainer" containerID="f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8" Apr 24 22:16:14.635944 ip-10-0-136-201 kubenswrapper[2573]: E0424 22:16:14.635910 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8\": container with ID starting with f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8 not found: ID does not exist" containerID="f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8" Apr 24 22:16:14.635986 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.635951 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8"} err="failed to get container status \"f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8\": rpc error: code = NotFound desc = could not find container \"f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8\": container with ID starting with f3f33062c394d50281f719bcc37a04e2aa88e69111adc7f39b20e06904df4da8 not found: ID does not exist" Apr 24 22:16:14.635986 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.635966 2573 scope.go:117] "RemoveContainer" containerID="9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3" Apr 24 22:16:14.644890 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.644870 2573 scope.go:117] "RemoveContainer" 
containerID="0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9" Apr 24 22:16:14.651086 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.651067 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"] Apr 24 22:16:14.652411 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.652376 2573 scope.go:117] "RemoveContainer" containerID="9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3" Apr 24 22:16:14.652753 ip-10-0-136-201 kubenswrapper[2573]: E0424 22:16:14.652665 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3\": container with ID starting with 9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3 not found: ID does not exist" containerID="9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3" Apr 24 22:16:14.652753 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.652701 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3"} err="failed to get container status \"9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3\": rpc error: code = NotFound desc = could not find container \"9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3\": container with ID starting with 9a7eaef92374d1429eaeb12de8789e205d4cb797f5134cac8ca7ffdd4dac92d3 not found: ID does not exist" Apr 24 22:16:14.652753 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.652726 2573 scope.go:117] "RemoveContainer" containerID="0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9" Apr 24 22:16:14.653057 ip-10-0-136-201 kubenswrapper[2573]: E0424 22:16:14.653024 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9\": container with ID starting with 0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9 not found: ID does not exist" containerID="0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9" Apr 24 22:16:14.653122 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.653054 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9"} err="failed to get container status \"0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9\": rpc error: code = NotFound desc = could not find container \"0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9\": container with ID starting with 0ee39dd3382c85c86328e2ba440790eb7dbb71b270bf0b0b07d2e8616e6c1ca9 not found: ID does not exist" Apr 24 22:16:14.655287 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.655265 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6"] Apr 24 22:16:14.664315 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.664297 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"] Apr 24 22:16:14.670050 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:14.670031 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9"] Apr 24 22:16:15.367688 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:15.367652 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" path="/var/lib/kubelet/pods/6ad1d5e7-a76a-4378-b25d-eaeeea0f959f/volumes" Apr 24 22:16:15.368124 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:15.368109 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" 
path="/var/lib/kubelet/pods/ab9fee25-cf0b-4243-9b8d-702558af3735/volumes" Apr 24 22:16:35.987666 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.987622 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7csvq/must-gather-rvz2s"] Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988126 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kserve-container" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988155 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kserve-container" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988166 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kube-rbac-proxy" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988174 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kube-rbac-proxy" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988190 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kube-rbac-proxy" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988199 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kube-rbac-proxy" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988212 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kserve-container" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988221 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kserve-container" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988237 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kserve-container" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988251 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kserve-container" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988261 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kserve-container" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988268 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kserve-container" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988277 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kube-rbac-proxy" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988285 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kube-rbac-proxy" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988297 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kserve-container" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988311 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kserve-container" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988332 2573 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kserve-container" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988339 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kserve-container" Apr 24 22:16:35.988349 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988355 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kube-rbac-proxy" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988363 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kube-rbac-proxy" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988378 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kube-rbac-proxy" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988387 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kube-rbac-proxy" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988394 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kube-rbac-proxy" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988402 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kube-rbac-proxy" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988504 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kube-rbac-proxy" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988519 2573 
memory_manager.go:356] "RemoveStaleState removing state" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kserve-container" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988529 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kube-rbac-proxy" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988540 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kube-rbac-proxy" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988553 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kserve-container" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988564 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="26fbe127-e1af-4d43-a766-a61d05cee245" containerName="kube-rbac-proxy" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988573 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kserve-container" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988584 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cfe76c8-2cae-4af0-baca-9c7e7c62c3a2" containerName="kube-rbac-proxy" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988593 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8757ba4b-dc3a-4ef0-9c03-f465879a9ad7" containerName="kserve-container" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988603 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ad1d5e7-a76a-4378-b25d-eaeeea0f959f" containerName="kube-rbac-proxy" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988613 
2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab9fee25-cf0b-4243-9b8d-702558af3735" containerName="kserve-container" Apr 24 22:16:35.989313 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.988624 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="19adae02-71e4-493f-b151-31853b1d445d" containerName="kserve-container" Apr 24 22:16:35.992103 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.992082 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7csvq/must-gather-rvz2s" Apr 24 22:16:35.994454 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.994432 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7csvq\"/\"kube-root-ca.crt\"" Apr 24 22:16:35.995330 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.995291 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7csvq\"/\"openshift-service-ca.crt\"" Apr 24 22:16:35.995330 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.995289 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7csvq\"/\"default-dockercfg-vv79p\"" Apr 24 22:16:35.998712 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:35.998688 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7csvq/must-gather-rvz2s"] Apr 24 22:16:36.082574 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:36.082541 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d283f738-323f-4e5e-b369-b6ac373e4702-must-gather-output\") pod \"must-gather-rvz2s\" (UID: \"d283f738-323f-4e5e-b369-b6ac373e4702\") " pod="openshift-must-gather-7csvq/must-gather-rvz2s" Apr 24 22:16:36.082728 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:36.082681 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xx7m\" (UniqueName: \"kubernetes.io/projected/d283f738-323f-4e5e-b369-b6ac373e4702-kube-api-access-5xx7m\") pod \"must-gather-rvz2s\" (UID: \"d283f738-323f-4e5e-b369-b6ac373e4702\") " pod="openshift-must-gather-7csvq/must-gather-rvz2s" Apr 24 22:16:36.183189 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:36.183146 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xx7m\" (UniqueName: \"kubernetes.io/projected/d283f738-323f-4e5e-b369-b6ac373e4702-kube-api-access-5xx7m\") pod \"must-gather-rvz2s\" (UID: \"d283f738-323f-4e5e-b369-b6ac373e4702\") " pod="openshift-must-gather-7csvq/must-gather-rvz2s" Apr 24 22:16:36.183189 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:36.183194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d283f738-323f-4e5e-b369-b6ac373e4702-must-gather-output\") pod \"must-gather-rvz2s\" (UID: \"d283f738-323f-4e5e-b369-b6ac373e4702\") " pod="openshift-must-gather-7csvq/must-gather-rvz2s" Apr 24 22:16:36.183522 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:36.183503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d283f738-323f-4e5e-b369-b6ac373e4702-must-gather-output\") pod \"must-gather-rvz2s\" (UID: \"d283f738-323f-4e5e-b369-b6ac373e4702\") " pod="openshift-must-gather-7csvq/must-gather-rvz2s" Apr 24 22:16:36.191072 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:36.191044 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xx7m\" (UniqueName: \"kubernetes.io/projected/d283f738-323f-4e5e-b369-b6ac373e4702-kube-api-access-5xx7m\") pod \"must-gather-rvz2s\" (UID: \"d283f738-323f-4e5e-b369-b6ac373e4702\") " pod="openshift-must-gather-7csvq/must-gather-rvz2s" Apr 24 22:16:36.314593 
ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:36.314557 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7csvq/must-gather-rvz2s" Apr 24 22:16:36.434758 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:36.434732 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7csvq/must-gather-rvz2s"] Apr 24 22:16:36.436706 ip-10-0-136-201 kubenswrapper[2573]: W0424 22:16:36.436676 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd283f738_323f_4e5e_b369_b6ac373e4702.slice/crio-c517e54185ac63c982a55d3276c291ecc17da3342fd4ed167dc592164c27b09b WatchSource:0}: Error finding container c517e54185ac63c982a55d3276c291ecc17da3342fd4ed167dc592164c27b09b: Status 404 returned error can't find the container with id c517e54185ac63c982a55d3276c291ecc17da3342fd4ed167dc592164c27b09b Apr 24 22:16:36.438553 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:36.438537 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:16:36.701584 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:36.701497 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7csvq/must-gather-rvz2s" event={"ID":"d283f738-323f-4e5e-b369-b6ac373e4702","Type":"ContainerStarted","Data":"c517e54185ac63c982a55d3276c291ecc17da3342fd4ed167dc592164c27b09b"} Apr 24 22:16:37.708598 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:37.708385 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7csvq/must-gather-rvz2s" event={"ID":"d283f738-323f-4e5e-b369-b6ac373e4702","Type":"ContainerStarted","Data":"3dd47334ddbdf19cdfbfc8d0f47c3b289ba4829283ca4c645a14ac8a3f3f2cd9"} Apr 24 22:16:37.708598 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:37.708436 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7csvq/must-gather-rvz2s" 
event={"ID":"d283f738-323f-4e5e-b369-b6ac373e4702","Type":"ContainerStarted","Data":"e2c46ca88b220d26638ec7ef44d7033518991cab54dace87050b7980f99287c7"} Apr 24 22:16:37.726290 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:37.726186 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7csvq/must-gather-rvz2s" podStartSLOduration=1.9572605809999999 podStartE2EDuration="2.726166827s" podCreationTimestamp="2026-04-24 22:16:35 +0000 UTC" firstStartedPulling="2026-04-24 22:16:36.438686153 +0000 UTC m=+2981.718533911" lastFinishedPulling="2026-04-24 22:16:37.207592398 +0000 UTC m=+2982.487440157" observedRunningTime="2026-04-24 22:16:37.723351099 +0000 UTC m=+2983.003198874" watchObservedRunningTime="2026-04-24 22:16:37.726166827 +0000 UTC m=+2983.006014612" Apr 24 22:16:38.644418 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:38.644361 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-nk9jw_b7e233db-788d-485b-815f-9cd371ff230e/global-pull-secret-syncer/0.log" Apr 24 22:16:38.722587 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:38.722557 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-589nt_cc88058c-aa01-4dec-b649-e759dc3c5b91/konnectivity-agent/0.log" Apr 24 22:16:38.881903 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:38.881865 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-201.ec2.internal_bcf950b6b9eef658d8ab20236281bc71/haproxy/0.log" Apr 24 22:16:41.858064 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:41.858029 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b8f4f9a6-088c-4e63-bab7-672714f996a1/alertmanager/0.log" Apr 24 22:16:41.888659 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:41.888622 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b8f4f9a6-088c-4e63-bab7-672714f996a1/config-reloader/0.log" Apr 24 22:16:41.917619 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:41.917590 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b8f4f9a6-088c-4e63-bab7-672714f996a1/kube-rbac-proxy-web/0.log" Apr 24 22:16:41.949194 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:41.949155 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b8f4f9a6-088c-4e63-bab7-672714f996a1/kube-rbac-proxy/0.log" Apr 24 22:16:41.979658 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:41.979632 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b8f4f9a6-088c-4e63-bab7-672714f996a1/kube-rbac-proxy-metric/0.log" Apr 24 22:16:42.004767 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:42.004735 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b8f4f9a6-088c-4e63-bab7-672714f996a1/prom-label-proxy/0.log" Apr 24 22:16:42.034187 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:42.034119 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b8f4f9a6-088c-4e63-bab7-672714f996a1/init-config-reloader/0.log" Apr 24 22:16:42.098071 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:42.096841 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-5fl99_6d3fdf4f-a1c2-4d88-9531-85052b8a2f90/cluster-monitoring-operator/0.log" Apr 24 22:16:42.327471 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:42.327441 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9t7t2_499e5975-74b1-4afc-9a86-a012675aa62d/node-exporter/0.log" Apr 24 22:16:42.351008 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:42.350973 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9t7t2_499e5975-74b1-4afc-9a86-a012675aa62d/kube-rbac-proxy/0.log" Apr 24 22:16:42.374147 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:42.374111 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9t7t2_499e5975-74b1-4afc-9a86-a012675aa62d/init-textfile/0.log" Apr 24 22:16:42.789207 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:42.789170 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-ztbnl_6cab315a-9130-4a94-88ff-6ef8e5291d77/prometheus-operator-admission-webhook/0.log" Apr 24 22:16:42.819193 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:42.819158 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-559ff94dc9-gj59f_4f2fd1f8-630e-43fa-ae70-146673d2898e/telemeter-client/0.log" Apr 24 22:16:42.843174 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:42.843140 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-559ff94dc9-gj59f_4f2fd1f8-630e-43fa-ae70-146673d2898e/reload/0.log" Apr 24 22:16:42.866584 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:42.866551 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-559ff94dc9-gj59f_4f2fd1f8-630e-43fa-ae70-146673d2898e/kube-rbac-proxy/0.log" Apr 24 22:16:45.028821 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.028775 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cddbf496b-qxs8r_a1a97060-50fd-41b0-abc2-a5a8a845b124/console/0.log" Apr 24 22:16:45.062566 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.062536 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-jrtmd_c85fb73a-d28f-47c4-8a91-b0890eced33b/download-server/0.log" Apr 24 22:16:45.483815 
ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.483741 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-vdbs8_4c0b984d-cadd-403a-96c1-59b9b9d27f65/volume-data-source-validator/0.log" Apr 24 22:16:45.754596 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.754565 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9"] Apr 24 22:16:45.759306 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.759284 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" Apr 24 22:16:45.771864 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.771840 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9"] Apr 24 22:16:45.789677 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.789641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxnfp\" (UniqueName: \"kubernetes.io/projected/445179fc-8c39-427a-900a-485235f617ca-kube-api-access-nxnfp\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" Apr 24 22:16:45.789855 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.789695 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/445179fc-8c39-427a-900a-485235f617ca-lib-modules\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" Apr 24 22:16:45.789855 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.789768 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/445179fc-8c39-427a-900a-485235f617ca-sys\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" Apr 24 22:16:45.789855 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.789836 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/445179fc-8c39-427a-900a-485235f617ca-podres\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" Apr 24 22:16:45.790030 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.789880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/445179fc-8c39-427a-900a-485235f617ca-proc\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" Apr 24 22:16:45.891419 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.891385 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/445179fc-8c39-427a-900a-485235f617ca-podres\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" Apr 24 22:16:45.891602 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.891430 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/445179fc-8c39-427a-900a-485235f617ca-proc\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" Apr 24 22:16:45.891602 ip-10-0-136-201 
kubenswrapper[2573]: I0424 22:16:45.891486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxnfp\" (UniqueName: \"kubernetes.io/projected/445179fc-8c39-427a-900a-485235f617ca-kube-api-access-nxnfp\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" Apr 24 22:16:45.891602 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.891509 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/445179fc-8c39-427a-900a-485235f617ca-lib-modules\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" Apr 24 22:16:45.891602 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.891540 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/445179fc-8c39-427a-900a-485235f617ca-podres\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" Apr 24 22:16:45.891602 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.891551 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/445179fc-8c39-427a-900a-485235f617ca-sys\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" Apr 24 22:16:45.891602 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.891572 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/445179fc-8c39-427a-900a-485235f617ca-proc\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " 
pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9"
Apr 24 22:16:45.891602 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.891592 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/445179fc-8c39-427a-900a-485235f617ca-sys\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9"
Apr 24 22:16:45.891980 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.891664 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/445179fc-8c39-427a-900a-485235f617ca-lib-modules\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9"
Apr 24 22:16:45.901036 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:45.901007 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxnfp\" (UniqueName: \"kubernetes.io/projected/445179fc-8c39-427a-900a-485235f617ca-kube-api-access-nxnfp\") pod \"perf-node-gather-daemonset-wn6f9\" (UID: \"445179fc-8c39-427a-900a-485235f617ca\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9"
Apr 24 22:16:46.072232 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:46.072143 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9"
Apr 24 22:16:46.214393 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:46.214359 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9"]
Apr 24 22:16:46.217623 ip-10-0-136-201 kubenswrapper[2573]: W0424 22:16:46.217586 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod445179fc_8c39_427a_900a_485235f617ca.slice/crio-8d2580eb1733bc6c5e83a3592253cee4609e89e4b884e751ddaf939c95b2ab9a WatchSource:0}: Error finding container 8d2580eb1733bc6c5e83a3592253cee4609e89e4b884e751ddaf939c95b2ab9a: Status 404 returned error can't find the container with id 8d2580eb1733bc6c5e83a3592253cee4609e89e4b884e751ddaf939c95b2ab9a
Apr 24 22:16:46.240798 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:46.240772 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-n6xn7_a8f3bdc0-c9cc-4161-9c81-77828c331c3b/dns/0.log"
Apr 24 22:16:46.263303 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:46.263283 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-n6xn7_a8f3bdc0-c9cc-4161-9c81-77828c331c3b/kube-rbac-proxy/0.log"
Apr 24 22:16:46.386834 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:46.386760 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zvvmj_fc418569-4514-4b49-bd55-839ecdb097d5/dns-node-resolver/0.log"
Apr 24 22:16:46.750731 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:46.750701 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" event={"ID":"445179fc-8c39-427a-900a-485235f617ca","Type":"ContainerStarted","Data":"f7d562e2b99115ec89a6f22b24222d401978037a68b6feda576af9511acafc99"}
Apr 24 22:16:46.750903 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:46.750741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" event={"ID":"445179fc-8c39-427a-900a-485235f617ca","Type":"ContainerStarted","Data":"8d2580eb1733bc6c5e83a3592253cee4609e89e4b884e751ddaf939c95b2ab9a"}
Apr 24 22:16:46.750903 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:46.750834 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9"
Apr 24 22:16:46.769528 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:46.769480 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9" podStartSLOduration=1.7694661630000001 podStartE2EDuration="1.769466163s" podCreationTimestamp="2026-04-24 22:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:16:46.767085313 +0000 UTC m=+2992.046933093" watchObservedRunningTime="2026-04-24 22:16:46.769466163 +0000 UTC m=+2992.049313942"
Apr 24 22:16:46.789455 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:46.789428 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-869bd4699f-pxgbj_24138133-faf2-4505-9acb-d85e76bf96d3/registry/0.log"
Apr 24 22:16:46.836196 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:46.836169 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-q9bbt_313a846d-a4f1-459e-b416-a695b875548d/node-ca/0.log"
Apr 24 22:16:47.595184 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:47.595153 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-69dd467946-567hg_ed5b5e5f-f500-4262-a6bb-2772c51e47b0/router/0.log"
Apr 24 22:16:47.943704 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:47.943630 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6l98d_ff600116-8b92-45dd-8c1f-07b5c9151008/serve-healthcheck-canary/0.log"
Apr 24 22:16:48.314471 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:48.314439 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-lm6jb_73ca8074-c925-4c71-a52a-9bdc355c56df/insights-operator/0.log"
Apr 24 22:16:48.315079 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:48.315055 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-lm6jb_73ca8074-c925-4c71-a52a-9bdc355c56df/insights-operator/1.log"
Apr 24 22:16:48.461612 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:48.461586 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xwrsb_511a2fb0-bcb1-4164-92c0-072aaaa01cf3/kube-rbac-proxy/0.log"
Apr 24 22:16:48.481505 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:48.481476 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xwrsb_511a2fb0-bcb1-4164-92c0-072aaaa01cf3/exporter/0.log"
Apr 24 22:16:48.501605 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:48.501578 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xwrsb_511a2fb0-bcb1-4164-92c0-072aaaa01cf3/extractor/0.log"
Apr 24 22:16:50.449514 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:50.449478 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84b6647887-nxnwd_38ce435e-8954-419b-90ed-d616f45f2f59/manager/0.log"
Apr 24 22:16:50.491616 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:50.491587 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-f8dp4_b042dc72-bdf2-4b0a-9f5a-c15cd5f6f1f5/server/0.log"
Apr 24 22:16:50.775404 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:50.775372 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-dtvqv_d5f7c2e0-03d7-4431-9edd-91dc7f3bf016/manager/0.log"
Apr 24 22:16:50.824818 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:50.824789 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-48bcc_43e51c33-1433-4f75-95b7-aac90eba0279/seaweedfs/0.log"
Apr 24 22:16:52.766777 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:52.766440 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-wn6f9"
Apr 24 22:16:55.019690 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:55.018889 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-7twhd_a11ab57b-145c-4043-bbce-507e3d1017ec/kube-storage-version-migrator-operator/1.log"
Apr 24 22:16:55.020194 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:55.020017 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-7twhd_a11ab57b-145c-4043-bbce-507e3d1017ec/kube-storage-version-migrator-operator/0.log"
Apr 24 22:16:55.495893 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:55.494776 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 22:16:55.515042 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:55.513868 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 22:16:56.012227 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:56.012195 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9zk7d_fe0039f5-a4d2-42e2-86f7-be764dcf37fd/kube-multus/0.log"
Apr 24 22:16:56.036053 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:56.036028 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjtjf_c8bc6b6e-12b3-4cd5-83f8-09faef4eb787/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:16:56.057655 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:56.057627 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjtjf_c8bc6b6e-12b3-4cd5-83f8-09faef4eb787/egress-router-binary-copy/0.log"
Apr 24 22:16:56.078912 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:56.078887 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjtjf_c8bc6b6e-12b3-4cd5-83f8-09faef4eb787/cni-plugins/0.log"
Apr 24 22:16:56.099732 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:56.099708 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjtjf_c8bc6b6e-12b3-4cd5-83f8-09faef4eb787/bond-cni-plugin/0.log"
Apr 24 22:16:56.119931 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:56.119895 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjtjf_c8bc6b6e-12b3-4cd5-83f8-09faef4eb787/routeoverride-cni/0.log"
Apr 24 22:16:56.140529 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:56.140505 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjtjf_c8bc6b6e-12b3-4cd5-83f8-09faef4eb787/whereabouts-cni-bincopy/0.log"
Apr 24 22:16:56.165657 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:56.165623 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cjtjf_c8bc6b6e-12b3-4cd5-83f8-09faef4eb787/whereabouts-cni/0.log"
Apr 24 22:16:56.834700 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:56.834663 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jrhlr_932901de-5edd-4054-b5df-89077b36dd14/network-metrics-daemon/0.log"
Apr 24 22:16:56.852930 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:56.852889 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jrhlr_932901de-5edd-4054-b5df-89077b36dd14/kube-rbac-proxy/0.log"
Apr 24 22:16:58.042045 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:58.042016 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-controller/0.log"
Apr 24 22:16:58.058605 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:58.058576 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/0.log"
Apr 24 22:16:58.072501 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:58.072473 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovn-acl-logging/1.log"
Apr 24 22:16:58.094735 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:58.094707 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/kube-rbac-proxy-node/0.log"
Apr 24 22:16:58.117309 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:58.117276 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:16:58.133867 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:58.133846 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/northd/0.log"
Apr 24 22:16:58.155413 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:58.155388 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/nbdb/0.log"
Apr 24 22:16:58.185485 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:58.185430 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/sbdb/0.log"
Apr 24 22:16:58.305470 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:58.305394 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz6wr_58e1e5f8-ee7a-4e0f-87fc-b18349e28725/ovnkube-controller/0.log"
Apr 24 22:16:59.726233 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:16:59.726204 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-lkml2_8b12b6f6-d28f-4cda-9380-4efcca507494/network-check-target-container/0.log"
Apr 24 22:17:00.729149 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:17:00.729120 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-w7rcc_2d353477-699a-4613-82c7-27ddf9ec3b73/iptables-alerter/0.log"
Apr 24 22:17:01.376352 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:17:01.376319 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-djtbw_57fbfaf0-c0df-4a08-9387-11b04cf5ba29/tuned/0.log"