Apr 16 13:56:46.014965 ip-10-0-140-244 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:56:46.014979 ip-10-0-140-244 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:56:46.014989 ip-10-0-140-244 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:56:46.015335 ip-10-0-140-244 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:56:57.220839 ip-10-0-140-244 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:56:57.220859 ip-10-0-140-244 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a618d276985e4080b90a07c98c8ec8e3 --
Apr 16 13:59:23.059673 ip-10-0-140-244 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:59:23.531652 ip-10-0-140-244 kubenswrapper[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:23.531652 ip-10-0-140-244 kubenswrapper[2564]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:59:23.531652 ip-10-0-140-244 kubenswrapper[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:23.531652 ip-10-0-140-244 kubenswrapper[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:59:23.531652 ip-10-0-140-244 kubenswrapper[2564]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:23.534871 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.534730 2564 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:59:23.541768 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541745 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:23.541768 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541765 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:23.541768 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541769 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:23.541768 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541772 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:23.541768 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541775 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541779 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541782 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541785 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541787 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541790 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541793 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541797 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541799 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541802 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541805 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541808 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541811 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541814 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541817 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541819 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541822 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541824 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541827 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:23.541966 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541832 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541835 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541838 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541840 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541843 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541845 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541848 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541851 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541853 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541856 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541859 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541861 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541864 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541866 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541869 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541872 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541877 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541879 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541882 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541885 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541887 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:23.542500 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541890 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541893 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541895 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541898 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541902 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541907 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541910 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541912 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541915 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541917 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541920 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541923 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541927 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541931 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541934 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541937 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541940 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541943 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541945 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:23.542987 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541948 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541950 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541953 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541955 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541958 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541960 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541963 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541966 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541969 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541972 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541975 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541977 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541981 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541984 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541987 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541991 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541994 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.541997 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.542000 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:23.543462 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.542002 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.542005 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.542008 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.542010 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543738 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543745 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543749 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543752 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543755 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543758 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543761 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543764 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543767 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543770 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543773 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543776 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543778 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543781 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543783 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543786 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543788 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:23.544068 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543791 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543793 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543796 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543798 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543801 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543803 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543806 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543809 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543813 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543817 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543820 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543823 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543825 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543828 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543832 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543835 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543838 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543840 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543843 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:23.544571 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543846 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543848 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543851 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543854 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543857 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543860 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543862 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543865 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543867 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543870 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543872 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543875 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543877 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543880 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543882 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543885 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543888 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543890 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543893 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543895 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:23.545045 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543898 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543908 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543911 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543914 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543916 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543919 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543922 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543927 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543930 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543933 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543936 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543939 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543941 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543944 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543946 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543949 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543951 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543954 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543956 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:23.545541 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543959 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543961 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543964 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543967 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543970 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543973 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543975 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543978 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543981 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543983 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.543986 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544057 2564 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544064 2564 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544070 2564 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544074 2564 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544079 2564 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544083 2564 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544087 2564 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544092 2564 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544095 2564 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544099 2564 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:59:23.546001 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544103 2564 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544106 2564 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544109 2564 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544112 2564 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544115 2564 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544118 2564 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544121 2564 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544124 2564 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544127 2564 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544131 2564 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544134 2564 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544137 2564 flags.go:64] FLAG: --config-dir=""
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544140 2564 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544144 2564 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544148 2564 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544151 2564 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544154 2564 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544158 2564 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544161 2564 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544164 2564 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544167 2564 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544170 2564 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544174 2564 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544178 2564 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544181 2564 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:59:23.546545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544184 2564 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544187 2564 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544191 2564 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544194 2564 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544214 2564 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544217 2564 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544220 2564 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544223 2564 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544227 2564 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544231 2564 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544234 2564 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544237 2564 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544240 2564 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544243 2564 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544246 2564 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544249 2564 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544252 2564 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544255 2564 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544258 2564 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544260 2564 flags.go:64] FLAG: --feature-gates=""
Apr 16 13:59:23.547145 ip-10-0-140-244
kubenswrapper[2564]: I0416 13:59:23.544265 2564 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544268 2564 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544271 2564 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544274 2564 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544277 2564 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:59:23.547145 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544280 2564 flags.go:64] FLAG: --help="false" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544283 2564 flags.go:64] FLAG: --hostname-override="ip-10-0-140-244.ec2.internal" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544287 2564 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544290 2564 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544293 2564 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544296 2564 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544300 2564 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544303 2564 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544305 2564 flags.go:64] FLAG: 
--image-service-endpoint="" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544308 2564 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544312 2564 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544315 2564 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544319 2564 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544335 2564 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544339 2564 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544342 2564 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544346 2564 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544349 2564 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544352 2564 flags.go:64] FLAG: --lock-file="" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544355 2564 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544358 2564 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544361 2564 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544366 2564 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:59:23.547766 ip-10-0-140-244 kubenswrapper[2564]: 
I0416 13:59:23.544369 2564 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544372 2564 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544375 2564 flags.go:64] FLAG: --logging-format="text" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544378 2564 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544381 2564 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544385 2564 flags.go:64] FLAG: --manifest-url="" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544388 2564 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544392 2564 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544396 2564 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544400 2564 flags.go:64] FLAG: --max-pods="110" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544403 2564 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544406 2564 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544409 2564 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544412 2564 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544415 2564 flags.go:64] FLAG: 
--minimum-image-ttl-duration="2m0s" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544418 2564 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544421 2564 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544429 2564 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544432 2564 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544435 2564 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544438 2564 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544442 2564 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544447 2564 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544450 2564 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:59:23.548408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544453 2564 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544457 2564 flags.go:64] FLAG: --port="10250" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544460 2564 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544462 2564 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c0d5d6c3a7f8ca08" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 
13:59:23.544466 2564 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544468 2564 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544471 2564 flags.go:64] FLAG: --register-node="true" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544474 2564 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544477 2564 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544481 2564 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544483 2564 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544486 2564 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544489 2564 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544493 2564 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544496 2564 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544504 2564 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544507 2564 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544510 2564 flags.go:64] FLAG: --runonce="false" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544513 2564 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 
13:59:23.544516 2564 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544519 2564 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544522 2564 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544524 2564 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544527 2564 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544530 2564 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544534 2564 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:59:23.549022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544537 2564 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544540 2564 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544542 2564 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544546 2564 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544549 2564 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544552 2564 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544555 2564 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544560 2564 flags.go:64] FLAG: 
--system-reserved-cgroup="" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544563 2564 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544565 2564 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544571 2564 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544574 2564 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544577 2564 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544581 2564 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544584 2564 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544587 2564 flags.go:64] FLAG: --v="2" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544592 2564 flags.go:64] FLAG: --version="false" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544595 2564 flags.go:64] FLAG: --vmodule="" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544600 2564 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544604 2564 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544708 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544714 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:23.549662 ip-10-0-140-244 
kubenswrapper[2564]: W0416 13:59:23.544717 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544720 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:23.549662 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544723 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544726 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544729 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544732 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544735 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544737 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544740 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544743 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544745 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544748 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544750 2564 feature_gate.go:328] 
unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544753 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544756 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544758 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544761 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544764 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544766 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544769 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544772 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544774 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:23.550277 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544777 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544779 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544782 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:23.550782 ip-10-0-140-244 
kubenswrapper[2564]: W0416 13:59:23.544785 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544788 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544790 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544794 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544798 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544801 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544805 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544808 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544811 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544814 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544817 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544819 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544822 2564 feature_gate.go:328] unrecognized 
feature gate: VSphereMultiNetworks Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544824 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544827 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544830 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544832 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:23.550782 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544835 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544837 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544839 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544842 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544845 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544847 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544850 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544853 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 
13:59:23.544856 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544858 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544861 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544863 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544866 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544868 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544871 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544874 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544878 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544881 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544883 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:23.551283 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544887 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544890 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544895 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544898 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544900 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544903 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544905 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544908 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544910 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544913 2564 
feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544916 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544918 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544921 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544924 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544926 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544929 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544932 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544934 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544937 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544940 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:23.551742 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544943 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.544946 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:23.552333 ip-10-0-140-244 
kubenswrapper[2564]: W0416 13:59:23.544948 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.544956 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.551302 2564 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.551319 2564 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551367 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551372 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551375 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551378 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551382 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551386 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551389 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551392 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551395 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551398 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:23.552333 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551401 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551404 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551406 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551409 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551412 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551415 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551417 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551421 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551423 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551426 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551429 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551431 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551434 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551436 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551439 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551442 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551445 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551448 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551450 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551453 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:23.552728 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551456 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551459 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551462 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551465 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551467 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551470 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551473 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551475 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551478 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551480 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551483 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551485 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551488 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551491 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551493 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551496 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551499 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551501 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551504 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:23.553237 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551506 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551509 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551513 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551516 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551520 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551522 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551525 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551528 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551531 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551535 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551538 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551540 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551543 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551545 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551549 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551552 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551555 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551559 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551562 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:23.553694 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551565 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551567 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551570 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551572 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551575 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551577 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551580 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551583 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551585 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551588 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551590 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551593 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551595 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551597 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551600 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551602 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551605 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:23.554156 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551607 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.551612 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551710 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551715 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551718 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551720 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551728 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551731 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551733 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551736 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551739 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551742 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551744 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551747 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551750 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551752 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:23.554570 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551755 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551758 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551760 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551763 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551766 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551768 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551771 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551773 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551776 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551778 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551781 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551783 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551786 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551789 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551791 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551795 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551797 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551800 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551802 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551805 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:23.554962 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551808 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551810 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551813 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551819 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551822 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551824 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551827 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551830 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551832 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551835 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551838 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551840 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551843 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551847 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551850 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551853 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551855 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551858 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551860 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:23.555459 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551863 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551865 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551868 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551870 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551873 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551875 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551878 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551880 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551883 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551885 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551888 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551890 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551893 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551895 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551898 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551909 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551912 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551918 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551921 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551924 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:23.555963 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551926 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551929 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551932 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551936 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551938 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551941 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551944 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551946 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551949 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551951 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551954 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551956 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:23.551959 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.551964 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:23.556560 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.552682 2564 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 13:59:23.556913 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.555756 2564 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 13:59:23.557051 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.557037 2564 server.go:1019] "Starting client certificate rotation"
Apr 16 13:59:23.557167 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.557152 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:23.557215 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.557184 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:23.584877 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.584853 2564 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:23.589544 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.589521 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:23.609605 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.609579 2564 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:59:23.616305 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.616286 2564 log.go:25] "Validated CRI v1 image API"
Apr 16 13:59:23.617541 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.617516 2564 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:59:23.617635 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.617601 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:23.621509 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.621482 2564 fs.go:135] Filesystem UUIDs: map[1c1e9204-9484-400e-9301-f890a154fff2:/dev/nvme0n1p4 2aab659e-4765-4d2e-ad1e-a80d4c0921c5:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 13:59:23.621571 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.621509 2564 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:59:23.626688 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.626577 2564 manager.go:217] Machine: {Timestamp:2026-04-16 13:59:23.625288039 +0000 UTC m=+0.444653381 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3055517 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28cfda5d4bb0aba3d5eabaef6263e3 SystemUUID:ec28cfda-5d4b-b0ab-a3d5-eabaef6263e3 BootID:a618d276-985e-4080-b90a-07c98c8ec8e3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:21:5d:56:97:57 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:21:5d:56:97:57 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:aa:00:f2:d6:4a:85 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:59:23.626688 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.626681 2564 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:59:23.626796 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.626765 2564 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:59:23.628390 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.628366 2564 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:59:23.628552 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.628393 2564 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-244.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 13:59:23.628595 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.628564 2564 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 13:59:23.628595 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.628574 2564 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 13:59:23.628595 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.628592
2564 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:23.628682 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.628602 2564 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:23.630683 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.630671 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:23.630798 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.630789 2564 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 13:59:23.633254 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.633196 2564 kubelet.go:491] "Attempting to sync node with API server" Apr 16 13:59:23.633305 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.633268 2564 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 13:59:23.633305 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.633284 2564 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 13:59:23.633305 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.633294 2564 kubelet.go:397] "Adding apiserver pod source" Apr 16 13:59:23.633305 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.633302 2564 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 13:59:23.634501 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.634489 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:23.634549 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.634509 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:23.637394 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.637379 2564 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 13:59:23.637851 ip-10-0-140-244 
kubenswrapper[2564]: I0416 13:59:23.637835 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hfb6d" Apr 16 13:59:23.639474 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.639459 2564 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:59:23.640892 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.640866 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:59:23.640892 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.640884 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:59:23.640892 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.640890 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:59:23.640892 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.640896 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:59:23.640892 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.640902 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:59:23.641106 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.640908 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:59:23.641106 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.640913 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 13:59:23.641106 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.640919 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:59:23.641106 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.640938 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:59:23.641106 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.640945 2564 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:59:23.641106 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.640960 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 13:59:23.641106 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.640969 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:59:23.641877 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.641867 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:59:23.641877 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.641878 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 13:59:23.645254 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.645239 2564 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-244.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 13:59:23.645515 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.645500 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:59:23.645595 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.645533 2564 server.go:1295] "Started kubelet" Apr 16 13:59:23.645595 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.645574 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-244.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:59:23.645595 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.645580 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" 
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 13:59:23.645727 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.645633 2564 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:59:23.645727 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.645642 2564 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:59:23.645727 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.645703 2564 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:59:23.646182 ip-10-0-140-244 systemd[1]: Started Kubernetes Kubelet. Apr 16 13:59:23.646887 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.646868 2564 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:59:23.646957 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.646940 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hfb6d" Apr 16 13:59:23.647059 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.646975 2564 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:59:23.656195 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.656161 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:59:23.656721 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.656708 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:59:23.657443 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.657417 2564 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:59:23.657443 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.657443 2564 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:59:23.657580 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.657499 2564 desired_state_of_world_populator.go:150] "Desired state 
populator starts to run" Apr 16 13:59:23.657580 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.657555 2564 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:59:23.657580 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.657566 2564 reconciler.go:26] "Reconciler: start to sync state" Apr 16 13:59:23.657742 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.657644 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-244.ec2.internal\" not found" Apr 16 13:59:23.660992 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.660955 2564 factory.go:55] Registering systemd factory Apr 16 13:59:23.660992 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.660976 2564 factory.go:223] Registration of the systemd container factory successfully Apr 16 13:59:23.661241 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.661228 2564 factory.go:153] Registering CRI-O factory Apr 16 13:59:23.661319 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.661311 2564 factory.go:223] Registration of the crio container factory successfully Apr 16 13:59:23.661505 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.661495 2564 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 13:59:23.661602 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.661594 2564 factory.go:103] Registering Raw factory Apr 16 13:59:23.661673 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.661666 2564 manager.go:1196] Started watching for new ooms in manager Apr 16 13:59:23.662139 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.662122 2564 manager.go:319] Starting recovery of all containers Apr 16 13:59:23.662669 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.662639 2564 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 13:59:23.665585 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.665464 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:23.668864 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.668835 2564 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-244.ec2.internal\" not found" node="ip-10-0-140-244.ec2.internal" Apr 16 13:59:23.670997 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.670984 2564 manager.go:324] Recovery completed Apr 16 13:59:23.675020 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.675008 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:23.678334 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.678319 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:23.678414 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.678346 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:23.678414 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.678358 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:23.678851 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.678838 2564 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 13:59:23.678851 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.678850 2564 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 13:59:23.678928 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.678865 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:23.681645 
ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.681632 2564 policy_none.go:49] "None policy: Start" Apr 16 13:59:23.681693 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.681650 2564 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 13:59:23.681693 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.681661 2564 state_mem.go:35] "Initializing new in-memory state store" Apr 16 13:59:23.729125 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.726851 2564 manager.go:341] "Starting Device Plugin manager" Apr 16 13:59:23.729125 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.726893 2564 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 13:59:23.729125 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.726904 2564 server.go:85] "Starting device plugin registration server" Apr 16 13:59:23.729125 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.727120 2564 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 13:59:23.729125 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.727130 2564 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 13:59:23.729125 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.727327 2564 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 13:59:23.729125 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.727407 2564 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 13:59:23.729125 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.727415 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 13:59:23.729125 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.727762 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 13:59:23.729125 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.727802 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-244.ec2.internal\" not found" Apr 16 13:59:23.767785 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.767752 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 13:59:23.768974 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.768958 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 13:59:23.769037 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.768991 2564 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 13:59:23.769037 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.769015 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 13:59:23.769037 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.769024 2564 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 13:59:23.769143 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.769121 2564 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 13:59:23.771196 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.771175 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:23.828061 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.828031 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:23.829831 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.829807 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:23.829944 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.829838 2564 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:23.829944 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.829850 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:23.829944 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.829874 2564 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-244.ec2.internal" Apr 16 13:59:23.840189 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.840166 2564 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-244.ec2.internal" Apr 16 13:59:23.840300 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.840191 2564 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-244.ec2.internal\": node \"ip-10-0-140-244.ec2.internal\" not found" Apr 16 13:59:23.860336 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.860309 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-244.ec2.internal\" not found" Apr 16 13:59:23.869553 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.869513 2564 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-244.ec2.internal"] Apr 16 13:59:23.869661 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.869600 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:23.870982 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.870968 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:23.871036 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.870997 2564 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:23.871036 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.871008 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:23.872395 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.872383 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:23.872559 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.872544 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" Apr 16 13:59:23.872597 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.872572 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:23.873078 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.873063 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:23.873152 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.873089 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:23.873152 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.873104 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:23.873911 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.873894 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:23.873998 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.873924 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 16 13:59:23.873998 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.873934 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:23.874719 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.874704 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-244.ec2.internal" Apr 16 13:59:23.874774 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.874728 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:23.875462 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.875446 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:23.875536 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.875470 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:23.875536 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.875484 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:23.903864 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.903845 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-244.ec2.internal\" not found" node="ip-10-0-140-244.ec2.internal" Apr 16 13:59:23.908130 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.908113 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-244.ec2.internal\" not found" node="ip-10-0-140-244.ec2.internal" Apr 16 13:59:23.959787 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.959754 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/63b791a831a4a165ef869d82724ec61e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal\" (UID: \"63b791a831a4a165ef869d82724ec61e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" Apr 16 13:59:23.959787 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.959784 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63b791a831a4a165ef869d82724ec61e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal\" (UID: \"63b791a831a4a165ef869d82724ec61e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" Apr 16 13:59:23.959984 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:23.959801 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ef025c6bf68cc466efdd3ff573ac22e-config\") pod \"kube-apiserver-proxy-ip-10-0-140-244.ec2.internal\" (UID: \"0ef025c6bf68cc466efdd3ff573ac22e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-244.ec2.internal" Apr 16 13:59:23.960787 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:23.960769 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-244.ec2.internal\" not found" Apr 16 13:59:24.060857 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.060824 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/63b791a831a4a165ef869d82724ec61e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal\" (UID: \"63b791a831a4a165ef869d82724ec61e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" Apr 16 13:59:24.061016 
ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.060861 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/63b791a831a4a165ef869d82724ec61e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal\" (UID: \"63b791a831a4a165ef869d82724ec61e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" Apr 16 13:59:24.061016 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:24.060828 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-244.ec2.internal\" not found" Apr 16 13:59:24.061016 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.060868 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63b791a831a4a165ef869d82724ec61e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal\" (UID: \"63b791a831a4a165ef869d82724ec61e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" Apr 16 13:59:24.061016 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.060904 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ef025c6bf68cc466efdd3ff573ac22e-config\") pod \"kube-apiserver-proxy-ip-10-0-140-244.ec2.internal\" (UID: \"0ef025c6bf68cc466efdd3ff573ac22e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-244.ec2.internal" Apr 16 13:59:24.061016 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.060923 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ef025c6bf68cc466efdd3ff573ac22e-config\") pod \"kube-apiserver-proxy-ip-10-0-140-244.ec2.internal\" (UID: \"0ef025c6bf68cc466efdd3ff573ac22e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-244.ec2.internal" Apr 16 13:59:24.061016 ip-10-0-140-244 
kubenswrapper[2564]: I0416 13:59:24.060926 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63b791a831a4a165ef869d82724ec61e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal\" (UID: \"63b791a831a4a165ef869d82724ec61e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" Apr 16 13:59:24.161611 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:24.161540 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-244.ec2.internal\" not found" Apr 16 13:59:24.207744 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.207709 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" Apr 16 13:59:24.210622 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.210609 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-244.ec2.internal"
Apr 16 13:59:24.261759 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:24.261726    2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-244.ec2.internal\" not found"
Apr 16 13:59:24.362121 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:24.362087    2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-244.ec2.internal\" not found"
Apr 16 13:59:24.462609 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:24.462531    2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-244.ec2.internal\" not found"
Apr 16 13:59:24.556818 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.556776    2564 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 13:59:24.557515 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.556972    2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:24.557515 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.556984    2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:24.563018 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:24.562996    2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-244.ec2.internal\" not found"
Apr 16 13:59:24.650476 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.650446    2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:54:23 +0000 UTC" deadline="2027-11-05 16:47:02.051710575 +0000 UTC"
Apr 16 13:59:24.650476 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.650474    2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13634h47m37.401239488s"
Apr 16 13:59:24.656488 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.656473    2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:59:24.663632 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:24.663613    2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-244.ec2.internal\" not found"
Apr 16 13:59:24.668577 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.668557    2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:24.688509 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.688488    2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vz8bf"
Apr 16 13:59:24.696581 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.696562    2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vz8bf"
Apr 16 13:59:24.764382 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:24.764324    2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-244.ec2.internal\" not found"
Apr 16 13:59:24.864561 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:24.864524    2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-244.ec2.internal\" not found"
Apr 16 13:59:24.887142 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.887114    2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:24.954510 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.954486    2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:24.957747 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.957729    2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal"
Apr 16 13:59:24.969501 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.969482    2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:24.970293 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.970281    2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-244.ec2.internal"
Apr 16 13:59:24.976692 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:24.976679    2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:25.154105 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:25.153944    2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ef025c6bf68cc466efdd3ff573ac22e.slice/crio-4966113eedb38bcb2025091a9c7212830f066057bcd356639ce6a42e01bc2af7 WatchSource:0}: Error finding container 4966113eedb38bcb2025091a9c7212830f066057bcd356639ce6a42e01bc2af7: Status 404 returned error can't find the container with id 4966113eedb38bcb2025091a9c7212830f066057bcd356639ce6a42e01bc2af7
Apr 16 13:59:25.158586 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.158572    2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:59:25.322315 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:25.322287    2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63b791a831a4a165ef869d82724ec61e.slice/crio-07f8173781d1da07d16a43bb4fd67a65784b42639516bcbcc886297bca1089c2 WatchSource:0}: Error finding container 07f8173781d1da07d16a43bb4fd67a65784b42639516bcbcc886297bca1089c2: Status 404 returned error can't find the container with id 07f8173781d1da07d16a43bb4fd67a65784b42639516bcbcc886297bca1089c2
Apr 16 13:59:25.634773 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.634741    2564 apiserver.go:52] "Watching apiserver"
Apr 16 13:59:25.639898 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.639875    2564 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 13:59:25.641070 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.641051    2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-7fsvb","openshift-image-registry/node-ca-m6p2z","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal","openshift-multus/multus-additional-cni-plugins-mxtjl","openshift-network-operator/iptables-alerter-zfvvf","openshift-ovn-kubernetes/ovnkube-node-cc429","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f","openshift-dns/node-resolver-qzz48","openshift-multus/multus-926vk","openshift-multus/network-metrics-daemon-5g5kq","openshift-network-diagnostics/network-check-target-mmfcz","kube-system/konnectivity-agent-rnkfc","kube-system/kube-apiserver-proxy-ip-10-0-140-244.ec2.internal"]
Apr 16 13:59:25.644409 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.644394    2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f"
Apr 16 13:59:25.644478 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.644464    2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-m6p2z"
Apr 16 13:59:25.647545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.645971    2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mxtjl"
Apr 16 13:59:25.647545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.647371    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 13:59:25.647545 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.647452    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.647744 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.647599    2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 13:59:25.648228 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.647894    2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xbllr\""
Apr 16 13:59:25.648228 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.647917    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.648228 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.647949    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.648228 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.648084    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.648228 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.648101    2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mj4m8\""
Apr 16 13:59:25.648228 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.648185    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 13:59:25.648541 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.648469    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.648851 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.648807    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 13:59:25.648957 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.648867    2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-n5s5b\""
Apr 16 13:59:25.648957 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.648839    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 13:59:25.648957 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.648939    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.649158 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.649069    2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zfvvf"
Apr 16 13:59:25.649945 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.649580    2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.651050 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.651031    2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7fsvb"
Apr 16 13:59:25.651730 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.651711    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.651878 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.651857    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.651878 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.651859    2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vqjl5\""
Apr 16 13:59:25.652027 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.651890    2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 13:59:25.652428 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.652410    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 13:59:25.652505 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.652475    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 13:59:25.652505 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.652481    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 13:59:25.652505 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.652499    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.652719 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.652707    2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hgr6q\""
Apr 16 13:59:25.653219 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.653189    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.653408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.653390    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 13:59:25.653484 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.653398    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.653721 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.653701    2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-s8lpb\""
Apr 16 13:59:25.653797 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.653765    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.654086 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.654072    2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qzz48"
Apr 16 13:59:25.654297 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.654237    2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-926vk"
Apr 16 13:59:25.655539 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.655522    2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:25.655626 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:25.655577    2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454"
Apr 16 13:59:25.656195 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.656179    2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-25hf8\""
Apr 16 13:59:25.656268 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.656181    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 13:59:25.656268 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.656220    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.656349 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.656311    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.656431 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.656420    2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wnt4r\""
Apr 16 13:59:25.656859 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.656845    2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:25.656920 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:25.656892    2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470"
Apr 16 13:59:25.658274 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.658256    2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rnkfc"
Apr 16 13:59:25.660339 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.660323    2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9mjv8\""
Apr 16 13:59:25.660448 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.660405    2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 13:59:25.660557 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.660544    2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 13:59:25.666780 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.666764    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9fbd7ea5-6aaf-4f3c-a34c-596936befdcf-konnectivity-ca\") pod \"konnectivity-agent-rnkfc\" (UID: \"9fbd7ea5-6aaf-4f3c-a34c-596936befdcf\") " pod="kube-system/konnectivity-agent-rnkfc"
Apr 16 13:59:25.666851 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.666786    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2qqq\" (UniqueName: \"kubernetes.io/projected/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-kube-api-access-x2qqq\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f"
Apr 16 13:59:25.666851 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.666809    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-multus-cni-dir\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk"
Apr 16 13:59:25.666851 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.666826    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-run-multus-certs\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk"
Apr 16 13:59:25.666851 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.666842    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-kubelet\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667010 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.666863    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-etc-openvswitch\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667010 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.666900    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-run-openvswitch\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667010 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.666948    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-lib-modules\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb"
Apr 16 13:59:25.667010 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.666977    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-var-lib-cni-multus\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk"
Apr 16 13:59:25.667190 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.666998    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e8196db-0ad9-4936-a33e-c935a2815b53-os-release\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl"
Apr 16 13:59:25.667190 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667033    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667190 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667081    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b897edab-8b5c-4c47-bede-ddfcf288c0ea-env-overrides\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667190 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667115    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e84f9b0d-275b-49e4-a053-32d964b9ff96-tmp\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb"
Apr 16 13:59:25.667190 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667142    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-registration-dir\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f"
Apr 16 13:59:25.667190 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667170    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-device-dir\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f"
Apr 16 13:59:25.667452 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667195    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-var-lib-cni-bin\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk"
Apr 16 13:59:25.667452 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667236    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667452 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667264    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-etc-selinux\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f"
Apr 16 13:59:25.667452 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667287    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khvtl\" (UniqueName: \"kubernetes.io/projected/e84f9b0d-275b-49e4-a053-32d964b9ff96-kube-api-access-khvtl\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb"
Apr 16 13:59:25.667452 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667312    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-etc-kubernetes\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk"
Apr 16 13:59:25.667452 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667335    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-node-log\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667452 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667357    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9fbd7ea5-6aaf-4f3c-a34c-596936befdcf-agent-certs\") pod \"konnectivity-agent-rnkfc\" (UID: \"9fbd7ea5-6aaf-4f3c-a34c-596936befdcf\") " pod="kube-system/konnectivity-agent-rnkfc"
Apr 16 13:59:25.667452 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667381    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-sysctl-conf\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb"
Apr 16 13:59:25.667452 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667400    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7e8196db-0ad9-4936-a33e-c935a2815b53-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl"
Apr 16 13:59:25.667452 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667414    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/82414c16-8ba5-4b42-9bf4-8ef65317ee29-iptables-alerter-script\") pod \"iptables-alerter-zfvvf\" (UID: \"82414c16-8ba5-4b42-9bf4-8ef65317ee29\") " pod="openshift-network-operator/iptables-alerter-zfvvf"
Apr 16 13:59:25.667452 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667435    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82414c16-8ba5-4b42-9bf4-8ef65317ee29-host-slash\") pod \"iptables-alerter-zfvvf\" (UID: \"82414c16-8ba5-4b42-9bf4-8ef65317ee29\") " pod="openshift-network-operator/iptables-alerter-zfvvf"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667449    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e8196db-0ad9-4936-a33e-c935a2815b53-cnibin\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667480    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bttw\" (UniqueName: \"kubernetes.io/projected/7e8196db-0ad9-4936-a33e-c935a2815b53-kube-api-access-5bttw\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667498    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-run-netns\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667512    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-run-systemd\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667531    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b897edab-8b5c-4c47-bede-ddfcf288c0ea-ovnkube-config\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667560    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667589    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-systemd\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667612    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-system-cni-dir\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667637    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k4hx\" (UniqueName: \"kubernetes.io/projected/1e3a02e8-43e4-4f89-a584-53fbf95d94cf-kube-api-access-9k4hx\") pod \"node-ca-m6p2z\" (UID: \"1e3a02e8-43e4-4f89-a584-53fbf95d94cf\") " pod="openshift-image-registry/node-ca-m6p2z"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667659    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-cni-bin\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667691    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-os-release\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667712    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-run-netns\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667736    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7e8196db-0ad9-4936-a33e-c935a2815b53-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667766    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-systemd-units\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667814    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-var-lib-openvswitch\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.667922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667838    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-sysconfig\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb"
Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667875    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-socket-dir\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f"
Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667923    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/69e2944f-6e8b-40c7-be64-0a3d77f6c3fd-tmp-dir\") pod \"node-resolver-qzz48\" (UID: \"69e2944f-6e8b-40c7-be64-0a3d77f6c3fd\") " pod="openshift-dns/node-resolver-qzz48"
Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667960    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/38c85021-52e8-4534-bae8-801408d6b6f1-multus-daemon-config\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk"
Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.667990    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-sys\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb"
Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668016    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-run-ovn\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429"
Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668042    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e8196db-0ad9-4936-a33e-c935a2815b53-system-cni-dir\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl"
Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668067    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-kubernetes\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb"
Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668089    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-var-lib-kubelet\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb"
Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668111    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-run-k8s-cni-cncf-io\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk"
Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668132    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-var-lib-kubelet\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk"
Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668154    2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName:
\"kubernetes.io/secret/b897edab-8b5c-4c47-bede-ddfcf288c0ea-ovn-node-metrics-cert\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668176 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-run\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668198 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-cnibin\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668245 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e3a02e8-43e4-4f89-a584-53fbf95d94cf-host\") pod \"node-ca-m6p2z\" (UID: \"1e3a02e8-43e4-4f89-a584-53fbf95d94cf\") " pod="openshift-image-registry/node-ca-m6p2z" Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668271 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlrlg\" (UniqueName: \"kubernetes.io/projected/82414c16-8ba5-4b42-9bf4-8ef65317ee29-kube-api-access-zlrlg\") pod \"iptables-alerter-zfvvf\" (UID: \"82414c16-8ba5-4b42-9bf4-8ef65317ee29\") " pod="openshift-network-operator/iptables-alerter-zfvvf" Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668303 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-cni-netd\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.668439 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668329 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs9m5\" (UniqueName: \"kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5\") pod \"network-check-target-mmfcz\" (UID: \"1ec67a7e-5f50-42d0-b878-f6ddc3826470\") " pod="openshift-network-diagnostics/network-check-target-mmfcz" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668351 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-sysctl-d\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668372 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-tuned\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668392 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzt79\" (UniqueName: \"kubernetes.io/projected/69e2944f-6e8b-40c7-be64-0a3d77f6c3fd-kube-api-access-xzt79\") pod \"node-resolver-qzz48\" (UID: \"69e2944f-6e8b-40c7-be64-0a3d77f6c3fd\") " pod="openshift-dns/node-resolver-qzz48" Apr 16 13:59:25.668999 
ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668409 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-hostroot\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668432 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668455 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6phpg\" (UniqueName: \"kubernetes.io/projected/38c85021-52e8-4534-bae8-801408d6b6f1-kube-api-access-6phpg\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668480 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e8196db-0ad9-4936-a33e-c935a2815b53-cni-binary-copy\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668505 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b897edab-8b5c-4c47-bede-ddfcf288c0ea-ovnkube-script-lib\") pod 
\"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668540 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-modprobe-d\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668565 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-host\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668580 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/38c85021-52e8-4534-bae8-801408d6b6f1-cni-binary-copy\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668594 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-multus-socket-dir-parent\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668614 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/7e8196db-0ad9-4936-a33e-c935a2815b53-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668633 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-slash\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668652 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbxsw\" (UniqueName: \"kubernetes.io/projected/b897edab-8b5c-4c47-bede-ddfcf288c0ea-kube-api-access-nbxsw\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.668999 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668672 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-sys-fs\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.669451 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668692 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/69e2944f-6e8b-40c7-be64-0a3d77f6c3fd-hosts-file\") pod \"node-resolver-qzz48\" (UID: \"69e2944f-6e8b-40c7-be64-0a3d77f6c3fd\") " pod="openshift-dns/node-resolver-qzz48" Apr 16 13:59:25.669451 ip-10-0-140-244 kubenswrapper[2564]: I0416 
13:59:25.668705 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-multus-conf-dir\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.669451 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668726 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1e3a02e8-43e4-4f89-a584-53fbf95d94cf-serviceca\") pod \"node-ca-m6p2z\" (UID: \"1e3a02e8-43e4-4f89-a584-53fbf95d94cf\") " pod="openshift-image-registry/node-ca-m6p2z" Apr 16 13:59:25.669451 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668755 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-log-socket\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.669451 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.668776 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwg2q\" (UniqueName: \"kubernetes.io/projected/2bdb9ab9-1a72-487e-8b6c-732d544d0454-kube-api-access-fwg2q\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq" Apr 16 13:59:25.688143 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.688122 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:25.698996 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.698973 2564 certificate_manager.go:715] "Certificate rotation deadline determined" 
logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:24 +0000 UTC" deadline="2027-11-27 02:37:42.869009829 +0000 UTC" Apr 16 13:59:25.698996 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.698994 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14148h38m17.170018557s" Apr 16 13:59:25.758872 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.758843 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:59:25.769357 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769329 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-os-release\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.769441 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769371 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-run-netns\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.769441 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769400 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7e8196db-0ad9-4936-a33e-c935a2815b53-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.769441 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769427 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-systemd-units\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.769584 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769451 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-var-lib-openvswitch\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.769584 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769459 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-run-netns\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.769584 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769476 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-sysconfig\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.769584 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769493 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-systemd-units\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.769584 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769459 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-os-release\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.769584 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769518 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-socket-dir\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.769584 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769557 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-var-lib-openvswitch\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.769584 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769526 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-sysconfig\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769595 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/69e2944f-6e8b-40c7-be64-0a3d77f6c3fd-tmp-dir\") pod \"node-resolver-qzz48\" (UID: \"69e2944f-6e8b-40c7-be64-0a3d77f6c3fd\") " pod="openshift-dns/node-resolver-qzz48" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769627 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/38c85021-52e8-4534-bae8-801408d6b6f1-multus-daemon-config\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769649 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-sys\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769673 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-run-ovn\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769698 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e8196db-0ad9-4936-a33e-c935a2815b53-system-cni-dir\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769723 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-kubernetes\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769748 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-var-lib-kubelet\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769771 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-run-k8s-cni-cncf-io\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769775 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-socket-dir\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769778 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e8196db-0ad9-4936-a33e-c935a2815b53-system-cni-dir\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769795 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-var-lib-kubelet\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769817 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b897edab-8b5c-4c47-bede-ddfcf288c0ea-ovn-node-metrics-cert\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769831 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-sys\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769841 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-run\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769850 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-var-lib-kubelet\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769865 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-cnibin\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769882 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-kubernetes\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.769915 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769889 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e3a02e8-43e4-4f89-a584-53fbf95d94cf-host\") pod \"node-ca-m6p2z\" (UID: \"1e3a02e8-43e4-4f89-a584-53fbf95d94cf\") " pod="openshift-image-registry/node-ca-m6p2z" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769894 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-var-lib-kubelet\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769914 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlrlg\" (UniqueName: \"kubernetes.io/projected/82414c16-8ba5-4b42-9bf4-8ef65317ee29-kube-api-access-zlrlg\") pod \"iptables-alerter-zfvvf\" (UID: \"82414c16-8ba5-4b42-9bf4-8ef65317ee29\") " pod="openshift-network-operator/iptables-alerter-zfvvf" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769920 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-run-k8s-cni-cncf-io\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769939 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-cni-netd\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769942 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-run\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769958 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e3a02e8-43e4-4f89-a584-53fbf95d94cf-host\") pod \"node-ca-m6p2z\" (UID: \"1e3a02e8-43e4-4f89-a584-53fbf95d94cf\") " pod="openshift-image-registry/node-ca-m6p2z" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.769963 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs9m5\" (UniqueName: \"kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5\") pod \"network-check-target-mmfcz\" (UID: \"1ec67a7e-5f50-42d0-b878-f6ddc3826470\") " pod="openshift-network-diagnostics/network-check-target-mmfcz" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770002 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7e8196db-0ad9-4936-a33e-c935a2815b53-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770026 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-sysctl-d\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770006 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-cnibin\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770058 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-tuned\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770103 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzt79\" (UniqueName: \"kubernetes.io/projected/69e2944f-6e8b-40c7-be64-0a3d77f6c3fd-kube-api-access-xzt79\") pod \"node-resolver-qzz48\" (UID: \"69e2944f-6e8b-40c7-be64-0a3d77f6c3fd\") " pod="openshift-dns/node-resolver-qzz48" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770126 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-hostroot\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770152 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770178 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6phpg\" (UniqueName: \"kubernetes.io/projected/38c85021-52e8-4534-bae8-801408d6b6f1-kube-api-access-6phpg\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770180 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-sysctl-d\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.770772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770196 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/38c85021-52e8-4534-bae8-801408d6b6f1-multus-daemon-config\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770241 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e8196db-0ad9-4936-a33e-c935a2815b53-cni-binary-copy\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770281 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b897edab-8b5c-4c47-bede-ddfcf288c0ea-ovnkube-script-lib\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770301 2564 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770308 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-modprobe-d\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770355 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-host\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770370 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/69e2944f-6e8b-40c7-be64-0a3d77f6c3fd-tmp-dir\") pod \"node-resolver-qzz48\" (UID: \"69e2944f-6e8b-40c7-be64-0a3d77f6c3fd\") " pod="openshift-dns/node-resolver-qzz48" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770380 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/38c85021-52e8-4534-bae8-801408d6b6f1-cni-binary-copy\") pod \"multus-926vk\" 
(UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770405 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-multus-socket-dir-parent\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770435 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e8196db-0ad9-4936-a33e-c935a2815b53-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770470 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-host\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770466 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-slash\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770504 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-multus-socket-dir-parent\") pod \"multus-926vk\" 
(UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770515 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbxsw\" (UniqueName: \"kubernetes.io/projected/b897edab-8b5c-4c47-bede-ddfcf288c0ea-kube-api-access-nbxsw\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770562 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-sys-fs\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770590 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/69e2944f-6e8b-40c7-be64-0a3d77f6c3fd-hosts-file\") pod \"node-resolver-qzz48\" (UID: \"69e2944f-6e8b-40c7-be64-0a3d77f6c3fd\") " pod="openshift-dns/node-resolver-qzz48" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770613 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-multus-conf-dir\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770636 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1e3a02e8-43e4-4f89-a584-53fbf95d94cf-serviceca\") pod \"node-ca-m6p2z\" (UID: 
\"1e3a02e8-43e4-4f89-a584-53fbf95d94cf\") " pod="openshift-image-registry/node-ca-m6p2z" Apr 16 13:59:25.771461 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770640 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-run-ovn\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770660 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-log-socket\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770666 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e8196db-0ad9-4936-a33e-c935a2815b53-cni-binary-copy\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770684 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwg2q\" (UniqueName: \"kubernetes.io/projected/2bdb9ab9-1a72-487e-8b6c-732d544d0454-kube-api-access-fwg2q\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770712 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9fbd7ea5-6aaf-4f3c-a34c-596936befdcf-konnectivity-ca\") pod 
\"konnectivity-agent-rnkfc\" (UID: \"9fbd7ea5-6aaf-4f3c-a34c-596936befdcf\") " pod="kube-system/konnectivity-agent-rnkfc" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770758 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2qqq\" (UniqueName: \"kubernetes.io/projected/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-kube-api-access-x2qqq\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770780 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-modprobe-d\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770784 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-multus-cni-dir\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770840 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-multus-cni-dir\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770850 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-run-multus-certs\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770878 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-kubelet\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770914 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-run-multus-certs\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770917 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-etc-openvswitch\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770945 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-run-openvswitch\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770957 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/38c85021-52e8-4534-bae8-801408d6b6f1-cni-binary-copy\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770968 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770972 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-lib-modules\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.772278 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770997 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-cni-netd\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771003 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-var-lib-cni-multus\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771026 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/7e8196db-0ad9-4936-a33e-c935a2815b53-os-release\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771031 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b897edab-8b5c-4c47-bede-ddfcf288c0ea-ovnkube-script-lib\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771046 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-slash\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771052 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771057 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-lib-modules\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771053 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-etc-openvswitch\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771090 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b897edab-8b5c-4c47-bede-ddfcf288c0ea-env-overrides\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771101 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-var-lib-cni-multus\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771102 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-run-openvswitch\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771117 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e84f9b0d-275b-49e4-a053-32d964b9ff96-tmp\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771144 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-registration-dir\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771150 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-log-socket\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771160 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e8196db-0ad9-4936-a33e-c935a2815b53-os-release\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771169 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-device-dir\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771194 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-var-lib-cni-bin\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.772966 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771199 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771239 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771260 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-sys-fs\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771268 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-etc-selinux\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771294 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khvtl\" (UniqueName: \"kubernetes.io/projected/e84f9b0d-275b-49e4-a053-32d964b9ff96-kube-api-access-khvtl\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" 
Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771318 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-etc-kubernetes\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771360 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-node-log\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771399 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9fbd7ea5-6aaf-4f3c-a34c-596936befdcf-agent-certs\") pod \"konnectivity-agent-rnkfc\" (UID: \"9fbd7ea5-6aaf-4f3c-a34c-596936befdcf\") " pod="kube-system/konnectivity-agent-rnkfc" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771393 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-kubelet\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771426 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-sysctl-conf\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.773711 
ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771473 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7e8196db-0ad9-4936-a33e-c935a2815b53-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771503 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/82414c16-8ba5-4b42-9bf4-8ef65317ee29-iptables-alerter-script\") pod \"iptables-alerter-zfvvf\" (UID: \"82414c16-8ba5-4b42-9bf4-8ef65317ee29\") " pod="openshift-network-operator/iptables-alerter-zfvvf" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771524 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/69e2944f-6e8b-40c7-be64-0a3d77f6c3fd-hosts-file\") pod \"node-resolver-qzz48\" (UID: \"69e2944f-6e8b-40c7-be64-0a3d77f6c3fd\") " pod="openshift-dns/node-resolver-qzz48" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771528 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82414c16-8ba5-4b42-9bf4-8ef65317ee29-host-slash\") pod \"iptables-alerter-zfvvf\" (UID: \"82414c16-8ba5-4b42-9bf4-8ef65317ee29\") " pod="openshift-network-operator/iptables-alerter-zfvvf" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771584 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e8196db-0ad9-4936-a33e-c935a2815b53-cnibin\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " 
pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771608 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bttw\" (UniqueName: \"kubernetes.io/projected/7e8196db-0ad9-4936-a33e-c935a2815b53-kube-api-access-5bttw\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771652 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-run-netns\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.773711 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771681 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-run-systemd\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771704 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b897edab-8b5c-4c47-bede-ddfcf288c0ea-ovnkube-config\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771745 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs\") pod 
\"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771772 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-systemd\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771814 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-system-cni-dir\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771844 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k4hx\" (UniqueName: \"kubernetes.io/projected/1e3a02e8-43e4-4f89-a584-53fbf95d94cf-kube-api-access-9k4hx\") pod \"node-ca-m6p2z\" (UID: \"1e3a02e8-43e4-4f89-a584-53fbf95d94cf\") " pod="openshift-image-registry/node-ca-m6p2z" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771917 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-cni-bin\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771937 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771117 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-multus-conf-dir\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.771527 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-sysctl-conf\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772009 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e8196db-0ad9-4936-a33e-c935a2815b53-cnibin\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772009 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-registration-dir\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772036 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-cni-bin\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772084 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/82414c16-8ba5-4b42-9bf4-8ef65317ee29-iptables-alerter-script\") pod \"iptables-alerter-zfvvf\" (UID: \"82414c16-8ba5-4b42-9bf4-8ef65317ee29\") " pod="openshift-network-operator/iptables-alerter-zfvvf" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772090 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-device-dir\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772122 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-host-var-lib-cni-bin\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.770713 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-hostroot\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:25.772231 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:25.774408 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772256 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-systemd\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:25.772302 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs podName:2bdb9ab9-1a72-487e-8b6c-732d544d0454 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:26.272272608 +0000 UTC m=+3.091637947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs") pod "network-metrics-daemon-5g5kq" (UID: "2bdb9ab9-1a72-487e-8b6c-732d544d0454") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772304 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-system-cni-dir\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772361 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-run-systemd\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772456 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1e3a02e8-43e4-4f89-a584-53fbf95d94cf-serviceca\") pod \"node-ca-m6p2z\" (UID: \"1e3a02e8-43e4-4f89-a584-53fbf95d94cf\") " pod="openshift-image-registry/node-ca-m6p2z" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772547 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-host-run-netns\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772591 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38c85021-52e8-4534-bae8-801408d6b6f1-etc-kubernetes\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772701 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-etc-selinux\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772750 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e8196db-0ad9-4936-a33e-c935a2815b53-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772911 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b897edab-8b5c-4c47-bede-ddfcf288c0ea-node-log\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.772968 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b897edab-8b5c-4c47-bede-ddfcf288c0ea-ovnkube-config\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.773081 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b897edab-8b5c-4c47-bede-ddfcf288c0ea-env-overrides\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.773152 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82414c16-8ba5-4b42-9bf4-8ef65317ee29-host-slash\") pod \"iptables-alerter-zfvvf\" (UID: \"82414c16-8ba5-4b42-9bf4-8ef65317ee29\") " pod="openshift-network-operator/iptables-alerter-zfvvf" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.773222 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9fbd7ea5-6aaf-4f3c-a34c-596936befdcf-konnectivity-ca\") pod \"konnectivity-agent-rnkfc\" (UID: \"9fbd7ea5-6aaf-4f3c-a34c-596936befdcf\") " pod="kube-system/konnectivity-agent-rnkfc" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.774121 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e84f9b0d-275b-49e4-a053-32d964b9ff96-tmp\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.774149 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7e8196db-0ad9-4936-a33e-c935a2815b53-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.774193 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e84f9b0d-275b-49e4-a053-32d964b9ff96-etc-tuned\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.775087 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.774281 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b897edab-8b5c-4c47-bede-ddfcf288c0ea-ovn-node-metrics-cert\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.775833 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.775167 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" event={"ID":"63b791a831a4a165ef869d82724ec61e","Type":"ContainerStarted","Data":"07f8173781d1da07d16a43bb4fd67a65784b42639516bcbcc886297bca1089c2"} Apr 16 13:59:25.776675 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.776651 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-140-244.ec2.internal" event={"ID":"0ef025c6bf68cc466efdd3ff573ac22e","Type":"ContainerStarted","Data":"4966113eedb38bcb2025091a9c7212830f066057bcd356639ce6a42e01bc2af7"} Apr 16 13:59:25.778828 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.778807 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9fbd7ea5-6aaf-4f3c-a34c-596936befdcf-agent-certs\") pod \"konnectivity-agent-rnkfc\" (UID: \"9fbd7ea5-6aaf-4f3c-a34c-596936befdcf\") " pod="kube-system/konnectivity-agent-rnkfc" Apr 16 13:59:25.780298 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.780279 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlrlg\" (UniqueName: \"kubernetes.io/projected/82414c16-8ba5-4b42-9bf4-8ef65317ee29-kube-api-access-zlrlg\") pod \"iptables-alerter-zfvvf\" (UID: \"82414c16-8ba5-4b42-9bf4-8ef65317ee29\") " pod="openshift-network-operator/iptables-alerter-zfvvf" Apr 16 13:59:25.780367 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:25.780329 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:25.780367 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:25.780346 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:25.780367 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:25.780355 2564 projected.go:194] Error preparing data for projected volume kube-api-access-gs9m5 for pod openshift-network-diagnostics/network-check-target-mmfcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:25.780479 ip-10-0-140-244 kubenswrapper[2564]: E0416 
13:59:25.780401 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5 podName:1ec67a7e-5f50-42d0-b878-f6ddc3826470 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:26.280388437 +0000 UTC m=+3.099753785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gs9m5" (UniqueName: "kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5") pod "network-check-target-mmfcz" (UID: "1ec67a7e-5f50-42d0-b878-f6ddc3826470") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:25.783300 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.783267 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6phpg\" (UniqueName: \"kubernetes.io/projected/38c85021-52e8-4534-bae8-801408d6b6f1-kube-api-access-6phpg\") pod \"multus-926vk\" (UID: \"38c85021-52e8-4534-bae8-801408d6b6f1\") " pod="openshift-multus/multus-926vk" Apr 16 13:59:25.783523 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.783501 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzt79\" (UniqueName: \"kubernetes.io/projected/69e2944f-6e8b-40c7-be64-0a3d77f6c3fd-kube-api-access-xzt79\") pod \"node-resolver-qzz48\" (UID: \"69e2944f-6e8b-40c7-be64-0a3d77f6c3fd\") " pod="openshift-dns/node-resolver-qzz48" Apr 16 13:59:25.783838 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.783818 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k4hx\" (UniqueName: \"kubernetes.io/projected/1e3a02e8-43e4-4f89-a584-53fbf95d94cf-kube-api-access-9k4hx\") pod \"node-ca-m6p2z\" (UID: \"1e3a02e8-43e4-4f89-a584-53fbf95d94cf\") " pod="openshift-image-registry/node-ca-m6p2z" Apr 16 13:59:25.783932 ip-10-0-140-244 kubenswrapper[2564]: I0416 
13:59:25.783838 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bttw\" (UniqueName: \"kubernetes.io/projected/7e8196db-0ad9-4936-a33e-c935a2815b53-kube-api-access-5bttw\") pod \"multus-additional-cni-plugins-mxtjl\" (UID: \"7e8196db-0ad9-4936-a33e-c935a2815b53\") " pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.784342 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.784314 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwg2q\" (UniqueName: \"kubernetes.io/projected/2bdb9ab9-1a72-487e-8b6c-732d544d0454-kube-api-access-fwg2q\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq" Apr 16 13:59:25.784423 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.784407 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2qqq\" (UniqueName: \"kubernetes.io/projected/e1cf92f2-1bbb-4464-acfa-20a13119b6f4-kube-api-access-x2qqq\") pod \"aws-ebs-csi-driver-node-c2z7f\" (UID: \"e1cf92f2-1bbb-4464-acfa-20a13119b6f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.784845 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.784820 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbxsw\" (UniqueName: \"kubernetes.io/projected/b897edab-8b5c-4c47-bede-ddfcf288c0ea-kube-api-access-nbxsw\") pod \"ovnkube-node-cc429\" (UID: \"b897edab-8b5c-4c47-bede-ddfcf288c0ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.784922 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.784865 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khvtl\" (UniqueName: \"kubernetes.io/projected/e84f9b0d-275b-49e4-a053-32d964b9ff96-kube-api-access-khvtl\") pod \"tuned-7fsvb\" (UID: \"e84f9b0d-275b-49e4-a053-32d964b9ff96\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.955362 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.955274 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" Apr 16 13:59:25.962155 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.962122 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-m6p2z" Apr 16 13:59:25.962797 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:25.962756 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1cf92f2_1bbb_4464_acfa_20a13119b6f4.slice/crio-9b2456c148f0f3ccb7097b1d733f58adaada6e4dab61baf5c0f6d7019adb2880 WatchSource:0}: Error finding container 9b2456c148f0f3ccb7097b1d733f58adaada6e4dab61baf5c0f6d7019adb2880: Status 404 returned error can't find the container with id 9b2456c148f0f3ccb7097b1d733f58adaada6e4dab61baf5c0f6d7019adb2880 Apr 16 13:59:25.968372 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.968351 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mxtjl" Apr 16 13:59:25.969436 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:25.969391 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e3a02e8_43e4_4f89_a584_53fbf95d94cf.slice/crio-b0eea87755442dc958556271a4a2f9971e5b069a24224f193d7232f640c87fe4 WatchSource:0}: Error finding container b0eea87755442dc958556271a4a2f9971e5b069a24224f193d7232f640c87fe4: Status 404 returned error can't find the container with id b0eea87755442dc958556271a4a2f9971e5b069a24224f193d7232f640c87fe4 Apr 16 13:59:25.972318 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.972297 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-zfvvf" Apr 16 13:59:25.975622 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:25.975588 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e8196db_0ad9_4936_a33e_c935a2815b53.slice/crio-fada05069d61095bc4b45ba9cdaeb124212e1b23ca6759946903c360520560b6 WatchSource:0}: Error finding container fada05069d61095bc4b45ba9cdaeb124212e1b23ca6759946903c360520560b6: Status 404 returned error can't find the container with id fada05069d61095bc4b45ba9cdaeb124212e1b23ca6759946903c360520560b6 Apr 16 13:59:25.980454 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.980426 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:25.980842 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:25.980821 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82414c16_8ba5_4b42_9bf4_8ef65317ee29.slice/crio-dc7a91b1899a3648b5489227b49c816fe3ea6489253f73c1fe2a7b0a8edd9f3a WatchSource:0}: Error finding container dc7a91b1899a3648b5489227b49c816fe3ea6489253f73c1fe2a7b0a8edd9f3a: Status 404 returned error can't find the container with id dc7a91b1899a3648b5489227b49c816fe3ea6489253f73c1fe2a7b0a8edd9f3a Apr 16 13:59:25.985273 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.985251 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" Apr 16 13:59:25.990718 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.990700 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qzz48" Apr 16 13:59:25.991779 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:25.991753 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb897edab_8b5c_4c47_bede_ddfcf288c0ea.slice/crio-9d7d14e4faf58137da54330eded57ac164a31d1c3c6db7c74daec9c22186250f WatchSource:0}: Error finding container 9d7d14e4faf58137da54330eded57ac164a31d1c3c6db7c74daec9c22186250f: Status 404 returned error can't find the container with id 9d7d14e4faf58137da54330eded57ac164a31d1c3c6db7c74daec9c22186250f Apr 16 13:59:25.995150 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:25.995123 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode84f9b0d_275b_49e4_a053_32d964b9ff96.slice/crio-27465b85427d2b4dff96b591bb5df26c02675a4435cc3a7224153e11336b2b39 WatchSource:0}: Error finding container 27465b85427d2b4dff96b591bb5df26c02675a4435cc3a7224153e11336b2b39: Status 404 returned error can't find the container with id 27465b85427d2b4dff96b591bb5df26c02675a4435cc3a7224153e11336b2b39 Apr 16 13:59:25.995254 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.995216 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-926vk" Apr 16 13:59:25.998492 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:25.998467 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e2944f_6e8b_40c7_be64_0a3d77f6c3fd.slice/crio-fc1be25bc7690658f8475b2f0a4464a73e9a87c30b0036c688e05a0b073f03a9 WatchSource:0}: Error finding container fc1be25bc7690658f8475b2f0a4464a73e9a87c30b0036c688e05a0b073f03a9: Status 404 returned error can't find the container with id fc1be25bc7690658f8475b2f0a4464a73e9a87c30b0036c688e05a0b073f03a9 Apr 16 13:59:25.999428 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:25.999409 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rnkfc" Apr 16 13:59:26.003658 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:26.003626 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38c85021_52e8_4534_bae8_801408d6b6f1.slice/crio-ffb446b1ecbf2ae699bf2b096058f034b5586f47890706b5cf7b6cd44f08b56e WatchSource:0}: Error finding container ffb446b1ecbf2ae699bf2b096058f034b5586f47890706b5cf7b6cd44f08b56e: Status 404 returned error can't find the container with id ffb446b1ecbf2ae699bf2b096058f034b5586f47890706b5cf7b6cd44f08b56e Apr 16 13:59:26.008336 ip-10-0-140-244 kubenswrapper[2564]: W0416 13:59:26.008314 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fbd7ea5_6aaf_4f3c_a34c_596936befdcf.slice/crio-b39f3e5a9690170558076548c852b6e37b23358facb82a6e678a49d4a209e6b1 WatchSource:0}: Error finding container b39f3e5a9690170558076548c852b6e37b23358facb82a6e678a49d4a209e6b1: Status 404 returned error can't find the container with id b39f3e5a9690170558076548c852b6e37b23358facb82a6e678a49d4a209e6b1 Apr 16 13:59:26.053936 ip-10-0-140-244 kubenswrapper[2564]: 
I0416 13:59:26.053892 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:26.274319 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.274221 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq" Apr 16 13:59:26.274470 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:26.274381 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:26.274470 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:26.274449 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs podName:2bdb9ab9-1a72-487e-8b6c-732d544d0454 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:27.27443508 +0000 UTC m=+4.093800410 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs") pod "network-metrics-daemon-5g5kq" (UID: "2bdb9ab9-1a72-487e-8b6c-732d544d0454") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:26.375364 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.375326 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs9m5\" (UniqueName: \"kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5\") pod \"network-check-target-mmfcz\" (UID: \"1ec67a7e-5f50-42d0-b878-f6ddc3826470\") " pod="openshift-network-diagnostics/network-check-target-mmfcz" Apr 16 13:59:26.375546 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:26.375515 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:26.375546 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:26.375537 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:26.375636 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:26.375550 2564 projected.go:194] Error preparing data for projected volume kube-api-access-gs9m5 for pod openshift-network-diagnostics/network-check-target-mmfcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:26.375636 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:26.375616 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5 podName:1ec67a7e-5f50-42d0-b878-f6ddc3826470 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:27.375598965 +0000 UTC m=+4.194964295 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs9m5" (UniqueName: "kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5") pod "network-check-target-mmfcz" (UID: "1ec67a7e-5f50-42d0-b878-f6ddc3826470") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:26.699746 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.699703 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:24 +0000 UTC" deadline="2027-09-20 17:18:14.594218969 +0000 UTC" Apr 16 13:59:26.699746 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.699742 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12531h18m47.894480411s" Apr 16 13:59:26.793440 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.793327 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zfvvf" event={"ID":"82414c16-8ba5-4b42-9bf4-8ef65317ee29","Type":"ContainerStarted","Data":"dc7a91b1899a3648b5489227b49c816fe3ea6489253f73c1fe2a7b0a8edd9f3a"} Apr 16 13:59:26.801362 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.801295 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mxtjl" event={"ID":"7e8196db-0ad9-4936-a33e-c935a2815b53","Type":"ContainerStarted","Data":"fada05069d61095bc4b45ba9cdaeb124212e1b23ca6759946903c360520560b6"} Apr 16 13:59:26.810751 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.810671 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-m6p2z" 
event={"ID":"1e3a02e8-43e4-4f89-a584-53fbf95d94cf","Type":"ContainerStarted","Data":"b0eea87755442dc958556271a4a2f9971e5b069a24224f193d7232f640c87fe4"} Apr 16 13:59:26.816277 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.816244 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qzz48" event={"ID":"69e2944f-6e8b-40c7-be64-0a3d77f6c3fd","Type":"ContainerStarted","Data":"fc1be25bc7690658f8475b2f0a4464a73e9a87c30b0036c688e05a0b073f03a9"} Apr 16 13:59:26.821074 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.821032 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" event={"ID":"b897edab-8b5c-4c47-bede-ddfcf288c0ea","Type":"ContainerStarted","Data":"9d7d14e4faf58137da54330eded57ac164a31d1c3c6db7c74daec9c22186250f"} Apr 16 13:59:26.827383 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.827325 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" event={"ID":"e1cf92f2-1bbb-4464-acfa-20a13119b6f4","Type":"ContainerStarted","Data":"9b2456c148f0f3ccb7097b1d733f58adaada6e4dab61baf5c0f6d7019adb2880"} Apr 16 13:59:26.831017 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.830991 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rnkfc" event={"ID":"9fbd7ea5-6aaf-4f3c-a34c-596936befdcf","Type":"ContainerStarted","Data":"b39f3e5a9690170558076548c852b6e37b23358facb82a6e678a49d4a209e6b1"} Apr 16 13:59:26.839104 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.839073 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-926vk" event={"ID":"38c85021-52e8-4534-bae8-801408d6b6f1","Type":"ContainerStarted","Data":"ffb446b1ecbf2ae699bf2b096058f034b5586f47890706b5cf7b6cd44f08b56e"} Apr 16 13:59:26.841273 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:26.841247 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" event={"ID":"e84f9b0d-275b-49e4-a053-32d964b9ff96","Type":"ContainerStarted","Data":"27465b85427d2b4dff96b591bb5df26c02675a4435cc3a7224153e11336b2b39"}
Apr 16 13:59:27.282585 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:27.282549 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:27.282774 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:27.282730 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:27.282837 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:27.282793 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs podName:2bdb9ab9-1a72-487e-8b6c-732d544d0454 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:29.28277343 +0000 UTC m=+6.102138763 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs") pod "network-metrics-daemon-5g5kq" (UID: "2bdb9ab9-1a72-487e-8b6c-732d544d0454") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:27.383764 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:27.383726 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs9m5\" (UniqueName: \"kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5\") pod \"network-check-target-mmfcz\" (UID: \"1ec67a7e-5f50-42d0-b878-f6ddc3826470\") " pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:27.383964 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:27.383879 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:27.383964 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:27.383932 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:27.383964 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:27.383948 2564 projected.go:194] Error preparing data for projected volume kube-api-access-gs9m5 for pod openshift-network-diagnostics/network-check-target-mmfcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:27.384139 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:27.384009 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5 podName:1ec67a7e-5f50-42d0-b878-f6ddc3826470 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:29.383989381 +0000 UTC m=+6.203354714 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs9m5" (UniqueName: "kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5") pod "network-check-target-mmfcz" (UID: "1ec67a7e-5f50-42d0-b878-f6ddc3826470") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:27.769896 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:27.769478 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:27.769896 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:27.769612 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454"
Apr 16 13:59:27.770418 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:27.770273 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:27.770418 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:27.770368 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470"
Apr 16 13:59:28.860149 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:28.859406 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" event={"ID":"63b791a831a4a165ef869d82724ec61e","Type":"ContainerStarted","Data":"63f634e4c16c8421f9463c46f7423f46ebf6b8f6ac38a5b454aa9944e46adef9"}
Apr 16 13:59:29.301401 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:29.301317 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:29.301565 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:29.301458 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:29.301565 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:29.301522 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs podName:2bdb9ab9-1a72-487e-8b6c-732d544d0454 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:33.30150311 +0000 UTC m=+10.120868443 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs") pod "network-metrics-daemon-5g5kq" (UID: "2bdb9ab9-1a72-487e-8b6c-732d544d0454") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:29.402466 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:29.402413 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs9m5\" (UniqueName: \"kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5\") pod \"network-check-target-mmfcz\" (UID: \"1ec67a7e-5f50-42d0-b878-f6ddc3826470\") " pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:29.402639 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:29.402593 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:29.402639 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:29.402616 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:29.402639 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:29.402629 2564 projected.go:194] Error preparing data for projected volume kube-api-access-gs9m5 for pod openshift-network-diagnostics/network-check-target-mmfcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:29.402894 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:29.402694 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5 podName:1ec67a7e-5f50-42d0-b878-f6ddc3826470 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:33.402675251 +0000 UTC m=+10.222040595 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs9m5" (UniqueName: "kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5") pod "network-check-target-mmfcz" (UID: "1ec67a7e-5f50-42d0-b878-f6ddc3826470") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:29.769928 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:29.769454 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:29.769928 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:29.769583 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470"
Apr 16 13:59:29.769928 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:29.769643 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:29.769928 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:29.769809 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454"
Apr 16 13:59:31.769429 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:31.769376 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:31.769876 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:31.769501 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454"
Apr 16 13:59:31.769876 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:31.769391 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:31.770001 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:31.769957 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470"
Apr 16 13:59:33.337157 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:33.336721 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:33.337157 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:33.336884 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:33.337157 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:33.336961 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs podName:2bdb9ab9-1a72-487e-8b6c-732d544d0454 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.336941013 +0000 UTC m=+18.156306348 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs") pod "network-metrics-daemon-5g5kq" (UID: "2bdb9ab9-1a72-487e-8b6c-732d544d0454") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:33.438100 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:33.437957 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs9m5\" (UniqueName: \"kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5\") pod \"network-check-target-mmfcz\" (UID: \"1ec67a7e-5f50-42d0-b878-f6ddc3826470\") " pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:33.438328 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:33.438156 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:33.438328 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:33.438175 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:33.438328 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:33.438187 2564 projected.go:194] Error preparing data for projected volume kube-api-access-gs9m5 for pod openshift-network-diagnostics/network-check-target-mmfcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:33.438328 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:33.438273 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5 podName:1ec67a7e-5f50-42d0-b878-f6ddc3826470 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.438255112 +0000 UTC m=+18.257620448 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs9m5" (UniqueName: "kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5") pod "network-check-target-mmfcz" (UID: "1ec67a7e-5f50-42d0-b878-f6ddc3826470") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:33.771643 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:33.771063 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:33.771643 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:33.771186 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454"
Apr 16 13:59:33.771853 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:33.771262 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:33.771922 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:33.771865 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470"
Apr 16 13:59:33.870267 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:33.870192 2564 generic.go:358] "Generic (PLEG): container finished" podID="63b791a831a4a165ef869d82724ec61e" containerID="63f634e4c16c8421f9463c46f7423f46ebf6b8f6ac38a5b454aa9944e46adef9" exitCode=0
Apr 16 13:59:33.870453 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:33.870273 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" event={"ID":"63b791a831a4a165ef869d82724ec61e","Type":"ContainerDied","Data":"63f634e4c16c8421f9463c46f7423f46ebf6b8f6ac38a5b454aa9944e46adef9"}
Apr 16 13:59:35.769487 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:35.769448 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:35.769899 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:35.769459 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:35.769899 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:35.769571 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454"
Apr 16 13:59:35.769899 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:35.769652 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470"
Apr 16 13:59:37.314077 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:37.314044 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5dpwb"]
Apr 16 13:59:37.348078 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:37.348049 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:37.348258 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:37.348130 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5dpwb" podUID="4f657495-2777-434f-9e5e-c076ada48605"
Apr 16 13:59:37.468507 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:37.468469 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:37.468772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:37.468570 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4f657495-2777-434f-9e5e-c076ada48605-kubelet-config\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:37.468772 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:37.468618 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4f657495-2777-434f-9e5e-c076ada48605-dbus\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:37.569763 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:37.569669 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:37.569929 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:37.569767 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4f657495-2777-434f-9e5e-c076ada48605-kubelet-config\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:37.569929 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:37.569800 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4f657495-2777-434f-9e5e-c076ada48605-dbus\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:37.569929 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:37.569820 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:37.569929 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:37.569888 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret podName:4f657495-2777-434f-9e5e-c076ada48605 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.069869667 +0000 UTC m=+14.889235016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret") pod "global-pull-secret-syncer-5dpwb" (UID: "4f657495-2777-434f-9e5e-c076ada48605") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:37.569929 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:37.569889 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4f657495-2777-434f-9e5e-c076ada48605-dbus\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:37.569929 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:37.569894 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4f657495-2777-434f-9e5e-c076ada48605-kubelet-config\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:37.769742 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:37.769702 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:37.769742 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:37.769743 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:37.769974 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:37.769844 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454"
Apr 16 13:59:37.770035 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:37.769991 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470"
Apr 16 13:59:38.073730 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:38.073695 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:38.073907 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:38.073838 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:38.073967 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:38.073911 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret podName:4f657495-2777-434f-9e5e-c076ada48605 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.073891553 +0000 UTC m=+15.893256883 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret") pod "global-pull-secret-syncer-5dpwb" (UID: "4f657495-2777-434f-9e5e-c076ada48605") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:38.769375 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:38.769337 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:38.769792 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:38.769462 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5dpwb" podUID="4f657495-2777-434f-9e5e-c076ada48605"
Apr 16 13:59:39.081045 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:39.081014 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:39.081191 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:39.081157 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:39.081267 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:39.081235 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret podName:4f657495-2777-434f-9e5e-c076ada48605 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.081218538 +0000 UTC m=+17.900583870 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret") pod "global-pull-secret-syncer-5dpwb" (UID: "4f657495-2777-434f-9e5e-c076ada48605") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:39.769496 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:39.769457 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:39.769961 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:39.769459 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:39.769961 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:39.769611 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454"
Apr 16 13:59:39.769961 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:39.769672 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470"
Apr 16 13:59:40.769330 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:40.769296 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:40.769514 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:40.769400 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5dpwb" podUID="4f657495-2777-434f-9e5e-c076ada48605"
Apr 16 13:59:41.095537 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:41.095501 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:41.095718 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:41.095629 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:41.095718 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:41.095693 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret podName:4f657495-2777-434f-9e5e-c076ada48605 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:45.095675373 +0000 UTC m=+21.915040704 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret") pod "global-pull-secret-syncer-5dpwb" (UID: "4f657495-2777-434f-9e5e-c076ada48605") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:41.397484 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:41.397390 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:41.397639 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:41.397578 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:41.397700 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:41.397649 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs podName:2bdb9ab9-1a72-487e-8b6c-732d544d0454 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:57.397630371 +0000 UTC m=+34.216995701 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs") pod "network-metrics-daemon-5g5kq" (UID: "2bdb9ab9-1a72-487e-8b6c-732d544d0454") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:41.498542 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:41.498497 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs9m5\" (UniqueName: \"kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5\") pod \"network-check-target-mmfcz\" (UID: \"1ec67a7e-5f50-42d0-b878-f6ddc3826470\") " pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:41.498727 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:41.498705 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:41.498790 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:41.498736 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:41.498790 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:41.498750 2564 projected.go:194] Error preparing data for projected volume kube-api-access-gs9m5 for pod openshift-network-diagnostics/network-check-target-mmfcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:41.498889 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:41.498810 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5 podName:1ec67a7e-5f50-42d0-b878-f6ddc3826470 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:57.498796603 +0000 UTC m=+34.318161933 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs9m5" (UniqueName: "kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5") pod "network-check-target-mmfcz" (UID: "1ec67a7e-5f50-42d0-b878-f6ddc3826470") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:41.769977 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:41.769889 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:41.769977 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:41.769929 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:41.770487 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:41.770037 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454"
Apr 16 13:59:41.770487 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:41.770238 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470"
Apr 16 13:59:42.770232 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:42.770142 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:42.770645 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:42.770284 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5dpwb" podUID="4f657495-2777-434f-9e5e-c076ada48605"
Apr 16 13:59:43.771075 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:43.770853 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:43.772224 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:43.771174 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454"
Apr 16 13:59:43.772224 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:43.771758 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:43.772224 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:43.771843 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470"
Apr 16 13:59:43.890588 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:43.890459 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" event={"ID":"63b791a831a4a165ef869d82724ec61e","Type":"ContainerStarted","Data":"f75696f90fab22ffd53f63f62439d6d72c9bc5b391fdd9eae0425a0b0c7ab51d"}
Apr 16 13:59:43.895435 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:43.895405 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rnkfc" event={"ID":"9fbd7ea5-6aaf-4f3c-a34c-596936befdcf","Type":"ContainerStarted","Data":"572e4e296bfd60c686555a85a0e4706a6dbd883ff84cc8479ead24fd593213a5"}
Apr 16 13:59:43.904084 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:43.903579 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" event={"ID":"e84f9b0d-275b-49e4-a053-32d964b9ff96","Type":"ContainerStarted","Data":"be7f19e2970f5efd4a3a106c6ac037752bcaba2167aefc51376a227b426a3cb7"}
Apr 16 13:59:43.908397 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:43.908266 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-244.ec2.internal" podStartSLOduration=19.908249844 podStartE2EDuration="19.908249844s" podCreationTimestamp="2026-04-16 13:59:24 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:43.90749385 +0000 UTC m=+20.726859205" watchObservedRunningTime="2026-04-16 13:59:43.908249844 +0000 UTC m=+20.727615198" Apr 16 13:59:44.770084 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.769852 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb" Apr 16 13:59:44.770244 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:44.770114 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5dpwb" podUID="4f657495-2777-434f-9e5e-c076ada48605" Apr 16 13:59:44.906449 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.906415 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-926vk" event={"ID":"38c85021-52e8-4534-bae8-801408d6b6f1","Type":"ContainerStarted","Data":"7deaab5f7c45f84bc484bc75c084297cbf08c0eddd03a5bd79bd49d5c485fb52"} Apr 16 13:59:44.907749 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.907721 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zfvvf" event={"ID":"82414c16-8ba5-4b42-9bf4-8ef65317ee29","Type":"ContainerStarted","Data":"f3741318021c467215ad734f66bbe995c53f0b1067388bf0a26d4d51f180234d"} Apr 16 13:59:44.909068 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.909047 2564 generic.go:358] "Generic (PLEG): container finished" podID="7e8196db-0ad9-4936-a33e-c935a2815b53" containerID="b4051a15d8e58b301861ad9ae334718212d40b5092352c76eb14c94d16fa724d" exitCode=0 Apr 16 13:59:44.909164 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.909123 2564 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mxtjl" event={"ID":"7e8196db-0ad9-4936-a33e-c935a2815b53","Type":"ContainerDied","Data":"b4051a15d8e58b301861ad9ae334718212d40b5092352c76eb14c94d16fa724d"} Apr 16 13:59:44.910421 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.910404 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-m6p2z" event={"ID":"1e3a02e8-43e4-4f89-a584-53fbf95d94cf","Type":"ContainerStarted","Data":"9af16fe79dff05a88af215d6e99fd77d845ba70037ae58d6dee476069eb9e6c3"} Apr 16 13:59:44.911791 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.911759 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-244.ec2.internal" event={"ID":"0ef025c6bf68cc466efdd3ff573ac22e","Type":"ContainerStarted","Data":"2fdf212134cc5cdd9f481a01efae36f76ebcbd93e9d4b87e84f7286af118520e"} Apr 16 13:59:44.912970 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.912948 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qzz48" event={"ID":"69e2944f-6e8b-40c7-be64-0a3d77f6c3fd","Type":"ContainerStarted","Data":"8937edf75c936486bee1c525e21beadbd6c0dd700fdb11de2ad7b02a3d6b2f40"} Apr 16 13:59:44.918365 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.918341 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" event={"ID":"b897edab-8b5c-4c47-bede-ddfcf288c0ea","Type":"ContainerStarted","Data":"2931b5b0e618b6fc0c49ae5040fbd2385627432423d7e421868be7de6f40b07e"} Apr 16 13:59:44.918365 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.918364 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" event={"ID":"b897edab-8b5c-4c47-bede-ddfcf288c0ea","Type":"ContainerStarted","Data":"ab2866d07cb091e6a2791fbbb0b77981efad9222e87f7d87a0a82267d83b7e4f"} Apr 16 13:59:44.918489 ip-10-0-140-244 kubenswrapper[2564]: 
I0416 13:59:44.918373 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" event={"ID":"b897edab-8b5c-4c47-bede-ddfcf288c0ea","Type":"ContainerStarted","Data":"323d488f812c335d2a7bd8029ecef41396e1c35bc5b03b798d2ab0733a6a465d"} Apr 16 13:59:44.918489 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.918385 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" event={"ID":"b897edab-8b5c-4c47-bede-ddfcf288c0ea","Type":"ContainerStarted","Data":"4198f029c3464b5f8bb34758495dd2ba02a34c9857a102f5d46446c79d50d655"} Apr 16 13:59:44.918489 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.918397 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" event={"ID":"b897edab-8b5c-4c47-bede-ddfcf288c0ea","Type":"ContainerStarted","Data":"1a2ab763ec5ea89fda761a258aff6ce420722fa5cf23b2ed8345b15f9c7e458d"} Apr 16 13:59:44.918489 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.918410 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" event={"ID":"b897edab-8b5c-4c47-bede-ddfcf288c0ea","Type":"ContainerStarted","Data":"b8a37584d7b4574b88498fea2f739636ab9c87b20e62b35f70757cf7b0e341c0"} Apr 16 13:59:44.919442 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.919422 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" event={"ID":"e1cf92f2-1bbb-4464-acfa-20a13119b6f4","Type":"ContainerStarted","Data":"47839f6a12e1d3839ff4ddba448e9eb744fe4b2044782376a10d1dcb1179ecb9"} Apr 16 13:59:44.928282 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.928247 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7fsvb" podStartSLOduration=4.261058157 podStartE2EDuration="21.928237254s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" 
firstStartedPulling="2026-04-16 13:59:25.99724882 +0000 UTC m=+2.816614153" lastFinishedPulling="2026-04-16 13:59:43.664427907 +0000 UTC m=+20.483793250" observedRunningTime="2026-04-16 13:59:43.928351584 +0000 UTC m=+20.747716936" watchObservedRunningTime="2026-04-16 13:59:44.928237254 +0000 UTC m=+21.747602605" Apr 16 13:59:44.928662 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.928641 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-926vk" podStartSLOduration=4.264612737 podStartE2EDuration="21.928636862s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:26.006324798 +0000 UTC m=+2.825690132" lastFinishedPulling="2026-04-16 13:59:43.670348925 +0000 UTC m=+20.489714257" observedRunningTime="2026-04-16 13:59:44.928049579 +0000 UTC m=+21.747414930" watchObservedRunningTime="2026-04-16 13:59:44.928636862 +0000 UTC m=+21.748002213" Apr 16 13:59:44.944634 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.944600 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-m6p2z" podStartSLOduration=4.252504035 podStartE2EDuration="21.944589085s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.972038621 +0000 UTC m=+2.791403965" lastFinishedPulling="2026-04-16 13:59:43.664123669 +0000 UTC m=+20.483489015" observedRunningTime="2026-04-16 13:59:44.944342499 +0000 UTC m=+21.763707845" watchObservedRunningTime="2026-04-16 13:59:44.944589085 +0000 UTC m=+21.763954437" Apr 16 13:59:44.958499 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.958453 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zfvvf" podStartSLOduration=4.30553998 podStartE2EDuration="21.95844344s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.982814286 +0000 UTC m=+2.802179620" 
lastFinishedPulling="2026-04-16 13:59:43.635717747 +0000 UTC m=+20.455083080" observedRunningTime="2026-04-16 13:59:44.958430325 +0000 UTC m=+21.777795676" watchObservedRunningTime="2026-04-16 13:59:44.95844344 +0000 UTC m=+21.777808792" Apr 16 13:59:44.992661 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:44.992613 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-244.ec2.internal" podStartSLOduration=20.992598824 podStartE2EDuration="20.992598824s" podCreationTimestamp="2026-04-16 13:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:44.971230206 +0000 UTC m=+21.790595552" watchObservedRunningTime="2026-04-16 13:59:44.992598824 +0000 UTC m=+21.811964176" Apr 16 13:59:45.007500 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:45.007453 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qzz48" podStartSLOduration=4.341094171 podStartE2EDuration="22.007442967s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:26.000561844 +0000 UTC m=+2.819927198" lastFinishedPulling="2026-04-16 13:59:43.666910657 +0000 UTC m=+20.486275994" observedRunningTime="2026-04-16 13:59:45.006958618 +0000 UTC m=+21.826323972" watchObservedRunningTime="2026-04-16 13:59:45.007442967 +0000 UTC m=+21.826808319" Apr 16 13:59:45.124232 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:45.123984 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb" Apr 16 13:59:45.124397 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:45.124137 2564 
secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:45.124397 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:45.124347 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret podName:4f657495-2777-434f-9e5e-c076ada48605 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.12432583 +0000 UTC m=+29.943691161 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret") pod "global-pull-secret-syncer-5dpwb" (UID: "4f657495-2777-434f-9e5e-c076ada48605") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:45.594705 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:45.594684 2564 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:59:45.739304 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:45.739150 2564 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:45.594701336Z","UUID":"4d5725a7-4f23-4ea5-b093-fbe780c7a6d8","Handler":null,"Name":"","Endpoint":""} Apr 16 13:59:45.741656 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:45.741634 2564 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:59:45.741656 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:45.741661 2564 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:59:45.769679 ip-10-0-140-244 kubenswrapper[2564]: 
I0416 13:59:45.769652 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq" Apr 16 13:59:45.769679 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:45.769675 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz" Apr 16 13:59:45.769850 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:45.769766 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454" Apr 16 13:59:45.769900 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:45.769882 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470" Apr 16 13:59:45.923143 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:45.923100 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" event={"ID":"e1cf92f2-1bbb-4464-acfa-20a13119b6f4","Type":"ContainerStarted","Data":"ceb45b8a908df8fad846adf0bdc3187cd5fdbbba0a6415f7b36cf6ca2b98f343"} Apr 16 13:59:46.769682 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:46.769644 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb" Apr 16 13:59:46.769851 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:46.769784 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5dpwb" podUID="4f657495-2777-434f-9e5e-c076ada48605" Apr 16 13:59:46.783768 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:46.783736 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rnkfc" Apr 16 13:59:46.784472 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:46.784445 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rnkfc" Apr 16 13:59:46.799016 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:46.798976 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rnkfc" podStartSLOduration=6.173309439 podStartE2EDuration="23.798963238s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:26.010078748 +0000 UTC m=+2.829444092" lastFinishedPulling="2026-04-16 13:59:43.635732547 +0000 UTC m=+20.455097891" observedRunningTime="2026-04-16 13:59:45.021778808 +0000 UTC m=+21.841144159" watchObservedRunningTime="2026-04-16 13:59:46.798963238 +0000 UTC m=+23.618328584" Apr 16 13:59:46.928797 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:46.928764 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" event={"ID":"b897edab-8b5c-4c47-bede-ddfcf288c0ea","Type":"ContainerStarted","Data":"cd5ab28a79679592133be3bf0586ec442fa850e3bc1c4b3c71d141ec82491419"} Apr 16 13:59:47.769664 ip-10-0-140-244 kubenswrapper[2564]: I0416 
13:59:47.769629 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq" Apr 16 13:59:47.769835 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:47.769671 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz" Apr 16 13:59:47.769835 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:47.769748 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454" Apr 16 13:59:47.769958 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:47.769881 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470" Apr 16 13:59:47.933246 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:47.933217 2564 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:59:47.933246 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:47.933225 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" event={"ID":"e1cf92f2-1bbb-4464-acfa-20a13119b6f4","Type":"ContainerStarted","Data":"f9ec5d4a2cc2517ba5d10f56d2d39000bf3e4b1d3cc6ffa71bc90751aa449d0b"} Apr 16 13:59:47.951903 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:47.951856 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z7f" podStartSLOduration=4.008032465 podStartE2EDuration="24.951840011s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.96577104 +0000 UTC m=+2.785136371" lastFinishedPulling="2026-04-16 13:59:46.909578574 +0000 UTC m=+23.728943917" observedRunningTime="2026-04-16 13:59:47.951439389 +0000 UTC m=+24.770804743" watchObservedRunningTime="2026-04-16 13:59:47.951840011 +0000 UTC m=+24.771205385" Apr 16 13:59:48.769229 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:48.769187 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb" Apr 16 13:59:48.769404 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:48.769314 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5dpwb" podUID="4f657495-2777-434f-9e5e-c076ada48605" Apr 16 13:59:49.769388 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:49.769214 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz" Apr 16 13:59:49.769780 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:49.769230 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq" Apr 16 13:59:49.769780 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:49.769459 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470" Apr 16 13:59:49.769780 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:49.769549 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454" Apr 16 13:59:49.940120 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:49.940022 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" event={"ID":"b897edab-8b5c-4c47-bede-ddfcf288c0ea","Type":"ContainerStarted","Data":"69df1bd1cfb4b247f3ac86ea95b3fc38790acf8744bb5817f9dbdd0309ed0a64"} Apr 16 13:59:49.940349 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:49.940331 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:49.940405 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:49.940359 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:49.954597 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:49.954573 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:49.973080 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:49.973029 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" podStartSLOduration=8.864111175 podStartE2EDuration="26.973011581s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.994134212 +0000 UTC m=+2.813499542" lastFinishedPulling="2026-04-16 13:59:44.1030346 +0000 UTC m=+20.922399948" observedRunningTime="2026-04-16 13:59:49.972124207 +0000 UTC m=+26.791489559" watchObservedRunningTime="2026-04-16 13:59:49.973011581 +0000 UTC m=+26.792376934" Apr 16 13:59:50.769844 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:50.769814 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb" Apr 16 13:59:50.770266 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:50.769916 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5dpwb" podUID="4f657495-2777-434f-9e5e-c076ada48605" Apr 16 13:59:50.944442 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:50.944411 2564 generic.go:358] "Generic (PLEG): container finished" podID="7e8196db-0ad9-4936-a33e-c935a2815b53" containerID="d8b06a43c57de7d33594dc1987a849e7f85a6c1590358735f9897d7eeb41d60d" exitCode=0 Apr 16 13:59:50.944638 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:50.944484 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mxtjl" event={"ID":"7e8196db-0ad9-4936-a33e-c935a2815b53","Type":"ContainerDied","Data":"d8b06a43c57de7d33594dc1987a849e7f85a6c1590358735f9897d7eeb41d60d"} Apr 16 13:59:50.945783 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:50.945219 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:50.960246 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:50.960226 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 13:59:51.650029 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:51.650001 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mmfcz"] Apr 16 13:59:51.650158 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:51.650144 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz" Apr 16 13:59:51.650294 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:51.650271 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470" Apr 16 13:59:51.654035 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:51.653992 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5g5kq"] Apr 16 13:59:51.654162 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:51.654125 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq" Apr 16 13:59:51.654308 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:51.654263 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454" Apr 16 13:59:51.654681 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:51.654662 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5dpwb"] Apr 16 13:59:51.654782 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:51.654768 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:51.654880 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:51.654858 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5dpwb" podUID="4f657495-2777-434f-9e5e-c076ada48605"
Apr 16 13:59:51.948525 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:51.948294 2564 generic.go:358] "Generic (PLEG): container finished" podID="7e8196db-0ad9-4936-a33e-c935a2815b53" containerID="71e99a6d253851caeedde7e908cd712fc68c64030d412b4f4b611b2edf2ee43e" exitCode=0
Apr 16 13:59:51.948525 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:51.948369 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mxtjl" event={"ID":"7e8196db-0ad9-4936-a33e-c935a2815b53","Type":"ContainerDied","Data":"71e99a6d253851caeedde7e908cd712fc68c64030d412b4f4b611b2edf2ee43e"}
Apr 16 13:59:52.769605 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:52.769575 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:52.769805 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:52.769687 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5dpwb" podUID="4f657495-2777-434f-9e5e-c076ada48605"
Apr 16 13:59:53.191698 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:53.191649 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:53.192126 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:53.191815 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:53.192126 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:53.191906 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret podName:4f657495-2777-434f-9e5e-c076ada48605 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:09.19187926 +0000 UTC m=+46.011244591 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret") pod "global-pull-secret-syncer-5dpwb" (UID: "4f657495-2777-434f-9e5e-c076ada48605") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:53.769950 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:53.769913 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:53.770141 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:53.770011 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:53.770141 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:53.770017 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454"
Apr 16 13:59:53.770141 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:53.770104 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470"
Apr 16 13:59:53.956948 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:53.956912 2564 generic.go:358] "Generic (PLEG): container finished" podID="7e8196db-0ad9-4936-a33e-c935a2815b53" containerID="88e6856378b22d8ada80f4f551b5efa675b437521e4db4928feef4dc86fb8615" exitCode=0
Apr 16 13:59:53.957093 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:53.956963 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mxtjl" event={"ID":"7e8196db-0ad9-4936-a33e-c935a2815b53","Type":"ContainerDied","Data":"88e6856378b22d8ada80f4f551b5efa675b437521e4db4928feef4dc86fb8615"}
Apr 16 13:59:54.769307 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:54.769269 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:54.769964 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:54.769386 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5dpwb" podUID="4f657495-2777-434f-9e5e-c076ada48605"
Apr 16 13:59:55.769388 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:55.769350 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:55.769809 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:55.769358 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:55.769809 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:55.769501 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g5kq" podUID="2bdb9ab9-1a72-487e-8b6c-732d544d0454"
Apr 16 13:59:55.769809 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:55.769587 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mmfcz" podUID="1ec67a7e-5f50-42d0-b878-f6ddc3826470"
Apr 16 13:59:55.796084 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:55.796041 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rnkfc"
Apr 16 13:59:55.796290 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:55.796223 2564 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 13:59:55.797013 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:55.796989 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rnkfc"
Apr 16 13:59:56.461951 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.461923 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-244.ec2.internal" event="NodeReady"
Apr 16 13:59:56.462119 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.462072 2564 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 13:59:56.508756 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.508723 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gmqgq"]
Apr 16 13:59:56.568402 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.568373 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-t7d2w"]
Apr 16 13:59:56.568603 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.568576 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:56.574022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.573999 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-r5ngv\""
Apr 16 13:59:56.574022 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.574017 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 13:59:56.574244 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.574000 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 13:59:56.588167 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.588143 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gmqgq"]
Apr 16 13:59:56.588298 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.588174 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t7d2w"]
Apr 16 13:59:56.588298 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.588222 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t7d2w"
Apr 16 13:59:56.593165 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.593144 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-r2bfh\""
Apr 16 13:59:56.593165 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.593159 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 13:59:56.593337 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.593215 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 13:59:56.593337 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.593233 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 13:59:56.720582 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.720490 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99722c79-e7f0-4687-85aa-06ca3ad840a2-config-volume\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:56.720582 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.720564 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:56.720786 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.720612 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99722c79-e7f0-4687-85aa-06ca3ad840a2-tmp-dir\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:56.720786 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.720669 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpnd2\" (UniqueName: \"kubernetes.io/projected/99722c79-e7f0-4687-85aa-06ca3ad840a2-kube-api-access-hpnd2\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:56.720786 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.720748 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w"
Apr 16 13:59:56.720900 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.720792 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j88wp\" (UniqueName: \"kubernetes.io/projected/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-kube-api-access-j88wp\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w"
Apr 16 13:59:56.769916 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.769885 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 13:59:56.772519 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.772498 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 13:59:56.821481 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.821447 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99722c79-e7f0-4687-85aa-06ca3ad840a2-config-volume\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:56.821641 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.821594 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:56.821712 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.821653 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99722c79-e7f0-4687-85aa-06ca3ad840a2-tmp-dir\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:56.821712 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.821682 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpnd2\" (UniqueName: \"kubernetes.io/projected/99722c79-e7f0-4687-85aa-06ca3ad840a2-kube-api-access-hpnd2\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:56.821712 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.821707 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w"
Apr 16 13:59:56.821853 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.821733 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j88wp\" (UniqueName: \"kubernetes.io/projected/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-kube-api-access-j88wp\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w"
Apr 16 13:59:56.821853 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:56.821762 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:56.821853 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:56.821834 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls podName:99722c79-e7f0-4687-85aa-06ca3ad840a2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:57.321813507 +0000 UTC m=+34.141178846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls") pod "dns-default-gmqgq" (UID: "99722c79-e7f0-4687-85aa-06ca3ad840a2") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:56.822009 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:56.821968 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:56.822063 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.822028 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99722c79-e7f0-4687-85aa-06ca3ad840a2-tmp-dir\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:56.822063 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.822052 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99722c79-e7f0-4687-85aa-06ca3ad840a2-config-volume\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:56.822153 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:56.822074 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert podName:68dcca7e-5f78-4645-b5f2-2e8fe47395cc nodeName:}" failed. No retries permitted until 2026-04-16 13:59:57.322055035 +0000 UTC m=+34.141420378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert") pod "ingress-canary-t7d2w" (UID: "68dcca7e-5f78-4645-b5f2-2e8fe47395cc") : secret "canary-serving-cert" not found
Apr 16 13:59:56.833959 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.833931 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j88wp\" (UniqueName: \"kubernetes.io/projected/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-kube-api-access-j88wp\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w"
Apr 16 13:59:56.834386 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:56.834360 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpnd2\" (UniqueName: \"kubernetes.io/projected/99722c79-e7f0-4687-85aa-06ca3ad840a2-kube-api-access-hpnd2\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:57.325943 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:57.325907 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:57.326169 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:57.325977 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w"
Apr 16 13:59:57.326169 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:57.326069 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:57.326169 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:57.326072 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:57.326169 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:57.326123 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert podName:68dcca7e-5f78-4645-b5f2-2e8fe47395cc nodeName:}" failed. No retries permitted until 2026-04-16 13:59:58.326109355 +0000 UTC m=+35.145474685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert") pod "ingress-canary-t7d2w" (UID: "68dcca7e-5f78-4645-b5f2-2e8fe47395cc") : secret "canary-serving-cert" not found
Apr 16 13:59:57.326169 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:57.326135 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls podName:99722c79-e7f0-4687-85aa-06ca3ad840a2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:58.326129576 +0000 UTC m=+35.145494907 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls") pod "dns-default-gmqgq" (UID: "99722c79-e7f0-4687-85aa-06ca3ad840a2") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:57.426656 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:57.426611 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:57.426854 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:57.426758 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:57.426854 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:57.426821 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs podName:2bdb9ab9-1a72-487e-8b6c-732d544d0454 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:29.426806713 +0000 UTC m=+66.246172044 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs") pod "network-metrics-daemon-5g5kq" (UID: "2bdb9ab9-1a72-487e-8b6c-732d544d0454") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:57.528007 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:57.527973 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs9m5\" (UniqueName: \"kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5\") pod \"network-check-target-mmfcz\" (UID: \"1ec67a7e-5f50-42d0-b878-f6ddc3826470\") " pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:57.528177 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:57.528152 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:57.528177 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:57.528176 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:57.528285 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:57.528186 2564 projected.go:194] Error preparing data for projected volume kube-api-access-gs9m5 for pod openshift-network-diagnostics/network-check-target-mmfcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:57.528285 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:57.528259 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5 podName:1ec67a7e-5f50-42d0-b878-f6ddc3826470 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:29.52823862 +0000 UTC m=+66.347603950 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs9m5" (UniqueName: "kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5") pod "network-check-target-mmfcz" (UID: "1ec67a7e-5f50-42d0-b878-f6ddc3826470") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:57.770058 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:57.769980 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 13:59:57.770058 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:57.770018 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz"
Apr 16 13:59:57.774147 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:57.774127 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 13:59:57.774147 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:57.774140 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m8ds6\""
Apr 16 13:59:57.774352 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:57.774153 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 13:59:57.774352 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:57.774131 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2qtwk\""
Apr 16 13:59:57.774352 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:57.774131 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 13:59:58.333925 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:58.333889 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 13:59:58.334217 ip-10-0-140-244 kubenswrapper[2564]: I0416 13:59:58.333956 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w"
Apr 16 13:59:58.334217 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:58.334053 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:58.334217 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:58.334077 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:58.334217 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:58.334133 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls podName:99722c79-e7f0-4687-85aa-06ca3ad840a2 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:00.334116446 +0000 UTC m=+37.153481776 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls") pod "dns-default-gmqgq" (UID: "99722c79-e7f0-4687-85aa-06ca3ad840a2") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:58.334217 ip-10-0-140-244 kubenswrapper[2564]: E0416 13:59:58.334149 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert podName:68dcca7e-5f78-4645-b5f2-2e8fe47395cc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:00.334143452 +0000 UTC m=+37.153508781 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert") pod "ingress-canary-t7d2w" (UID: "68dcca7e-5f78-4645-b5f2-2e8fe47395cc") : secret "canary-serving-cert" not found
Apr 16 14:00:00.349936 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:00.349711 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 14:00:00.349936 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:00.349909 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w"
Apr 16 14:00:00.350342 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:00.349863 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:00.350342 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:00.349985 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:00.350342 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:00.350008 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls podName:99722c79-e7f0-4687-85aa-06ca3ad840a2 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:04.349992115 +0000 UTC m=+41.169357445 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls") pod "dns-default-gmqgq" (UID: "99722c79-e7f0-4687-85aa-06ca3ad840a2") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:00.350342 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:00.350023 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert podName:68dcca7e-5f78-4645-b5f2-2e8fe47395cc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:04.350016387 +0000 UTC m=+41.169381717 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert") pod "ingress-canary-t7d2w" (UID: "68dcca7e-5f78-4645-b5f2-2e8fe47395cc") : secret "canary-serving-cert" not found
Apr 16 14:00:00.972547 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:00.972514 2564 generic.go:358] "Generic (PLEG): container finished" podID="7e8196db-0ad9-4936-a33e-c935a2815b53" containerID="aa7257586febf9e4aa99af43781d360c5263629d243c3a2ede1200221d19bd91" exitCode=0
Apr 16 14:00:00.972725 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:00.972572 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mxtjl" event={"ID":"7e8196db-0ad9-4936-a33e-c935a2815b53","Type":"ContainerDied","Data":"aa7257586febf9e4aa99af43781d360c5263629d243c3a2ede1200221d19bd91"}
Apr 16 14:00:01.976615 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:01.976582 2564 generic.go:358] "Generic (PLEG): container finished" podID="7e8196db-0ad9-4936-a33e-c935a2815b53" containerID="30b00079cf8a7aca627ab4eaaa9ddec70a764149aa33cf167daf3f9de80547bd" exitCode=0
Apr 16 14:00:01.977049 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:01.976639 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mxtjl" event={"ID":"7e8196db-0ad9-4936-a33e-c935a2815b53","Type":"ContainerDied","Data":"30b00079cf8a7aca627ab4eaaa9ddec70a764149aa33cf167daf3f9de80547bd"}
Apr 16 14:00:02.980606 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:02.980570 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mxtjl" event={"ID":"7e8196db-0ad9-4936-a33e-c935a2815b53","Type":"ContainerStarted","Data":"a6a70b6b0af9dc21175a074185f7073897107564a5c16b629fb6e776303722e0"}
Apr 16 14:00:03.006741 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:03.006687 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mxtjl" podStartSLOduration=6.138861418 podStartE2EDuration="40.00667382s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.977615384 +0000 UTC m=+2.796980731" lastFinishedPulling="2026-04-16 13:59:59.845427802 +0000 UTC m=+36.664793133" observedRunningTime="2026-04-16 14:00:03.005627634 +0000 UTC m=+39.824992987" watchObservedRunningTime="2026-04-16 14:00:03.00667382 +0000 UTC m=+39.826039172"
Apr 16 14:00:04.377457 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:04.377425 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w"
Apr 16 14:00:04.377800 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:04.377505 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 14:00:04.377800 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:04.377574 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:04.377800 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:04.377637 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert podName:68dcca7e-5f78-4645-b5f2-2e8fe47395cc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:12.377620367 +0000 UTC m=+49.196985701 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert") pod "ingress-canary-t7d2w" (UID: "68dcca7e-5f78-4645-b5f2-2e8fe47395cc") : secret "canary-serving-cert" not found
Apr 16 14:00:04.377800 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:04.377585 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:04.377800 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:04.377696 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls podName:99722c79-e7f0-4687-85aa-06ca3ad840a2 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:12.377685277 +0000 UTC m=+49.197050607 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls") pod "dns-default-gmqgq" (UID: "99722c79-e7f0-4687-85aa-06ca3ad840a2") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:09.214062 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:09.214023 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 14:00:09.217163 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:09.217132 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f657495-2777-434f-9e5e-c076ada48605-original-pull-secret\") pod \"global-pull-secret-syncer-5dpwb\" (UID: \"4f657495-2777-434f-9e5e-c076ada48605\") " pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 14:00:09.380078 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:09.380038 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5dpwb"
Apr 16 14:00:09.532287 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:09.532256 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5dpwb"]
Apr 16 14:00:09.535712 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:00:09.535667 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f657495_2777_434f_9e5e_c076ada48605.slice/crio-ab31cb3d5f22676b9b9166d3f5d49c29115f49f50d2ae44921578be9a52cd6b7 WatchSource:0}: Error finding container ab31cb3d5f22676b9b9166d3f5d49c29115f49f50d2ae44921578be9a52cd6b7: Status 404 returned error can't find the container with id ab31cb3d5f22676b9b9166d3f5d49c29115f49f50d2ae44921578be9a52cd6b7
Apr 16 14:00:09.993220 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:09.993182 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5dpwb" event={"ID":"4f657495-2777-434f-9e5e-c076ada48605","Type":"ContainerStarted","Data":"ab31cb3d5f22676b9b9166d3f5d49c29115f49f50d2ae44921578be9a52cd6b7"}
Apr 16 14:00:12.435247 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:12.435195 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w"
Apr 16 14:00:12.435646 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:12.435302 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq"
Apr 16 14:00:12.435646 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:12.435366 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:12.435646 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:12.435393 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:12.435646 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:12.435448 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert podName:68dcca7e-5f78-4645-b5f2-2e8fe47395cc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:28.435426945 +0000 UTC m=+65.254792276 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert") pod "ingress-canary-t7d2w" (UID: "68dcca7e-5f78-4645-b5f2-2e8fe47395cc") : secret "canary-serving-cert" not found
Apr 16 14:00:12.435646 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:12.435470 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls podName:99722c79-e7f0-4687-85aa-06ca3ad840a2 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:28.435460965 +0000 UTC m=+65.254826296 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls") pod "dns-default-gmqgq" (UID: "99722c79-e7f0-4687-85aa-06ca3ad840a2") : secret "dns-default-metrics-tls" not found Apr 16 14:00:15.003244 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:15.003191 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5dpwb" event={"ID":"4f657495-2777-434f-9e5e-c076ada48605","Type":"ContainerStarted","Data":"aac103f41e37b3bb0fb00b2499c2b117a77479b08367efe2a3b05abbdb642845"} Apr 16 14:00:15.040188 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:15.040135 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5dpwb" podStartSLOduration=33.652012774 podStartE2EDuration="38.040119914s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 14:00:09.537286661 +0000 UTC m=+46.356651994" lastFinishedPulling="2026-04-16 14:00:13.92539379 +0000 UTC m=+50.744759134" observedRunningTime="2026-04-16 14:00:15.039782659 +0000 UTC m=+51.859148012" watchObservedRunningTime="2026-04-16 14:00:15.040119914 +0000 UTC m=+51.859485294" Apr 16 14:00:22.964523 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:22.964492 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cc429" Apr 16 14:00:28.441763 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:28.441725 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w" Apr 16 14:00:28.442126 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:28.441789 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq" Apr 16 14:00:28.442126 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:28.441870 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:28.442126 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:28.441871 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:28.442126 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:28.441924 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert podName:68dcca7e-5f78-4645-b5f2-2e8fe47395cc nodeName:}" failed. No retries permitted until 2026-04-16 14:01:00.441908899 +0000 UTC m=+97.261274228 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert") pod "ingress-canary-t7d2w" (UID: "68dcca7e-5f78-4645-b5f2-2e8fe47395cc") : secret "canary-serving-cert" not found Apr 16 14:00:28.442126 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:28.441938 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls podName:99722c79-e7f0-4687-85aa-06ca3ad840a2 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:00.441931475 +0000 UTC m=+97.261296806 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls") pod "dns-default-gmqgq" (UID: "99722c79-e7f0-4687-85aa-06ca3ad840a2") : secret "dns-default-metrics-tls" not found Apr 16 14:00:29.448517 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:29.448480 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq" Apr 16 14:00:29.451104 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:29.451089 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:00:29.459374 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:29.459359 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:00:29.459437 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:00:29.459414 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs podName:2bdb9ab9-1a72-487e-8b6c-732d544d0454 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:33.459399643 +0000 UTC m=+130.278764972 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs") pod "network-metrics-daemon-5g5kq" (UID: "2bdb9ab9-1a72-487e-8b6c-732d544d0454") : secret "metrics-daemon-secret" not found Apr 16 14:00:29.548863 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:29.548832 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs9m5\" (UniqueName: \"kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5\") pod \"network-check-target-mmfcz\" (UID: \"1ec67a7e-5f50-42d0-b878-f6ddc3826470\") " pod="openshift-network-diagnostics/network-check-target-mmfcz" Apr 16 14:00:29.551842 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:29.551826 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:00:29.561559 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:29.561537 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:00:29.572209 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:29.572183 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs9m5\" (UniqueName: \"kubernetes.io/projected/1ec67a7e-5f50-42d0-b878-f6ddc3826470-kube-api-access-gs9m5\") pod \"network-check-target-mmfcz\" (UID: \"1ec67a7e-5f50-42d0-b878-f6ddc3826470\") " pod="openshift-network-diagnostics/network-check-target-mmfcz" Apr 16 14:00:29.590758 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:29.590733 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m8ds6\"" Apr 16 14:00:29.599110 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:29.599091 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mmfcz" Apr 16 14:00:29.712976 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:29.712905 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mmfcz"] Apr 16 14:00:29.716802 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:00:29.716776 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ec67a7e_5f50_42d0_b878_f6ddc3826470.slice/crio-b770e1b0d9e837b9a4a3e1655ec047af8c0a994f3ad88559ac6671e87b2d7241 WatchSource:0}: Error finding container b770e1b0d9e837b9a4a3e1655ec047af8c0a994f3ad88559ac6671e87b2d7241: Status 404 returned error can't find the container with id b770e1b0d9e837b9a4a3e1655ec047af8c0a994f3ad88559ac6671e87b2d7241 Apr 16 14:00:30.032286 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:30.032176 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mmfcz" event={"ID":"1ec67a7e-5f50-42d0-b878-f6ddc3826470","Type":"ContainerStarted","Data":"b770e1b0d9e837b9a4a3e1655ec047af8c0a994f3ad88559ac6671e87b2d7241"} Apr 16 14:00:33.039278 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:33.039242 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mmfcz" event={"ID":"1ec67a7e-5f50-42d0-b878-f6ddc3826470","Type":"ContainerStarted","Data":"5a483121acbfba35b82500e673b2931ca05cda98f2a93ceb90b48bc1eb74874f"} Apr 16 14:00:33.039669 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:33.039374 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mmfcz" Apr 16 14:00:33.055103 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:00:33.055059 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mmfcz" 
podStartSLOduration=67.233040415 podStartE2EDuration="1m10.055046421s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 14:00:29.719066471 +0000 UTC m=+66.538431815" lastFinishedPulling="2026-04-16 14:00:32.541072487 +0000 UTC m=+69.360437821" observedRunningTime="2026-04-16 14:00:33.054022573 +0000 UTC m=+69.873387940" watchObservedRunningTime="2026-04-16 14:00:33.055046421 +0000 UTC m=+69.874411769" Apr 16 14:01:00.464977 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:00.464836 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq" Apr 16 14:01:00.464977 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:00.464895 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w" Apr 16 14:01:00.465456 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:00.464992 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:01:00.465456 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:00.465006 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:01:00.465456 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:00.465053 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert podName:68dcca7e-5f78-4645-b5f2-2e8fe47395cc nodeName:}" failed. 
No retries permitted until 2026-04-16 14:02:04.465038654 +0000 UTC m=+161.284403987 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert") pod "ingress-canary-t7d2w" (UID: "68dcca7e-5f78-4645-b5f2-2e8fe47395cc") : secret "canary-serving-cert" not found Apr 16 14:01:00.465456 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:00.465074 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls podName:99722c79-e7f0-4687-85aa-06ca3ad840a2 nodeName:}" failed. No retries permitted until 2026-04-16 14:02:04.465058583 +0000 UTC m=+161.284423916 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls") pod "dns-default-gmqgq" (UID: "99722c79-e7f0-4687-85aa-06ca3ad840a2") : secret "dns-default-metrics-tls" not found Apr 16 14:01:04.043751 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:04.043720 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mmfcz" Apr 16 14:01:15.810240 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.810183 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-stk8w"] Apr 16 14:01:15.812231 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.812192 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-stk8w" Apr 16 14:01:15.813186 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.813166 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-54469996f6-6h8nf"] Apr 16 14:01:15.814918 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.814900 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 14:01:15.815100 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.815085 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.815812 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.815790 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:01:15.815812 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.815811 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-7bnxw\"" Apr 16 14:01:15.817732 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.817707 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 14:01:15.817937 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.817881 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 14:01:15.818006 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.817977 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 14:01:15.818057 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.818016 2564 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 14:01:15.818057 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.818030 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-8wzjx\"" Apr 16 14:01:15.818341 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.818324 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 14:01:15.818417 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.818355 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 14:01:15.825573 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.825553 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-stk8w"] Apr 16 14:01:15.830975 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.830948 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-54469996f6-6h8nf"] Apr 16 14:01:15.863976 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.863944 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.864149 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.863986 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drgk5\" (UniqueName: \"kubernetes.io/projected/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-kube-api-access-drgk5\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " 
pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.864149 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.864064 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nwgm\" (UniqueName: \"kubernetes.io/projected/2ef571a0-26fd-4b7c-a9c2-36cadb5a6ce3-kube-api-access-5nwgm\") pod \"volume-data-source-validator-7d955d5dd4-stk8w\" (UID: \"2ef571a0-26fd-4b7c-a9c2-36cadb5a6ce3\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-stk8w" Apr 16 14:01:15.864149 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.864099 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-default-certificate\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.864149 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.864121 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-stats-auth\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.864304 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.864218 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.913673 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.913639 2564 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv"] Apr 16 14:01:15.915569 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.915553 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" Apr 16 14:01:15.918242 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.918220 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 14:01:15.918361 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.918226 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:01:15.918361 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.918270 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 14:01:15.918587 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.918569 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-pm2rc\"" Apr 16 14:01:15.918748 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.918730 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 14:01:15.933814 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.933790 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv"] Apr 16 14:01:15.965489 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.965454 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle\") pod \"router-default-54469996f6-6h8nf\" (UID: 
\"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.965660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.965505 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.965660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.965536 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ab2e12-9693-40bc-9dd6-fc8e26d75737-serving-cert\") pod \"service-ca-operator-69965bb79d-bdglv\" (UID: \"e4ab2e12-9693-40bc-9dd6-fc8e26d75737\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" Apr 16 14:01:15.965660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.965563 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ab2e12-9693-40bc-9dd6-fc8e26d75737-config\") pod \"service-ca-operator-69965bb79d-bdglv\" (UID: \"e4ab2e12-9693-40bc-9dd6-fc8e26d75737\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" Apr 16 14:01:15.965660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.965598 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drgk5\" (UniqueName: \"kubernetes.io/projected/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-kube-api-access-drgk5\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.965660 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:15.965621 2564 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:01:15.965660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.965632 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nwgm\" (UniqueName: \"kubernetes.io/projected/2ef571a0-26fd-4b7c-a9c2-36cadb5a6ce3-kube-api-access-5nwgm\") pod \"volume-data-source-validator-7d955d5dd4-stk8w\" (UID: \"2ef571a0-26fd-4b7c-a9c2-36cadb5a6ce3\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-stk8w" Apr 16 14:01:15.965660 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:15.965647 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle podName:776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:16.465625472 +0000 UTC m=+113.284990816 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle") pod "router-default-54469996f6-6h8nf" (UID: "776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1") : configmap references non-existent config key: service-ca.crt Apr 16 14:01:15.966011 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:15.965696 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs podName:776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:16.46567937 +0000 UTC m=+113.285044705 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs") pod "router-default-54469996f6-6h8nf" (UID: "776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1") : secret "router-metrics-certs-default" not found Apr 16 14:01:15.966011 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.965726 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-default-certificate\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.966011 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.965754 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-stats-auth\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.966011 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.965777 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npmsl\" (UniqueName: \"kubernetes.io/projected/e4ab2e12-9693-40bc-9dd6-fc8e26d75737-kube-api-access-npmsl\") pod \"service-ca-operator-69965bb79d-bdglv\" (UID: \"e4ab2e12-9693-40bc-9dd6-fc8e26d75737\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" Apr 16 14:01:15.968217 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.968182 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-default-certificate\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " 
pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.968351 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.968231 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-stats-auth\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.975645 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.975619 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drgk5\" (UniqueName: \"kubernetes.io/projected/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-kube-api-access-drgk5\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf" Apr 16 14:01:15.975737 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:15.975670 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nwgm\" (UniqueName: \"kubernetes.io/projected/2ef571a0-26fd-4b7c-a9c2-36cadb5a6ce3-kube-api-access-5nwgm\") pod \"volume-data-source-validator-7d955d5dd4-stk8w\" (UID: \"2ef571a0-26fd-4b7c-a9c2-36cadb5a6ce3\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-stk8w" Apr 16 14:01:16.066823 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:16.066724 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ab2e12-9693-40bc-9dd6-fc8e26d75737-serving-cert\") pod \"service-ca-operator-69965bb79d-bdglv\" (UID: \"e4ab2e12-9693-40bc-9dd6-fc8e26d75737\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" Apr 16 14:01:16.066823 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:16.066767 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e4ab2e12-9693-40bc-9dd6-fc8e26d75737-config\") pod \"service-ca-operator-69965bb79d-bdglv\" (UID: \"e4ab2e12-9693-40bc-9dd6-fc8e26d75737\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" Apr 16 14:01:16.066823 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:16.066817 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npmsl\" (UniqueName: \"kubernetes.io/projected/e4ab2e12-9693-40bc-9dd6-fc8e26d75737-kube-api-access-npmsl\") pod \"service-ca-operator-69965bb79d-bdglv\" (UID: \"e4ab2e12-9693-40bc-9dd6-fc8e26d75737\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" Apr 16 14:01:16.067394 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:16.067372 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ab2e12-9693-40bc-9dd6-fc8e26d75737-config\") pod \"service-ca-operator-69965bb79d-bdglv\" (UID: \"e4ab2e12-9693-40bc-9dd6-fc8e26d75737\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" Apr 16 14:01:16.069209 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:16.069177 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ab2e12-9693-40bc-9dd6-fc8e26d75737-serving-cert\") pod \"service-ca-operator-69965bb79d-bdglv\" (UID: \"e4ab2e12-9693-40bc-9dd6-fc8e26d75737\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" Apr 16 14:01:16.077874 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:16.077851 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npmsl\" (UniqueName: \"kubernetes.io/projected/e4ab2e12-9693-40bc-9dd6-fc8e26d75737-kube-api-access-npmsl\") pod \"service-ca-operator-69965bb79d-bdglv\" (UID: \"e4ab2e12-9693-40bc-9dd6-fc8e26d75737\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" 
Apr 16 14:01:16.123050 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:16.123024 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-stk8w"
Apr 16 14:01:16.224529 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:16.224500 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv"
Apr 16 14:01:16.236471 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:16.236449 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-stk8w"]
Apr 16 14:01:16.239390 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:16.239359 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ef571a0_26fd_4b7c_a9c2_36cadb5a6ce3.slice/crio-857ce05f2fde3b03225ce5c1586f37778575733980e7897458d26f1610be0899 WatchSource:0}: Error finding container 857ce05f2fde3b03225ce5c1586f37778575733980e7897458d26f1610be0899: Status 404 returned error can't find the container with id 857ce05f2fde3b03225ce5c1586f37778575733980e7897458d26f1610be0899
Apr 16 14:01:16.336557 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:16.336502 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv"]
Apr 16 14:01:16.339906 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:16.339882 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ab2e12_9693_40bc_9dd6_fc8e26d75737.slice/crio-07f05f7203ed0492cc96eb6df5cbecec7aec1324176b5488d02fb4468d2c91e0 WatchSource:0}: Error finding container 07f05f7203ed0492cc96eb6df5cbecec7aec1324176b5488d02fb4468d2c91e0: Status 404 returned error can't find the container with id 07f05f7203ed0492cc96eb6df5cbecec7aec1324176b5488d02fb4468d2c91e0
Apr 16 14:01:16.471548 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:16.471513 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:16.471548 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:16.471562 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:16.471775 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:16.471668 2564 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:01:16.471775 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:16.471695 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle podName:776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:17.471677793 +0000 UTC m=+114.291043128 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle") pod "router-default-54469996f6-6h8nf" (UID: "776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1") : configmap references non-existent config key: service-ca.crt
Apr 16 14:01:16.471775 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:16.471728 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs podName:776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:17.471719345 +0000 UTC m=+114.291084675 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs") pod "router-default-54469996f6-6h8nf" (UID: "776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1") : secret "router-metrics-certs-default" not found
Apr 16 14:01:17.122005 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:17.121962 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" event={"ID":"e4ab2e12-9693-40bc-9dd6-fc8e26d75737","Type":"ContainerStarted","Data":"07f05f7203ed0492cc96eb6df5cbecec7aec1324176b5488d02fb4468d2c91e0"}
Apr 16 14:01:17.122995 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:17.122967 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-stk8w" event={"ID":"2ef571a0-26fd-4b7c-a9c2-36cadb5a6ce3","Type":"ContainerStarted","Data":"857ce05f2fde3b03225ce5c1586f37778575733980e7897458d26f1610be0899"}
Apr 16 14:01:17.479963 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:17.479865 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:17.479963 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:17.479926 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:17.480163 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:17.480083 2564 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:01:17.480163 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:17.480088 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle podName:776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:19.480062725 +0000 UTC m=+116.299428057 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle") pod "router-default-54469996f6-6h8nf" (UID: "776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1") : configmap references non-existent config key: service-ca.crt
Apr 16 14:01:17.480269 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:17.480179 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs podName:776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:19.480166007 +0000 UTC m=+116.299531337 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs") pod "router-default-54469996f6-6h8nf" (UID: "776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1") : secret "router-metrics-certs-default" not found
Apr 16 14:01:18.125925 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:18.125834 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" event={"ID":"e4ab2e12-9693-40bc-9dd6-fc8e26d75737","Type":"ContainerStarted","Data":"392975e570554e3580557b99ba8f831ce8f90c3cc22afcf1b0918b3f09c76cd1"}
Apr 16 14:01:18.143422 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:18.143351 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" podStartSLOduration=1.647823975 podStartE2EDuration="3.143338007s" podCreationTimestamp="2026-04-16 14:01:15 +0000 UTC" firstStartedPulling="2026-04-16 14:01:16.341552082 +0000 UTC m=+113.160917425" lastFinishedPulling="2026-04-16 14:01:17.837066122 +0000 UTC m=+114.656431457" observedRunningTime="2026-04-16 14:01:18.142526322 +0000 UTC m=+114.961891671" watchObservedRunningTime="2026-04-16 14:01:18.143338007 +0000 UTC m=+114.962703364"
Apr 16 14:01:19.494862 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:19.494821 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:19.495317 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:19.494904 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:19.495317 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:19.494966 2564 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:01:19.495317 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:19.495008 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle podName:776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:23.49499446 +0000 UTC m=+120.314359790 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle") pod "router-default-54469996f6-6h8nf" (UID: "776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1") : configmap references non-existent config key: service-ca.crt
Apr 16 14:01:19.495317 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:19.495023 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs podName:776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:23.495016659 +0000 UTC m=+120.314381989 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs") pod "router-default-54469996f6-6h8nf" (UID: "776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1") : secret "router-metrics-certs-default" not found
Apr 16 14:01:20.579351 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:20.579322 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-g66gt"]
Apr 16 14:01:20.581318 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:20.581292 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-g66gt"
Apr 16 14:01:20.583967 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:20.583936 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-9bkgs\""
Apr 16 14:01:20.584927 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:20.584897 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 14:01:20.585032 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:20.584933 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 14:01:20.593267 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:20.593246 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-g66gt"]
Apr 16 14:01:20.703627 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:20.703596 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t4j5\" (UniqueName: \"kubernetes.io/projected/980778ce-1a4a-40fb-ac6b-c6ce367030e2-kube-api-access-9t4j5\") pod \"migrator-64d4d94569-g66gt\" (UID: \"980778ce-1a4a-40fb-ac6b-c6ce367030e2\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-g66gt"
Apr 16 14:01:20.804299 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:20.804254 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9t4j5\" (UniqueName: \"kubernetes.io/projected/980778ce-1a4a-40fb-ac6b-c6ce367030e2-kube-api-access-9t4j5\") pod \"migrator-64d4d94569-g66gt\" (UID: \"980778ce-1a4a-40fb-ac6b-c6ce367030e2\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-g66gt"
Apr 16 14:01:20.816041 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:20.816012 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t4j5\" (UniqueName: \"kubernetes.io/projected/980778ce-1a4a-40fb-ac6b-c6ce367030e2-kube-api-access-9t4j5\") pod \"migrator-64d4d94569-g66gt\" (UID: \"980778ce-1a4a-40fb-ac6b-c6ce367030e2\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-g66gt"
Apr 16 14:01:20.892284 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:20.892183 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-g66gt"
Apr 16 14:01:21.096244 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:21.095617 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-g66gt"]
Apr 16 14:01:21.100390 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:21.100346 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod980778ce_1a4a_40fb_ac6b_c6ce367030e2.slice/crio-e0b2c299aeac2fc35f03d38a727b48848de9a6862b6a0ac27ab29f842732cc87 WatchSource:0}: Error finding container e0b2c299aeac2fc35f03d38a727b48848de9a6862b6a0ac27ab29f842732cc87: Status 404 returned error can't find the container with id e0b2c299aeac2fc35f03d38a727b48848de9a6862b6a0ac27ab29f842732cc87
Apr 16 14:01:21.132300 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:21.132263 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-stk8w" event={"ID":"2ef571a0-26fd-4b7c-a9c2-36cadb5a6ce3","Type":"ContainerStarted","Data":"a161dd894141ccde59b15b7cc4ed5728c439ca784bdd0e969af82a9128c7f0b6"}
Apr 16 14:01:21.133294 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:21.133272 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-g66gt" event={"ID":"980778ce-1a4a-40fb-ac6b-c6ce367030e2","Type":"ContainerStarted","Data":"e0b2c299aeac2fc35f03d38a727b48848de9a6862b6a0ac27ab29f842732cc87"}
Apr 16 14:01:21.147500 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:21.147402 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-stk8w" podStartSLOduration=1.384044061 podStartE2EDuration="6.147384598s" podCreationTimestamp="2026-04-16 14:01:15 +0000 UTC" firstStartedPulling="2026-04-16 14:01:16.241513783 +0000 UTC m=+113.060879119" lastFinishedPulling="2026-04-16 14:01:21.004854325 +0000 UTC m=+117.824219656" observedRunningTime="2026-04-16 14:01:21.146628507 +0000 UTC m=+117.965993858" watchObservedRunningTime="2026-04-16 14:01:21.147384598 +0000 UTC m=+117.966749951"
Apr 16 14:01:22.150500 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.150389 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-7wgs8"]
Apr 16 14:01:22.152294 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.152276 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8"
Apr 16 14:01:22.154682 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.154658 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 14:01:22.155677 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.155653 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 14:01:22.155825 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.155718 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 14:01:22.155825 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.155739 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 14:01:22.155825 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.155746 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bm579\""
Apr 16 14:01:22.164508 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.164484 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-7wgs8"]
Apr 16 14:01:22.213881 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.213852 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/81c9a1b1-4671-4ced-8400-416e0dfb4b49-signing-key\") pod \"service-ca-bfc587fb7-7wgs8\" (UID: \"81c9a1b1-4671-4ced-8400-416e0dfb4b49\") " pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8"
Apr 16 14:01:22.214039 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.213936 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/81c9a1b1-4671-4ced-8400-416e0dfb4b49-signing-cabundle\") pod \"service-ca-bfc587fb7-7wgs8\" (UID: \"81c9a1b1-4671-4ced-8400-416e0dfb4b49\") " pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8"
Apr 16 14:01:22.214102 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.214047 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwhwk\" (UniqueName: \"kubernetes.io/projected/81c9a1b1-4671-4ced-8400-416e0dfb4b49-kube-api-access-vwhwk\") pod \"service-ca-bfc587fb7-7wgs8\" (UID: \"81c9a1b1-4671-4ced-8400-416e0dfb4b49\") " pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8"
Apr 16 14:01:22.314842 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.314809 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/81c9a1b1-4671-4ced-8400-416e0dfb4b49-signing-cabundle\") pod \"service-ca-bfc587fb7-7wgs8\" (UID: \"81c9a1b1-4671-4ced-8400-416e0dfb4b49\") " pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8"
Apr 16 14:01:22.314994 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.314884 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwhwk\" (UniqueName: \"kubernetes.io/projected/81c9a1b1-4671-4ced-8400-416e0dfb4b49-kube-api-access-vwhwk\") pod \"service-ca-bfc587fb7-7wgs8\" (UID: \"81c9a1b1-4671-4ced-8400-416e0dfb4b49\") " pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8"
Apr 16 14:01:22.314994 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.314939 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/81c9a1b1-4671-4ced-8400-416e0dfb4b49-signing-key\") pod \"service-ca-bfc587fb7-7wgs8\" (UID: \"81c9a1b1-4671-4ced-8400-416e0dfb4b49\") " pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8"
Apr 16 14:01:22.315603 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.315580 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/81c9a1b1-4671-4ced-8400-416e0dfb4b49-signing-cabundle\") pod \"service-ca-bfc587fb7-7wgs8\" (UID: \"81c9a1b1-4671-4ced-8400-416e0dfb4b49\") " pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8"
Apr 16 14:01:22.317407 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.317386 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/81c9a1b1-4671-4ced-8400-416e0dfb4b49-signing-key\") pod \"service-ca-bfc587fb7-7wgs8\" (UID: \"81c9a1b1-4671-4ced-8400-416e0dfb4b49\") " pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8"
Apr 16 14:01:22.323739 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.323717 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwhwk\" (UniqueName: \"kubernetes.io/projected/81c9a1b1-4671-4ced-8400-416e0dfb4b49-kube-api-access-vwhwk\") pod \"service-ca-bfc587fb7-7wgs8\" (UID: \"81c9a1b1-4671-4ced-8400-416e0dfb4b49\") " pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8"
Apr 16 14:01:22.461649 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.461554 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8"
Apr 16 14:01:22.577033 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:22.576990 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-7wgs8"]
Apr 16 14:01:22.580452 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:22.580429 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c9a1b1_4671_4ced_8400_416e0dfb4b49.slice/crio-e3868b929fbe7acbb20767e597efa95898c9c6c8c7d2eaca4560a0d2d19bd709 WatchSource:0}: Error finding container e3868b929fbe7acbb20767e597efa95898c9c6c8c7d2eaca4560a0d2d19bd709: Status 404 returned error can't find the container with id e3868b929fbe7acbb20767e597efa95898c9c6c8c7d2eaca4560a0d2d19bd709
Apr 16 14:01:23.139700 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:23.139665 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-g66gt" event={"ID":"980778ce-1a4a-40fb-ac6b-c6ce367030e2","Type":"ContainerStarted","Data":"701b7aec1be0b0050323eaffec20998332d5f41acfb9e813e82732e4883514cb"}
Apr 16 14:01:23.139700 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:23.139703 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-g66gt" event={"ID":"980778ce-1a4a-40fb-ac6b-c6ce367030e2","Type":"ContainerStarted","Data":"62cf1a62d37e9d987317a3f3e30a2f2f02b69c08527883bd0e00a733713ea45d"}
Apr 16 14:01:23.140865 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:23.140841 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8" event={"ID":"81c9a1b1-4671-4ced-8400-416e0dfb4b49","Type":"ContainerStarted","Data":"bc8dcc9a21f33d6ac95fb97e807386585315c4a7a80b2ecf93d733cc298e2bfe"}
Apr 16 14:01:23.140976 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:23.140873 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8" event={"ID":"81c9a1b1-4671-4ced-8400-416e0dfb4b49","Type":"ContainerStarted","Data":"e3868b929fbe7acbb20767e597efa95898c9c6c8c7d2eaca4560a0d2d19bd709"}
Apr 16 14:01:23.175913 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:23.175857 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-g66gt" podStartSLOduration=2.178948046 podStartE2EDuration="3.175842409s" podCreationTimestamp="2026-04-16 14:01:20 +0000 UTC" firstStartedPulling="2026-04-16 14:01:21.102433155 +0000 UTC m=+117.921798500" lastFinishedPulling="2026-04-16 14:01:22.099327529 +0000 UTC m=+118.918692863" observedRunningTime="2026-04-16 14:01:23.174861499 +0000 UTC m=+119.994226851" watchObservedRunningTime="2026-04-16 14:01:23.175842409 +0000 UTC m=+119.995207763"
Apr 16 14:01:23.193935 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:23.193894 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-7wgs8" podStartSLOduration=1.193879358 podStartE2EDuration="1.193879358s" podCreationTimestamp="2026-04-16 14:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:23.192789708 +0000 UTC m=+120.012155060" watchObservedRunningTime="2026-04-16 14:01:23.193879358 +0000 UTC m=+120.013244700"
Apr 16 14:01:23.522141 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:23.522039 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:23.522141 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:23.522097 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:23.522344 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:23.522186 2564 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:01:23.522344 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:23.522254 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle podName:776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:31.522232571 +0000 UTC m=+128.341597901 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle") pod "router-default-54469996f6-6h8nf" (UID: "776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1") : configmap references non-existent config key: service-ca.crt
Apr 16 14:01:23.522344 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:23.522280 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs podName:776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:31.522272966 +0000 UTC m=+128.341638295 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs") pod "router-default-54469996f6-6h8nf" (UID: "776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1") : secret "router-metrics-certs-default" not found
Apr 16 14:01:24.113261 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:24.113228 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qzz48_69e2944f-6e8b-40c7-be64-0a3d77f6c3fd/dns-node-resolver/0.log"
Apr 16 14:01:25.113642 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:25.113615 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-m6p2z_1e3a02e8-43e4-4f89-a584-53fbf95d94cf/node-ca/0.log"
Apr 16 14:01:26.313777 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:26.313746 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-g66gt_980778ce-1a4a-40fb-ac6b-c6ce367030e2/migrator/0.log"
Apr 16 14:01:26.516019 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:26.515986 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-g66gt_980778ce-1a4a-40fb-ac6b-c6ce367030e2/graceful-termination/0.log"
Apr 16 14:01:31.595734 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:31.595693 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:31.596263 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:31.595750 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:31.596445 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:31.596426 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-service-ca-bundle\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:31.598175 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:31.598153 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1-metrics-certs\") pod \"router-default-54469996f6-6h8nf\" (UID: \"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1\") " pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:31.730321 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:31.730289 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-8wzjx\""
Apr 16 14:01:31.738555 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:31.738531 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:31.877958 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:31.877746 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-54469996f6-6h8nf"]
Apr 16 14:01:31.881891 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:31.881865 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod776799ab_6fb3_4f6c_b80f_a5c4a9a7b6d1.slice/crio-42e5d8689e2557de8707496bf90f680915baa4450f7c310c8284765623de7da9 WatchSource:0}: Error finding container 42e5d8689e2557de8707496bf90f680915baa4450f7c310c8284765623de7da9: Status 404 returned error can't find the container with id 42e5d8689e2557de8707496bf90f680915baa4450f7c310c8284765623de7da9
Apr 16 14:01:32.164254 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:32.164152 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54469996f6-6h8nf" event={"ID":"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1","Type":"ContainerStarted","Data":"a91bbefe8ec16cf37926170567246c1059928b63261ed82d965c8f5ccd1715ec"}
Apr 16 14:01:32.164254 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:32.164194 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54469996f6-6h8nf" event={"ID":"776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1","Type":"ContainerStarted","Data":"42e5d8689e2557de8707496bf90f680915baa4450f7c310c8284765623de7da9"}
Apr 16 14:01:32.196444 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:32.196384 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-54469996f6-6h8nf" podStartSLOduration=17.196365683 podStartE2EDuration="17.196365683s" podCreationTimestamp="2026-04-16 14:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:32.194192116 +0000 UTC m=+129.013557469" watchObservedRunningTime="2026-04-16 14:01:32.196365683 +0000 UTC m=+129.015731036"
Apr 16 14:01:32.739753 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:32.739715 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:32.742305 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:32.742282 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:33.166887 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:33.166856 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:33.168077 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:33.168058 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-54469996f6-6h8nf"
Apr 16 14:01:33.508687 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:33.508590 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 14:01:33.511048 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:33.511013 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bdb9ab9-1a72-487e-8b6c-732d544d0454-metrics-certs\") pod \"network-metrics-daemon-5g5kq\" (UID: \"2bdb9ab9-1a72-487e-8b6c-732d544d0454\") " pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 14:01:33.784946 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:33.784867 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2qtwk\""
Apr 16 14:01:33.792851 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:33.792832 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g5kq"
Apr 16 14:01:33.908583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:33.908451 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5g5kq"]
Apr 16 14:01:33.911250 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:33.911221 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bdb9ab9_1a72_487e_8b6c_732d544d0454.slice/crio-56ea6194e0cae459126f874efcdf14777b83fe16d348af46221542647c7767ab WatchSource:0}: Error finding container 56ea6194e0cae459126f874efcdf14777b83fe16d348af46221542647c7767ab: Status 404 returned error can't find the container with id 56ea6194e0cae459126f874efcdf14777b83fe16d348af46221542647c7767ab
Apr 16 14:01:34.169926 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:34.169897 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5g5kq" event={"ID":"2bdb9ab9-1a72-487e-8b6c-732d544d0454","Type":"ContainerStarted","Data":"56ea6194e0cae459126f874efcdf14777b83fe16d348af46221542647c7767ab"}
Apr 16 14:01:35.174597 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:35.174551 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5g5kq" event={"ID":"2bdb9ab9-1a72-487e-8b6c-732d544d0454","Type":"ContainerStarted","Data":"a92bee2f4f954ecc6119c66a3848556039d9d1834a8211f9685a320e3c2754a0"}
Apr 16 14:01:36.179345 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:36.179312 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5g5kq"
event={"ID":"2bdb9ab9-1a72-487e-8b6c-732d544d0454","Type":"ContainerStarted","Data":"3a2ddb8813e3e1e4115db61555ab92bb181bddac16ed0eca7fb3eebc4d8e6d3c"} Apr 16 14:01:36.200446 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:36.200396 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5g5kq" podStartSLOduration=132.091142741 podStartE2EDuration="2m13.200380636s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 14:01:33.913436561 +0000 UTC m=+130.732801895" lastFinishedPulling="2026-04-16 14:01:35.022674445 +0000 UTC m=+131.842039790" observedRunningTime="2026-04-16 14:01:36.198459968 +0000 UTC m=+133.017825319" watchObservedRunningTime="2026-04-16 14:01:36.200380636 +0000 UTC m=+133.019745988" Apr 16 14:01:45.198037 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.198007 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw"] Apr 16 14:01:45.200397 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.200370 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m"] Apr 16 14:01:45.200538 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.200492 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw" Apr 16 14:01:45.202936 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.202916 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m" Apr 16 14:01:45.206737 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.206717 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-cmtl2\"" Apr 16 14:01:45.206859 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.206719 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 14:01:45.206923 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.206890 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 14:01:45.206994 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.206972 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-b8fzb\"" Apr 16 14:01:45.207104 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.207044 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 14:01:45.214983 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.214958 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m"] Apr 16 14:01:45.219436 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.219412 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw"] Apr 16 14:01:45.222597 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.222574 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-n7cgt"] Apr 16 14:01:45.224885 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.224870 2564 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.231669 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.231647 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:01:45.231841 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.231697 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fhg6t\"" Apr 16 14:01:45.231841 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.231645 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:01:45.232584 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.232538 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:01:45.232735 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.232609 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:01:45.264109 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.264077 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n7cgt"] Apr 16 14:01:45.294168 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.294139 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/bd0785b3-a6d7-4f47-a8ba-7668d4a2655d-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-mjv5m\" (UID: \"bd0785b3-a6d7-4f47-a8ba-7668d4a2655d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m" Apr 16 14:01:45.294399 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.294173 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/61cf3e35-c894-4e37-8617-ecc7b775e788-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-s2wmw\" (UID: \"61cf3e35-c894-4e37-8617-ecc7b775e788\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw" Apr 16 14:01:45.294399 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.294300 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/61cf3e35-c894-4e37-8617-ecc7b775e788-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-s2wmw\" (UID: \"61cf3e35-c894-4e37-8617-ecc7b775e788\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw" Apr 16 14:01:45.394699 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.394664 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dc720c25-0498-47d4-ab0f-fd387e9f2072-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.394699 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.394704 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dc720c25-0498-47d4-ab0f-fd387e9f2072-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.394901 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.394728 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64456\" (UniqueName: 
\"kubernetes.io/projected/dc720c25-0498-47d4-ab0f-fd387e9f2072-kube-api-access-64456\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.394901 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.394830 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/61cf3e35-c894-4e37-8617-ecc7b775e788-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-s2wmw\" (UID: \"61cf3e35-c894-4e37-8617-ecc7b775e788\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw" Apr 16 14:01:45.394901 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.394881 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/bd0785b3-a6d7-4f47-a8ba-7668d4a2655d-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-mjv5m\" (UID: \"bd0785b3-a6d7-4f47-a8ba-7668d4a2655d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m" Apr 16 14:01:45.395010 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.394905 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dc720c25-0498-47d4-ab0f-fd387e9f2072-crio-socket\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.395010 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.394925 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/61cf3e35-c894-4e37-8617-ecc7b775e788-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-s2wmw\" (UID: \"61cf3e35-c894-4e37-8617-ecc7b775e788\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw" Apr 16 14:01:45.395010 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.394941 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dc720c25-0498-47d4-ab0f-fd387e9f2072-data-volume\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.395602 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.395583 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/61cf3e35-c894-4e37-8617-ecc7b775e788-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-s2wmw\" (UID: \"61cf3e35-c894-4e37-8617-ecc7b775e788\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw" Apr 16 14:01:45.397292 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.397271 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/61cf3e35-c894-4e37-8617-ecc7b775e788-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-s2wmw\" (UID: \"61cf3e35-c894-4e37-8617-ecc7b775e788\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw" Apr 16 14:01:45.397763 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.397746 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/bd0785b3-a6d7-4f47-a8ba-7668d4a2655d-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-mjv5m\" (UID: \"bd0785b3-a6d7-4f47-a8ba-7668d4a2655d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m" Apr 16 14:01:45.495655 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.495569 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-64456\" (UniqueName: \"kubernetes.io/projected/dc720c25-0498-47d4-ab0f-fd387e9f2072-kube-api-access-64456\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.495804 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.495715 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dc720c25-0498-47d4-ab0f-fd387e9f2072-crio-socket\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.495804 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.495737 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dc720c25-0498-47d4-ab0f-fd387e9f2072-data-volume\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.495804 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.495779 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dc720c25-0498-47d4-ab0f-fd387e9f2072-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.495923 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.495802 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dc720c25-0498-47d4-ab0f-fd387e9f2072-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " 
pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.495923 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.495821 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dc720c25-0498-47d4-ab0f-fd387e9f2072-crio-socket\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.496136 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.496118 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dc720c25-0498-47d4-ab0f-fd387e9f2072-data-volume\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.496352 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.496325 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dc720c25-0498-47d4-ab0f-fd387e9f2072-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.498669 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.498648 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dc720c25-0498-47d4-ab0f-fd387e9f2072-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.505034 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.505004 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64456\" (UniqueName: 
\"kubernetes.io/projected/dc720c25-0498-47d4-ab0f-fd387e9f2072-kube-api-access-64456\") pod \"insights-runtime-extractor-n7cgt\" (UID: \"dc720c25-0498-47d4-ab0f-fd387e9f2072\") " pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.512510 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.512491 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw" Apr 16 14:01:45.518125 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.518106 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m" Apr 16 14:01:45.535853 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.535816 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n7cgt" Apr 16 14:01:45.661147 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.660972 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw"] Apr 16 14:01:45.664884 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:45.664859 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61cf3e35_c894_4e37_8617_ecc7b775e788.slice/crio-448e870169bd77faa6a916cd47ed51dfaa7cf795c1c09356206d38dde20d3f28 WatchSource:0}: Error finding container 448e870169bd77faa6a916cd47ed51dfaa7cf795c1c09356206d38dde20d3f28: Status 404 returned error can't find the container with id 448e870169bd77faa6a916cd47ed51dfaa7cf795c1c09356206d38dde20d3f28 Apr 16 14:01:45.680406 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.680361 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m"] Apr 16 14:01:45.682711 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:45.682679 
2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd0785b3_a6d7_4f47_a8ba_7668d4a2655d.slice/crio-facc7fdfa529b88df8f10f5841909f45c38eb342afde7ceb3c8f606ff39a641a WatchSource:0}: Error finding container facc7fdfa529b88df8f10f5841909f45c38eb342afde7ceb3c8f606ff39a641a: Status 404 returned error can't find the container with id facc7fdfa529b88df8f10f5841909f45c38eb342afde7ceb3c8f606ff39a641a Apr 16 14:01:45.697900 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:45.697872 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n7cgt"] Apr 16 14:01:45.703109 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:45.703081 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc720c25_0498_47d4_ab0f_fd387e9f2072.slice/crio-3ab60fb922bcb70c3603a18003ff268f0644a6b2d7031f4e7b7018660c9606b9 WatchSource:0}: Error finding container 3ab60fb922bcb70c3603a18003ff268f0644a6b2d7031f4e7b7018660c9606b9: Status 404 returned error can't find the container with id 3ab60fb922bcb70c3603a18003ff268f0644a6b2d7031f4e7b7018660c9606b9 Apr 16 14:01:46.205282 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:46.205243 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m" event={"ID":"bd0785b3-a6d7-4f47-a8ba-7668d4a2655d","Type":"ContainerStarted","Data":"facc7fdfa529b88df8f10f5841909f45c38eb342afde7ceb3c8f606ff39a641a"} Apr 16 14:01:46.206173 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:46.206150 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw" event={"ID":"61cf3e35-c894-4e37-8617-ecc7b775e788","Type":"ContainerStarted","Data":"448e870169bd77faa6a916cd47ed51dfaa7cf795c1c09356206d38dde20d3f28"} Apr 16 14:01:46.207277 
ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:46.207255 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n7cgt" event={"ID":"dc720c25-0498-47d4-ab0f-fd387e9f2072","Type":"ContainerStarted","Data":"d97cb0d66dec823f1db859e5edad4e93c8e8dd4cb6595fed71064277e29d9f7f"} Apr 16 14:01:46.207361 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:46.207282 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n7cgt" event={"ID":"dc720c25-0498-47d4-ab0f-fd387e9f2072","Type":"ContainerStarted","Data":"3ab60fb922bcb70c3603a18003ff268f0644a6b2d7031f4e7b7018660c9606b9"} Apr 16 14:01:47.211487 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:47.211456 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n7cgt" event={"ID":"dc720c25-0498-47d4-ab0f-fd387e9f2072","Type":"ContainerStarted","Data":"aa289cc3e7a44e53896ad22ae1083e533544f80c56feb305bfc203198ca97666"} Apr 16 14:01:47.212866 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:47.212838 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m" event={"ID":"bd0785b3-a6d7-4f47-a8ba-7668d4a2655d","Type":"ContainerStarted","Data":"8e87d7a95a03185707252fa13622a8f08b8eea663b88c38e9bdf9692407cfaf4"} Apr 16 14:01:47.213050 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:47.213023 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m" Apr 16 14:01:47.214306 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:47.214273 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw" event={"ID":"61cf3e35-c894-4e37-8617-ecc7b775e788","Type":"ContainerStarted","Data":"727664433c738e6304f7579d9bc122480ba785a2b49e640bbf14ca38eed95160"} Apr 16 
14:01:47.214577 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:47.214554 2564 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-9cb97cd87-mjv5m container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.133.0.14:8443/healthz\": dial tcp 10.133.0.14:8443: connect: connection refused" start-of-body= Apr 16 14:01:47.214662 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:47.214590 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m" podUID="bd0785b3-a6d7-4f47-a8ba-7668d4a2655d" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.133.0.14:8443/healthz\": dial tcp 10.133.0.14:8443: connect: connection refused" Apr 16 14:01:47.248393 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:47.248339 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s2wmw" podStartSLOduration=0.764819455 podStartE2EDuration="2.248324022s" podCreationTimestamp="2026-04-16 14:01:45 +0000 UTC" firstStartedPulling="2026-04-16 14:01:45.668089382 +0000 UTC m=+142.487454727" lastFinishedPulling="2026-04-16 14:01:47.151593964 +0000 UTC m=+143.970959294" observedRunningTime="2026-04-16 14:01:47.246514725 +0000 UTC m=+144.065880078" watchObservedRunningTime="2026-04-16 14:01:47.248324022 +0000 UTC m=+144.067689374" Apr 16 14:01:47.248582 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:47.248413 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m" podStartSLOduration=0.780153056 podStartE2EDuration="2.248408172s" podCreationTimestamp="2026-04-16 14:01:45 +0000 UTC" firstStartedPulling="2026-04-16 14:01:45.684564177 +0000 UTC m=+142.503929521" lastFinishedPulling="2026-04-16 
14:01:47.152819306 +0000 UTC m=+143.972184637" observedRunningTime="2026-04-16 14:01:47.230302905 +0000 UTC m=+144.049668270" watchObservedRunningTime="2026-04-16 14:01:47.248408172 +0000 UTC m=+144.067773526" Apr 16 14:01:48.218417 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.218372 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n7cgt" event={"ID":"dc720c25-0498-47d4-ab0f-fd387e9f2072","Type":"ContainerStarted","Data":"92290506f1855bebd992b75b2dfc2885bc6f773e14c436f85c6c7e6fd70db8fe"} Apr 16 14:01:48.222388 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.222367 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mjv5m" Apr 16 14:01:48.238008 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.237965 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-n7cgt" podStartSLOduration=0.873006555 podStartE2EDuration="3.237953751s" podCreationTimestamp="2026-04-16 14:01:45 +0000 UTC" firstStartedPulling="2026-04-16 14:01:45.757062145 +0000 UTC m=+142.576427484" lastFinishedPulling="2026-04-16 14:01:48.122009338 +0000 UTC m=+144.941374680" observedRunningTime="2026-04-16 14:01:48.237136558 +0000 UTC m=+145.056501921" watchObservedRunningTime="2026-04-16 14:01:48.237953751 +0000 UTC m=+145.057319405" Apr 16 14:01:48.504467 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.504376 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-cvgtl"] Apr 16 14:01:48.506524 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.506501 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:48.509356 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.509331 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 14:01:48.509498 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.509372 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 14:01:48.509498 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.509418 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-xw5v6\"" Apr 16 14:01:48.509498 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.509426 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:01:48.509627 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.509564 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:01:48.510423 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.510407 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:01:48.517023 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.517000 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-cvgtl"] Apr 16 14:01:48.619198 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.619162 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6bc5cd09-359e-4c1f-b044-6dda8378ff82-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-cvgtl\" 
(UID: \"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:48.619387 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.619241 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w782\" (UniqueName: \"kubernetes.io/projected/6bc5cd09-359e-4c1f-b044-6dda8378ff82-kube-api-access-8w782\") pod \"prometheus-operator-78f957474d-cvgtl\" (UID: \"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:48.619387 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.619273 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6bc5cd09-359e-4c1f-b044-6dda8378ff82-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-cvgtl\" (UID: \"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:48.619387 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.619339 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6bc5cd09-359e-4c1f-b044-6dda8378ff82-metrics-client-ca\") pod \"prometheus-operator-78f957474d-cvgtl\" (UID: \"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:48.720076 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.720043 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6bc5cd09-359e-4c1f-b044-6dda8378ff82-metrics-client-ca\") pod \"prometheus-operator-78f957474d-cvgtl\" (UID: \"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:48.720271 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:01:48.720084 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6bc5cd09-359e-4c1f-b044-6dda8378ff82-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-cvgtl\" (UID: \"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:48.720271 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.720121 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8w782\" (UniqueName: \"kubernetes.io/projected/6bc5cd09-359e-4c1f-b044-6dda8378ff82-kube-api-access-8w782\") pod \"prometheus-operator-78f957474d-cvgtl\" (UID: \"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:48.720271 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.720152 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6bc5cd09-359e-4c1f-b044-6dda8378ff82-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-cvgtl\" (UID: \"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:48.720397 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:48.720293 2564 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 14:01:48.720397 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:48.720366 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bc5cd09-359e-4c1f-b044-6dda8378ff82-prometheus-operator-tls podName:6bc5cd09-359e-4c1f-b044-6dda8378ff82 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:49.220347378 +0000 UTC m=+146.039712724 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/6bc5cd09-359e-4c1f-b044-6dda8378ff82-prometheus-operator-tls") pod "prometheus-operator-78f957474d-cvgtl" (UID: "6bc5cd09-359e-4c1f-b044-6dda8378ff82") : secret "prometheus-operator-tls" not found Apr 16 14:01:48.720804 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.720782 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6bc5cd09-359e-4c1f-b044-6dda8378ff82-metrics-client-ca\") pod \"prometheus-operator-78f957474d-cvgtl\" (UID: \"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:48.722639 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.722614 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6bc5cd09-359e-4c1f-b044-6dda8378ff82-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-cvgtl\" (UID: \"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:48.736829 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:48.736804 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w782\" (UniqueName: \"kubernetes.io/projected/6bc5cd09-359e-4c1f-b044-6dda8378ff82-kube-api-access-8w782\") pod \"prometheus-operator-78f957474d-cvgtl\" (UID: \"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:49.223044 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:49.223012 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6bc5cd09-359e-4c1f-b044-6dda8378ff82-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-cvgtl\" (UID: 
\"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:49.225382 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:49.225365 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6bc5cd09-359e-4c1f-b044-6dda8378ff82-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-cvgtl\" (UID: \"6bc5cd09-359e-4c1f-b044-6dda8378ff82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:49.415578 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:49.415544 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" Apr 16 14:01:49.531133 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:49.531101 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-cvgtl"] Apr 16 14:01:49.533828 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:49.533802 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bc5cd09_359e_4c1f_b044_6dda8378ff82.slice/crio-bfdd171ed3c560bf90045dacf96e0380c2daed2250f016b56345e2c5fe9494f8 WatchSource:0}: Error finding container bfdd171ed3c560bf90045dacf96e0380c2daed2250f016b56345e2c5fe9494f8: Status 404 returned error can't find the container with id bfdd171ed3c560bf90045dacf96e0380c2daed2250f016b56345e2c5fe9494f8 Apr 16 14:01:50.224315 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:50.224275 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" event={"ID":"6bc5cd09-359e-4c1f-b044-6dda8378ff82","Type":"ContainerStarted","Data":"bfdd171ed3c560bf90045dacf96e0380c2daed2250f016b56345e2c5fe9494f8"} Apr 16 14:01:51.228294 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:51.228263 2564 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" event={"ID":"6bc5cd09-359e-4c1f-b044-6dda8378ff82","Type":"ContainerStarted","Data":"aae382a9094077067ff0e5947113fca80d98a11360d8c6b3ba0d524fe7c8d55d"} Apr 16 14:01:51.228294 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:51.228298 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" event={"ID":"6bc5cd09-359e-4c1f-b044-6dda8378ff82","Type":"ContainerStarted","Data":"1d0da6f985bbac389fa9259fa98467e184286503b75b4a12e2c462f88c9a1b8c"} Apr 16 14:01:51.247372 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:51.247326 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-cvgtl" podStartSLOduration=1.962185112 podStartE2EDuration="3.24731116s" podCreationTimestamp="2026-04-16 14:01:48 +0000 UTC" firstStartedPulling="2026-04-16 14:01:49.535560168 +0000 UTC m=+146.354925511" lastFinishedPulling="2026-04-16 14:01:50.82068623 +0000 UTC m=+147.640051559" observedRunningTime="2026-04-16 14:01:51.245949866 +0000 UTC m=+148.065315230" watchObservedRunningTime="2026-04-16 14:01:51.24731116 +0000 UTC m=+148.066676508" Apr 16 14:01:52.891347 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.891314 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-2vv24"] Apr 16 14:01:52.927553 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.927474 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-2vv24"] Apr 16 14:01:52.927725 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.927570 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-l8v8h"] Apr 16 14:01:52.927725 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.927650 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:52.930055 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.930030 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-p7gqx\"" Apr 16 14:01:52.931039 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.931007 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 14:01:52.931039 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.931037 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 14:01:52.931216 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.931039 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 14:01:52.939970 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.939932 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:52.942516 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.942479 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:01:52.942636 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.942567 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qj4mz\"" Apr 16 14:01:52.942693 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.942567 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:01:52.942785 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:52.942769 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:01:53.056136 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056105 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7df76db3-990c-4985-99c1-43587fe5e0c6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.056136 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056142 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-textfile\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.056379 ip-10-0-140-244 kubenswrapper[2564]: I0416 
14:01:53.056165 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-accelerators-collector-config\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.056379 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056191 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l5rj\" (UniqueName: \"kubernetes.io/projected/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-kube-api-access-8l5rj\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.056379 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056289 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7df76db3-990c-4985-99c1-43587fe5e0c6-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.056379 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056324 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-wtmp\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.056379 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056363 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/7df76db3-990c-4985-99c1-43587fe5e0c6-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.056614 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056394 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-root\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.056614 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056444 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7df76db3-990c-4985-99c1-43587fe5e0c6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.056614 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056493 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlr24\" (UniqueName: \"kubernetes.io/projected/7df76db3-990c-4985-99c1-43587fe5e0c6-kube-api-access-xlr24\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.056614 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056549 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l8v8h\" (UID: 
\"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.056614 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056586 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-sys\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.056614 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056607 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-metrics-client-ca\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.056822 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056626 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7df76db3-990c-4985-99c1-43587fe5e0c6-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.056822 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.056644 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-tls\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157060 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.156961 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" 
(UniqueName: \"kubernetes.io/empty-dir/7df76db3-990c-4985-99c1-43587fe5e0c6-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.157060 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157004 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-tls\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157060 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157038 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7df76db3-990c-4985-99c1-43587fe5e0c6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.157376 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157069 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-textfile\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157376 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157091 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-accelerators-collector-config\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " 
pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157376 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157120 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l5rj\" (UniqueName: \"kubernetes.io/projected/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-kube-api-access-8l5rj\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157376 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157145 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7df76db3-990c-4985-99c1-43587fe5e0c6-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.157376 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157167 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-wtmp\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157376 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157357 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-wtmp\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157653 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157415 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7df76db3-990c-4985-99c1-43587fe5e0c6-kube-state-metrics-tls\") 
pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.157653 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157440 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7df76db3-990c-4985-99c1-43587fe5e0c6-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.157653 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157457 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-root\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157653 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157548 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-root\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157653 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157578 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7df76db3-990c-4985-99c1-43587fe5e0c6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.157653 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157633 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xlr24\" (UniqueName: \"kubernetes.io/projected/7df76db3-990c-4985-99c1-43587fe5e0c6-kube-api-access-xlr24\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.157934 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157681 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157934 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157713 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-sys\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157934 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157744 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-metrics-client-ca\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157934 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157757 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-textfile\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157934 ip-10-0-140-244 kubenswrapper[2564]: I0416 
14:01:53.157853 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-accelerators-collector-config\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157934 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157888 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-sys\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.157934 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.157904 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7df76db3-990c-4985-99c1-43587fe5e0c6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.158303 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.158281 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-metrics-client-ca\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.158556 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.158532 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7df76db3-990c-4985-99c1-43587fe5e0c6-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: 
\"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.159966 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.159948 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7df76db3-990c-4985-99c1-43587fe5e0c6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.160098 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.160077 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7df76db3-990c-4985-99c1-43587fe5e0c6-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.160245 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.160229 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.160351 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.160335 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-node-exporter-tls\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.165031 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.165012 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-xlr24\" (UniqueName: \"kubernetes.io/projected/7df76db3-990c-4985-99c1-43587fe5e0c6-kube-api-access-xlr24\") pod \"kube-state-metrics-7479c89684-2vv24\" (UID: \"7df76db3-990c-4985-99c1-43587fe5e0c6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.165346 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.165328 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l5rj\" (UniqueName: \"kubernetes.io/projected/406c2eef-481c-4ba1-8a6d-dd2cf2607a8d-kube-api-access-8l5rj\") pod \"node-exporter-l8v8h\" (UID: \"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d\") " pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.237798 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.237767 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" Apr 16 14:01:53.250546 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.250505 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-l8v8h" Apr 16 14:01:53.259861 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:53.259822 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406c2eef_481c_4ba1_8a6d_dd2cf2607a8d.slice/crio-a85b88d51ccabacb9daaee38ec5e872de9ccb29e806099eea8242dbdd949f424 WatchSource:0}: Error finding container a85b88d51ccabacb9daaee38ec5e872de9ccb29e806099eea8242dbdd949f424: Status 404 returned error can't find the container with id a85b88d51ccabacb9daaee38ec5e872de9ccb29e806099eea8242dbdd949f424 Apr 16 14:01:53.370185 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.370146 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-2vv24"] Apr 16 14:01:53.374530 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:53.374503 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7df76db3_990c_4985_99c1_43587fe5e0c6.slice/crio-65ea56c3e7acc55cb0dc3060bd97a7aedf0af388b31ba7812cad9b4119d95871 WatchSource:0}: Error finding container 65ea56c3e7acc55cb0dc3060bd97a7aedf0af388b31ba7812cad9b4119d95871: Status 404 returned error can't find the container with id 65ea56c3e7acc55cb0dc3060bd97a7aedf0af388b31ba7812cad9b4119d95871 Apr 16 14:01:53.988231 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.988174 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:01:53.993258 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.993227 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:53.995954 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.995771 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 14:01:53.995954 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.995832 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-n8jll\"" Apr 16 14:01:53.995954 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.995856 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 14:01:53.996228 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.996137 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 14:01:53.996228 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.996154 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 14:01:53.996228 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.996185 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 14:01:53.996386 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.996327 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 14:01:53.996582 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.996567 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 14:01:53.996735 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.996717 2564 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 14:01:53.997185 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:53.996997 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 14:01:54.010966 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.010937 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:01:54.066643 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.066603 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/daf6c5cc-575a-4930-8027-5139ac16f669-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.066810 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.066651 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/daf6c5cc-575a-4930-8027-5139ac16f669-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.066810 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.066695 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-web-config\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.066810 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.066726 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.066810 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.066752 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdxkr\" (UniqueName: \"kubernetes.io/projected/daf6c5cc-575a-4930-8027-5139ac16f669-kube-api-access-qdxkr\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.067028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.066828 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/daf6c5cc-575a-4930-8027-5139ac16f669-tls-assets\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.067028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.066870 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.067028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.066896 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-config-volume\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.067028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.066932 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daf6c5cc-575a-4930-8027-5139ac16f669-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.067028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.066966 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.067280 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.067059 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.067280 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.067115 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.067280 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.067151 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/daf6c5cc-575a-4930-8027-5139ac16f669-config-out\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168187 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168148 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/daf6c5cc-575a-4930-8027-5139ac16f669-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168319 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168197 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/daf6c5cc-575a-4930-8027-5139ac16f669-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168319 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168264 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-web-config\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168319 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168296 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168481 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168322 2564 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qdxkr\" (UniqueName: \"kubernetes.io/projected/daf6c5cc-575a-4930-8027-5139ac16f669-kube-api-access-qdxkr\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168481 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168370 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/daf6c5cc-575a-4930-8027-5139ac16f669-tls-assets\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168481 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168393 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168481 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168424 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-config-volume\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168481 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168463 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daf6c5cc-575a-4930-8027-5139ac16f669-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168705 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168496 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168705 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168545 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168705 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168575 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168705 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168584 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/daf6c5cc-575a-4930-8027-5139ac16f669-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.168705 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.168609 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/daf6c5cc-575a-4930-8027-5139ac16f669-config-out\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.169671 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:54.169650 2564 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 14:01:54.169776 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:54.169713 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-main-tls podName:daf6c5cc-575a-4930-8027-5139ac16f669 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:54.66969492 +0000 UTC m=+151.489060250 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669") : secret "alertmanager-main-tls" not found Apr 16 14:01:54.170891 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.170868 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/daf6c5cc-575a-4930-8027-5139ac16f669-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.171407 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.171361 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daf6c5cc-575a-4930-8027-5139ac16f669-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.172433 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.172374 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-config-volume\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.172530 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.172487 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.172530 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.172508 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.172886 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.172844 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/daf6c5cc-575a-4930-8027-5139ac16f669-config-out\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.173044 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.173022 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-web-config\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.173646 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.173618 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.174006 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.173950 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.174430 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.174375 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/daf6c5cc-575a-4930-8027-5139ac16f669-tls-assets\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.177947 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.177920 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdxkr\" (UniqueName: \"kubernetes.io/projected/daf6c5cc-575a-4930-8027-5139ac16f669-kube-api-access-qdxkr\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.238007 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.237976 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l8v8h" event={"ID":"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d","Type":"ContainerStarted","Data":"fa5f03d767ed01ecf2a53facbdd15131fa8746dc8856499fe45790f9ef8e2baf"} Apr 16 14:01:54.238149 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.238019 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-l8v8h" event={"ID":"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d","Type":"ContainerStarted","Data":"a85b88d51ccabacb9daaee38ec5e872de9ccb29e806099eea8242dbdd949f424"} Apr 16 14:01:54.239104 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.239039 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" event={"ID":"7df76db3-990c-4985-99c1-43587fe5e0c6","Type":"ContainerStarted","Data":"65ea56c3e7acc55cb0dc3060bd97a7aedf0af388b31ba7812cad9b4119d95871"} Apr 16 14:01:54.673793 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.673754 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.676105 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.676075 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:54.908241 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:54.908183 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:55.040986 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:55.040954 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:01:55.043338 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:55.043301 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaf6c5cc_575a_4930_8027_5139ac16f669.slice/crio-e85fc83371edf30065e8fa6a4dec8ea4acac343c276e27febbcae48383d13bbd WatchSource:0}: Error finding container e85fc83371edf30065e8fa6a4dec8ea4acac343c276e27febbcae48383d13bbd: Status 404 returned error can't find the container with id e85fc83371edf30065e8fa6a4dec8ea4acac343c276e27febbcae48383d13bbd Apr 16 14:01:55.243557 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:55.243464 2564 generic.go:358] "Generic (PLEG): container finished" podID="406c2eef-481c-4ba1-8a6d-dd2cf2607a8d" containerID="fa5f03d767ed01ecf2a53facbdd15131fa8746dc8856499fe45790f9ef8e2baf" exitCode=0 Apr 16 14:01:55.243557 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:55.243534 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l8v8h" event={"ID":"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d","Type":"ContainerDied","Data":"fa5f03d767ed01ecf2a53facbdd15131fa8746dc8856499fe45790f9ef8e2baf"} Apr 16 14:01:55.244761 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:55.244735 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerStarted","Data":"e85fc83371edf30065e8fa6a4dec8ea4acac343c276e27febbcae48383d13bbd"} Apr 16 14:01:56.250350 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:56.250247 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" 
event={"ID":"7df76db3-990c-4985-99c1-43587fe5e0c6","Type":"ContainerStarted","Data":"013b5c891849e56ed62ee1f5224b488be2d19d517d8a4f42889a2427d9425a9d"} Apr 16 14:01:56.250350 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:56.250303 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" event={"ID":"7df76db3-990c-4985-99c1-43587fe5e0c6","Type":"ContainerStarted","Data":"3c08999a475faa8d4a117439c0f6e3776ac0db124ea324a31639586565dbda96"} Apr 16 14:01:56.250350 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:56.250319 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" event={"ID":"7df76db3-990c-4985-99c1-43587fe5e0c6","Type":"ContainerStarted","Data":"461ad1bb0df10d020e86d19780279d8c8496bfe78c59a284aa4ea0bc75fd4b62"} Apr 16 14:01:56.252628 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:56.252600 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l8v8h" event={"ID":"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d","Type":"ContainerStarted","Data":"faf9cc6c12bdff778e16ef7ebb026d40bf0e92bb4965e4fe35dbd9c15c77f972"} Apr 16 14:01:56.252745 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:56.252636 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l8v8h" event={"ID":"406c2eef-481c-4ba1-8a6d-dd2cf2607a8d","Type":"ContainerStarted","Data":"be4fce4e884620032f3dd10077181fb283dd823482056970a1b486aecce7f8db"} Apr 16 14:01:56.272032 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:56.271965 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-2vv24" podStartSLOduration=2.00894166 podStartE2EDuration="4.271949077s" podCreationTimestamp="2026-04-16 14:01:52 +0000 UTC" firstStartedPulling="2026-04-16 14:01:53.376583738 +0000 UTC m=+150.195949068" lastFinishedPulling="2026-04-16 14:01:55.639591141 
+0000 UTC m=+152.458956485" observedRunningTime="2026-04-16 14:01:56.270972573 +0000 UTC m=+153.090337924" watchObservedRunningTime="2026-04-16 14:01:56.271949077 +0000 UTC m=+153.091314429" Apr 16 14:01:56.291344 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:56.291285 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-l8v8h" podStartSLOduration=3.403796841 podStartE2EDuration="4.291267625s" podCreationTimestamp="2026-04-16 14:01:52 +0000 UTC" firstStartedPulling="2026-04-16 14:01:53.261940266 +0000 UTC m=+150.081305596" lastFinishedPulling="2026-04-16 14:01:54.149411047 +0000 UTC m=+150.968776380" observedRunningTime="2026-04-16 14:01:56.289537007 +0000 UTC m=+153.108902359" watchObservedRunningTime="2026-04-16 14:01:56.291267625 +0000 UTC m=+153.110632982" Apr 16 14:01:57.256531 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.256499 2564 generic.go:358] "Generic (PLEG): container finished" podID="daf6c5cc-575a-4930-8027-5139ac16f669" containerID="c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397" exitCode=0 Apr 16 14:01:57.256919 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.256578 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerDied","Data":"c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397"} Apr 16 14:01:57.298257 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.298223 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7bd8c88865-rst2t"] Apr 16 14:01:57.301509 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.301492 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.304148 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.304114 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4eccm12uk61d9\"" Apr 16 14:01:57.304148 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.304131 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 14:01:57.304326 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.304195 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 14:01:57.304326 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.304226 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 14:01:57.304326 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.304297 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-zgfz6\"" Apr 16 14:01:57.304534 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.304518 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 14:01:57.309738 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.309719 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7bd8c88865-rst2t"] Apr 16 14:01:57.402371 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.402333 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7536256b-802a-4d61-bdb9-1dbff314c031-audit-log\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " 
pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.402575 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.402393 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7536256b-802a-4d61-bdb9-1dbff314c031-secret-metrics-server-tls\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.402575 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.402414 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7536256b-802a-4d61-bdb9-1dbff314c031-client-ca-bundle\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.402575 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.402494 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9q4m\" (UniqueName: \"kubernetes.io/projected/7536256b-802a-4d61-bdb9-1dbff314c031-kube-api-access-w9q4m\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.402575 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.402534 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7536256b-802a-4d61-bdb9-1dbff314c031-secret-metrics-server-client-certs\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.402731 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.402603 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7536256b-802a-4d61-bdb9-1dbff314c031-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.402731 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.402657 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7536256b-802a-4d61-bdb9-1dbff314c031-metrics-server-audit-profiles\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.503928 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.503888 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7536256b-802a-4d61-bdb9-1dbff314c031-metrics-server-audit-profiles\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.504083 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.503943 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7536256b-802a-4d61-bdb9-1dbff314c031-audit-log\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.504083 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.503990 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/7536256b-802a-4d61-bdb9-1dbff314c031-secret-metrics-server-tls\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.504083 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.504013 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7536256b-802a-4d61-bdb9-1dbff314c031-client-ca-bundle\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.504083 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.504058 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9q4m\" (UniqueName: \"kubernetes.io/projected/7536256b-802a-4d61-bdb9-1dbff314c031-kube-api-access-w9q4m\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.504347 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.504084 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7536256b-802a-4d61-bdb9-1dbff314c031-secret-metrics-server-client-certs\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.504347 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.504124 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7536256b-802a-4d61-bdb9-1dbff314c031-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " 
pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.504451 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.504410 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7536256b-802a-4d61-bdb9-1dbff314c031-audit-log\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.504808 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.504789 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7536256b-802a-4d61-bdb9-1dbff314c031-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.505036 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.505012 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7536256b-802a-4d61-bdb9-1dbff314c031-metrics-server-audit-profiles\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.506510 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.506491 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7536256b-802a-4d61-bdb9-1dbff314c031-client-ca-bundle\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.506606 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.506528 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/7536256b-802a-4d61-bdb9-1dbff314c031-secret-metrics-server-tls\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.506644 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.506602 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7536256b-802a-4d61-bdb9-1dbff314c031-secret-metrics-server-client-certs\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.512174 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.512156 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9q4m\" (UniqueName: \"kubernetes.io/projected/7536256b-802a-4d61-bdb9-1dbff314c031-kube-api-access-w9q4m\") pod \"metrics-server-7bd8c88865-rst2t\" (UID: \"7536256b-802a-4d61-bdb9-1dbff314c031\") " pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.613423 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.613379 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" Apr 16 14:01:57.665864 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.665827 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7"] Apr 16 14:01:57.674167 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.671355 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7" Apr 16 14:01:57.674535 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.674389 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-6vbq6\"" Apr 16 14:01:57.674801 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.674782 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 14:01:57.677142 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.677072 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7"] Apr 16 14:01:57.705959 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.705923 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ae97fff3-1d01-497a-b991-d35c0280495f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-fnms7\" (UID: \"ae97fff3-1d01-497a-b991-d35c0280495f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7" Apr 16 14:01:57.754662 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:57.754633 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7bd8c88865-rst2t"] Apr 16 14:01:57.758275 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:57.758181 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7536256b_802a_4d61_bdb9_1dbff314c031.slice/crio-b0b4e4e1214359032f2866908e3a50bcc1a6ac0a24ab5f0e084c72f08f2a650f WatchSource:0}: Error finding container b0b4e4e1214359032f2866908e3a50bcc1a6ac0a24ab5f0e084c72f08f2a650f: Status 404 returned error can't find the container with id b0b4e4e1214359032f2866908e3a50bcc1a6ac0a24ab5f0e084c72f08f2a650f Apr 16 14:01:57.807334 ip-10-0-140-244 kubenswrapper[2564]: I0416 
14:01:57.807291 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ae97fff3-1d01-497a-b991-d35c0280495f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-fnms7\" (UID: \"ae97fff3-1d01-497a-b991-d35c0280495f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7" Apr 16 14:01:57.807508 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:57.807430 2564 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 14:01:57.807569 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:57.807523 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae97fff3-1d01-497a-b991-d35c0280495f-monitoring-plugin-cert podName:ae97fff3-1d01-497a-b991-d35c0280495f nodeName:}" failed. No retries permitted until 2026-04-16 14:01:58.30750328 +0000 UTC m=+155.126868610 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/ae97fff3-1d01-497a-b991-d35c0280495f-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-fnms7" (UID: "ae97fff3-1d01-497a-b991-d35c0280495f") : secret "monitoring-plugin-cert" not found Apr 16 14:01:58.133117 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.133084 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-b956596f6-2b254"] Apr 16 14:01:58.136571 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.136551 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.139177 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.139145 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 14:01:58.139177 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.139163 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 14:01:58.139367 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.139257 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 14:01:58.139367 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.139263 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 14:01:58.139587 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.139570 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 14:01:58.139663 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.139603 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-qksmz\"" Apr 16 14:01:58.146017 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.145996 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 14:01:58.148957 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.148932 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-b956596f6-2b254"] Apr 16 14:01:58.211062 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.211025 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6b883d92-fa45-47b1-a3aa-7690646c7936-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.211233 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.211074 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b883d92-fa45-47b1-a3aa-7690646c7936-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.211233 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.211109 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b883d92-fa45-47b1-a3aa-7690646c7936-serving-certs-ca-bundle\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.211329 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.211229 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6b883d92-fa45-47b1-a3aa-7690646c7936-federate-client-tls\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.211329 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.211282 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdhk4\" (UniqueName: 
\"kubernetes.io/projected/6b883d92-fa45-47b1-a3aa-7690646c7936-kube-api-access-gdhk4\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.211329 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.211310 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6b883d92-fa45-47b1-a3aa-7690646c7936-telemeter-client-tls\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.211468 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.211339 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b883d92-fa45-47b1-a3aa-7690646c7936-metrics-client-ca\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.211542 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.211520 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6b883d92-fa45-47b1-a3aa-7690646c7936-secret-telemeter-client\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.261049 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.261008 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" event={"ID":"7536256b-802a-4d61-bdb9-1dbff314c031","Type":"ContainerStarted","Data":"b0b4e4e1214359032f2866908e3a50bcc1a6ac0a24ab5f0e084c72f08f2a650f"} Apr 16 14:01:58.312839 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:01:58.312804 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6b883d92-fa45-47b1-a3aa-7690646c7936-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.313037 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.312849 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b883d92-fa45-47b1-a3aa-7690646c7936-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.313037 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.312884 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b883d92-fa45-47b1-a3aa-7690646c7936-serving-certs-ca-bundle\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.313037 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.312921 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6b883d92-fa45-47b1-a3aa-7690646c7936-federate-client-tls\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.313037 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.312959 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdhk4\" (UniqueName: 
\"kubernetes.io/projected/6b883d92-fa45-47b1-a3aa-7690646c7936-kube-api-access-gdhk4\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.313037 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.312984 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6b883d92-fa45-47b1-a3aa-7690646c7936-telemeter-client-tls\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.313298 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.313184 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b883d92-fa45-47b1-a3aa-7690646c7936-metrics-client-ca\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.313351 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.313337 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ae97fff3-1d01-497a-b991-d35c0280495f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-fnms7\" (UID: \"ae97fff3-1d01-497a-b991-d35c0280495f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7" Apr 16 14:01:58.313407 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.313393 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6b883d92-fa45-47b1-a3aa-7690646c7936-secret-telemeter-client\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 
16 14:01:58.314036 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.314006 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b883d92-fa45-47b1-a3aa-7690646c7936-metrics-client-ca\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.314133 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.314100 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b883d92-fa45-47b1-a3aa-7690646c7936-serving-certs-ca-bundle\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.314194 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.314167 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b883d92-fa45-47b1-a3aa-7690646c7936-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.316179 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.316150 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6b883d92-fa45-47b1-a3aa-7690646c7936-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.316319 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.316186 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/ae97fff3-1d01-497a-b991-d35c0280495f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-fnms7\" (UID: \"ae97fff3-1d01-497a-b991-d35c0280495f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7" Apr 16 14:01:58.316394 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.316373 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6b883d92-fa45-47b1-a3aa-7690646c7936-federate-client-tls\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.316445 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.316405 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6b883d92-fa45-47b1-a3aa-7690646c7936-telemeter-client-tls\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.316847 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.316822 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6b883d92-fa45-47b1-a3aa-7690646c7936-secret-telemeter-client\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.322414 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.322380 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdhk4\" (UniqueName: \"kubernetes.io/projected/6b883d92-fa45-47b1-a3aa-7690646c7936-kube-api-access-gdhk4\") pod \"telemeter-client-b956596f6-2b254\" (UID: \"6b883d92-fa45-47b1-a3aa-7690646c7936\") " pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.447537 
ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.447445 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-b956596f6-2b254" Apr 16 14:01:58.587190 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.587166 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7" Apr 16 14:01:58.594061 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.594014 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-b956596f6-2b254"] Apr 16 14:01:58.599712 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:58.599673 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b883d92_fa45_47b1_a3aa_7690646c7936.slice/crio-69f85d4e4655a3111870102178c6befe9c38c1bb5c890745b6d4361a53a65bab WatchSource:0}: Error finding container 69f85d4e4655a3111870102178c6befe9c38c1bb5c890745b6d4361a53a65bab: Status 404 returned error can't find the container with id 69f85d4e4655a3111870102178c6befe9c38c1bb5c890745b6d4361a53a65bab Apr 16 14:01:58.716392 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:58.716368 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7"] Apr 16 14:01:58.726629 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:58.726600 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae97fff3_1d01_497a_b991_d35c0280495f.slice/crio-a9ca7d5caa0995b3aa02826da8abb0e28b3808b040df2903d5233c57145ecae9 WatchSource:0}: Error finding container a9ca7d5caa0995b3aa02826da8abb0e28b3808b040df2903d5233c57145ecae9: Status 404 returned error can't find the container with id a9ca7d5caa0995b3aa02826da8abb0e28b3808b040df2903d5233c57145ecae9 Apr 16 14:01:59.167580 ip-10-0-140-244 kubenswrapper[2564]: I0416 
14:01:59.167482 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:01:59.172249 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.172223 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.175193 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.175162 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 14:01:59.175346 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.175326 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 14:01:59.175389 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.175346 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ff96megmomqat\"" Apr 16 14:01:59.175588 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.175570 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 14:01:59.175849 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.175823 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 14:01:59.177028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.176368 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 14:01:59.177028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.176379 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 14:01:59.177028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.176369 2564 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-vwv82\"" Apr 16 14:01:59.177028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.176621 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 14:01:59.177028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.176651 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 14:01:59.177028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.176739 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 14:01:59.177028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.176843 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 14:01:59.179910 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.179890 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 14:01:59.185764 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.183530 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 14:01:59.191553 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.191513 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:01:59.221856 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.221828 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.221856 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.221867 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222066 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.221896 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222066 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.221919 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222066 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222012 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222259 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222104 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222259 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222147 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-web-config\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222259 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222178 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/57a9dd19-6abe-420f-9d15-7618336ffece-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222386 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222255 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222386 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222323 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222386 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:01:59.222378 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/57a9dd19-6abe-420f-9d15-7618336ffece-config-out\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222527 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222401 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222527 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222438 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222527 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222474 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222672 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222544 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-config\") pod 
\"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222672 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222572 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4r57\" (UniqueName: \"kubernetes.io/projected/57a9dd19-6abe-420f-9d15-7618336ffece-kube-api-access-b4r57\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222672 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222598 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.222672 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.222627 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.266147 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.265630 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b956596f6-2b254" event={"ID":"6b883d92-fa45-47b1-a3aa-7690646c7936","Type":"ContainerStarted","Data":"69f85d4e4655a3111870102178c6befe9c38c1bb5c890745b6d4361a53a65bab"} Apr 16 14:01:59.268845 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.268805 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerStarted","Data":"60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695"} Apr 16 14:01:59.268990 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.268858 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerStarted","Data":"9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083"} Apr 16 14:01:59.268990 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.268874 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerStarted","Data":"09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256"} Apr 16 14:01:59.268990 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.268888 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerStarted","Data":"78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad"} Apr 16 14:01:59.268990 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.268901 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerStarted","Data":"44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73"} Apr 16 14:01:59.269948 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.269923 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7" event={"ID":"ae97fff3-1d01-497a-b991-d35c0280495f","Type":"ContainerStarted","Data":"a9ca7d5caa0995b3aa02826da8abb0e28b3808b040df2903d5233c57145ecae9"} Apr 16 14:01:59.324139 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324090 2564 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324323 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324161 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324323 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324230 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-web-config\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324323 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324264 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/57a9dd19-6abe-420f-9d15-7618336ffece-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324323 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324298 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324531 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324327 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324531 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324362 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/57a9dd19-6abe-420f-9d15-7618336ffece-config-out\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324531 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324383 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324531 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324418 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324531 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324443 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
14:01:59.324531 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324462 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-config\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324531 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324476 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4r57\" (UniqueName: \"kubernetes.io/projected/57a9dd19-6abe-420f-9d15-7618336ffece-kube-api-access-b4r57\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324531 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324493 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324531 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324510 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324877 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324548 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324877 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324578 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324877 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324607 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.324877 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.324664 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.325665 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.325382 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.325665 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.325569 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.328083 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.325861 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.328083 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.326651 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.328584 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.328560 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.329650 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.329177 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.329650 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.329281 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/57a9dd19-6abe-420f-9d15-7618336ffece-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.329650 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.329581 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.329855 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.329787 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.331515 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.331467 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.331929 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.331711 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.332291 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:01:59.332225 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/57a9dd19-6abe-420f-9d15-7618336ffece-config-out\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.332651 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.332620 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.332651 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.332647 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.333045 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.333018 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.338468 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.333318 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-config\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.338468 ip-10-0-140-244 kubenswrapper[2564]: I0416 
14:01:59.333594 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-web-config\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.344746 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.344720 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4r57\" (UniqueName: \"kubernetes.io/projected/57a9dd19-6abe-420f-9d15-7618336ffece-kube-api-access-b4r57\") pod \"prometheus-k8s-0\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.490556 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.490462 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:59.579432 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:59.579375 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-gmqgq" podUID="99722c79-e7f0-4687-85aa-06ca3ad840a2" Apr 16 14:01:59.598787 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:01:59.598736 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-t7d2w" podUID="68dcca7e-5f78-4645-b5f2-2e8fe47395cc" Apr 16 14:01:59.920063 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:01:59.919990 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:01:59.923890 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:01:59.923861 2564 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a9dd19_6abe_420f_9d15_7618336ffece.slice/crio-7679439f1fb0a968e0a78584647989eabcd3a147ce2788d74e0884c7fccfca43 WatchSource:0}: Error finding container 7679439f1fb0a968e0a78584647989eabcd3a147ce2788d74e0884c7fccfca43: Status 404 returned error can't find the container with id 7679439f1fb0a968e0a78584647989eabcd3a147ce2788d74e0884c7fccfca43 Apr 16 14:02:00.276088 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:00.276028 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" event={"ID":"7536256b-802a-4d61-bdb9-1dbff314c031","Type":"ContainerStarted","Data":"7148079ae5a91c05381d8976adc7a7af2874f141601789fcfc4c7d95eb916b2a"} Apr 16 14:02:00.278406 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:00.278324 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerStarted","Data":"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5"} Apr 16 14:02:00.278406 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:00.278350 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t7d2w" Apr 16 14:02:00.278567 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:00.278361 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerStarted","Data":"7679439f1fb0a968e0a78584647989eabcd3a147ce2788d74e0884c7fccfca43"} Apr 16 14:02:00.278636 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:00.278572 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gmqgq" Apr 16 14:02:00.295443 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:00.294403 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t" podStartSLOduration=1.249252551 podStartE2EDuration="3.294383373s" podCreationTimestamp="2026-04-16 14:01:57 +0000 UTC" firstStartedPulling="2026-04-16 14:01:57.760686616 +0000 UTC m=+154.580051946" lastFinishedPulling="2026-04-16 14:01:59.805817434 +0000 UTC m=+156.625182768" observedRunningTime="2026-04-16 14:02:00.293086253 +0000 UTC m=+157.112451602" watchObservedRunningTime="2026-04-16 14:02:00.294383373 +0000 UTC m=+157.113748726" Apr 16 14:02:01.282803 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:01.282768 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7" event={"ID":"ae97fff3-1d01-497a-b991-d35c0280495f","Type":"ContainerStarted","Data":"8615798ecc8ecfac41bb2a053db85e0a6b54e6b088ab43789e043ad4b1f52cdc"} Apr 16 14:02:01.283271 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:01.282965 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7" Apr 16 14:02:01.284194 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:01.284167 2564 generic.go:358] "Generic (PLEG): container finished" podID="57a9dd19-6abe-420f-9d15-7618336ffece" containerID="2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5" exitCode=0 Apr 16 14:02:01.284320 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:01.284235 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerDied","Data":"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5"} Apr 16 14:02:01.286256 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:01.286231 2564 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b956596f6-2b254" event={"ID":"6b883d92-fa45-47b1-a3aa-7690646c7936","Type":"ContainerStarted","Data":"40a8953553e7208f90d3a37cbe9f6cfd62320ffd4614be2e6039c6517b593508"} Apr 16 14:02:01.286344 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:01.286263 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b956596f6-2b254" event={"ID":"6b883d92-fa45-47b1-a3aa-7690646c7936","Type":"ContainerStarted","Data":"4ef55277b1f5deeefd7e24271b71fdf7afe62f3ebdc5a5b74cc01f7dea202a3b"} Apr 16 14:02:01.286344 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:01.286278 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b956596f6-2b254" event={"ID":"6b883d92-fa45-47b1-a3aa-7690646c7936","Type":"ContainerStarted","Data":"9f5c2f975cb261d8454a2c06c53ce4b858fba1d4f992258709c8e16cff541e7d"} Apr 16 14:02:01.288698 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:01.288682 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7" Apr 16 14:02:01.289535 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:01.289517 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerStarted","Data":"7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5"} Apr 16 14:02:01.303711 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:01.303669 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-fnms7" podStartSLOduration=1.994828021 podStartE2EDuration="4.303656585s" podCreationTimestamp="2026-04-16 14:01:57 +0000 UTC" firstStartedPulling="2026-04-16 14:01:58.728342807 +0000 UTC m=+155.547708137" lastFinishedPulling="2026-04-16 14:02:01.037171366 +0000 UTC m=+157.856536701" 
observedRunningTime="2026-04-16 14:02:01.301907885 +0000 UTC m=+158.121273238" watchObservedRunningTime="2026-04-16 14:02:01.303656585 +0000 UTC m=+158.123021934" Apr 16 14:02:01.350609 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:01.350551 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.307419461 podStartE2EDuration="8.350534366s" podCreationTimestamp="2026-04-16 14:01:53 +0000 UTC" firstStartedPulling="2026-04-16 14:01:55.045445466 +0000 UTC m=+151.864810795" lastFinishedPulling="2026-04-16 14:02:01.088560361 +0000 UTC m=+157.907925700" observedRunningTime="2026-04-16 14:02:01.347796593 +0000 UTC m=+158.167161956" watchObservedRunningTime="2026-04-16 14:02:01.350534366 +0000 UTC m=+158.169899719" Apr 16 14:02:01.372237 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:01.372119 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-b956596f6-2b254" podStartSLOduration=0.936782544 podStartE2EDuration="3.372101466s" podCreationTimestamp="2026-04-16 14:01:58 +0000 UTC" firstStartedPulling="2026-04-16 14:01:58.601826449 +0000 UTC m=+155.421191779" lastFinishedPulling="2026-04-16 14:02:01.037145368 +0000 UTC m=+157.856510701" observedRunningTime="2026-04-16 14:02:01.371177792 +0000 UTC m=+158.190543148" watchObservedRunningTime="2026-04-16 14:02:01.372101466 +0000 UTC m=+158.191466819" Apr 16 14:02:04.304721 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:04.304679 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerStarted","Data":"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97"} Apr 16 14:02:04.304721 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:04.304731 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerStarted","Data":"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9"} Apr 16 14:02:04.474615 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:04.474527 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq" Apr 16 14:02:04.474615 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:04.474574 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w" Apr 16 14:02:04.477008 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:04.476989 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99722c79-e7f0-4687-85aa-06ca3ad840a2-metrics-tls\") pod \"dns-default-gmqgq\" (UID: \"99722c79-e7f0-4687-85aa-06ca3ad840a2\") " pod="openshift-dns/dns-default-gmqgq" Apr 16 14:02:04.477070 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:04.477023 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68dcca7e-5f78-4645-b5f2-2e8fe47395cc-cert\") pod \"ingress-canary-t7d2w\" (UID: \"68dcca7e-5f78-4645-b5f2-2e8fe47395cc\") " pod="openshift-ingress-canary/ingress-canary-t7d2w" Apr 16 14:02:04.482346 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:04.482331 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-r5ngv\"" Apr 16 14:02:04.482398 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:04.482331 2564 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-r2bfh\"" Apr 16 14:02:04.489918 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:04.489904 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t7d2w" Apr 16 14:02:04.490001 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:04.489987 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gmqgq" Apr 16 14:02:04.618248 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:04.618220 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gmqgq"] Apr 16 14:02:04.620316 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:02:04.620288 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99722c79_e7f0_4687_85aa_06ca3ad840a2.slice/crio-ab1354bff3687c7a306f5a051757fe9da2b594966c71957c2113ac0b34eb698e WatchSource:0}: Error finding container ab1354bff3687c7a306f5a051757fe9da2b594966c71957c2113ac0b34eb698e: Status 404 returned error can't find the container with id ab1354bff3687c7a306f5a051757fe9da2b594966c71957c2113ac0b34eb698e Apr 16 14:02:04.638926 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:04.638903 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t7d2w"] Apr 16 14:02:04.641235 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:02:04.641185 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68dcca7e_5f78_4645_b5f2_2e8fe47395cc.slice/crio-126fab163cc7b4f7cf66733b626fd6946f48713b1f1881b021a686ac8469cd65 WatchSource:0}: Error finding container 126fab163cc7b4f7cf66733b626fd6946f48713b1f1881b021a686ac8469cd65: Status 404 returned error can't find the container with id 126fab163cc7b4f7cf66733b626fd6946f48713b1f1881b021a686ac8469cd65 Apr 16 
14:02:05.310052 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:05.309992 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gmqgq" event={"ID":"99722c79-e7f0-4687-85aa-06ca3ad840a2","Type":"ContainerStarted","Data":"ab1354bff3687c7a306f5a051757fe9da2b594966c71957c2113ac0b34eb698e"}
Apr 16 14:02:05.311197 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:05.311165 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t7d2w" event={"ID":"68dcca7e-5f78-4645-b5f2-2e8fe47395cc","Type":"ContainerStarted","Data":"126fab163cc7b4f7cf66733b626fd6946f48713b1f1881b021a686ac8469cd65"}
Apr 16 14:02:07.318905 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:07.318866 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gmqgq" event={"ID":"99722c79-e7f0-4687-85aa-06ca3ad840a2","Type":"ContainerStarted","Data":"154e34604dfe3c83397881614ae00ff3c4d4af0989bd00a7970d9fd10fee936f"}
Apr 16 14:02:07.321194 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:07.321172 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerStarted","Data":"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4"}
Apr 16 14:02:07.321326 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:07.321198 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerStarted","Data":"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b"}
Apr 16 14:02:08.325952 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:08.325919 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t7d2w" event={"ID":"68dcca7e-5f78-4645-b5f2-2e8fe47395cc","Type":"ContainerStarted","Data":"28d73a5ec52e2d2b9eecb27644ca9dd6168ba3f2bc6ef86e851bc905956b28d4"}
Apr 16 14:02:08.327338 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:08.327310 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gmqgq" event={"ID":"99722c79-e7f0-4687-85aa-06ca3ad840a2","Type":"ContainerStarted","Data":"9230255bbd671eb4836305fd60fa9ad463889210103bcb9c3324ef2421cf989c"}
Apr 16 14:02:08.327456 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:08.327428 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gmqgq"
Apr 16 14:02:08.330124 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:08.330102 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerStarted","Data":"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506"}
Apr 16 14:02:08.330238 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:08.330129 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerStarted","Data":"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda"}
Apr 16 14:02:08.343639 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:08.343592 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-t7d2w" podStartSLOduration=129.607863909 podStartE2EDuration="2m12.343574108s" podCreationTimestamp="2026-04-16 13:59:56 +0000 UTC" firstStartedPulling="2026-04-16 14:02:04.642976287 +0000 UTC m=+161.462341617" lastFinishedPulling="2026-04-16 14:02:07.378686472 +0000 UTC m=+164.198051816" observedRunningTime="2026-04-16 14:02:08.342651232 +0000 UTC m=+165.162016598" watchObservedRunningTime="2026-04-16 14:02:08.343574108 +0000 UTC m=+165.162939461"
Apr 16 14:02:08.360311 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:08.360258 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gmqgq" podStartSLOduration=130.26919715 podStartE2EDuration="2m12.360240523s" podCreationTimestamp="2026-04-16 13:59:56 +0000 UTC" firstStartedPulling="2026-04-16 14:02:04.62214145 +0000 UTC m=+161.441506794" lastFinishedPulling="2026-04-16 14:02:06.713184825 +0000 UTC m=+163.532550167" observedRunningTime="2026-04-16 14:02:08.360051675 +0000 UTC m=+165.179417028" watchObservedRunningTime="2026-04-16 14:02:08.360240523 +0000 UTC m=+165.179605879"
Apr 16 14:02:08.386852 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:08.386800 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.953521057 podStartE2EDuration="9.386783397s" podCreationTimestamp="2026-04-16 14:01:59 +0000 UTC" firstStartedPulling="2026-04-16 14:02:01.285594039 +0000 UTC m=+158.104959384" lastFinishedPulling="2026-04-16 14:02:06.718856392 +0000 UTC m=+163.538221724" observedRunningTime="2026-04-16 14:02:08.385786956 +0000 UTC m=+165.205152319" watchObservedRunningTime="2026-04-16 14:02:08.386783397 +0000 UTC m=+165.206148750"
Apr 16 14:02:09.490799 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:09.490761 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:17.613982 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:17.613945 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t"
Apr 16 14:02:17.614462 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:17.613996 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t"
Apr 16 14:02:18.335027 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:18.334998 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gmqgq"
Apr 16 14:02:37.619610 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:37.619577 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t"
Apr 16 14:02:37.623367 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:37.623345 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7bd8c88865-rst2t"
Apr 16 14:02:39.427257 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:39.427220 2564 generic.go:358] "Generic (PLEG): container finished" podID="e4ab2e12-9693-40bc-9dd6-fc8e26d75737" containerID="392975e570554e3580557b99ba8f831ce8f90c3cc22afcf1b0918b3f09c76cd1" exitCode=0
Apr 16 14:02:39.427257 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:39.427234 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" event={"ID":"e4ab2e12-9693-40bc-9dd6-fc8e26d75737","Type":"ContainerDied","Data":"392975e570554e3580557b99ba8f831ce8f90c3cc22afcf1b0918b3f09c76cd1"}
Apr 16 14:02:39.427735 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:39.427586 2564 scope.go:117] "RemoveContainer" containerID="392975e570554e3580557b99ba8f831ce8f90c3cc22afcf1b0918b3f09c76cd1"
Apr 16 14:02:40.432116 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:40.432082 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bdglv" event={"ID":"e4ab2e12-9693-40bc-9dd6-fc8e26d75737","Type":"ContainerStarted","Data":"373e19bc44d7d24bb59863e86eb5d26260d3bf6d22fbbd234dfa7a06612f2b54"}
Apr 16 14:02:59.491079 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:59.491045 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:59.506592 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:02:59.506562 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:00.509474 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:00.509449 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:13.261266 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.261232 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:03:13.261811 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.261754 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="alertmanager" containerID="cri-o://44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73" gracePeriod=120
Apr 16 14:03:13.261943 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.261836 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="kube-rbac-proxy-metric" containerID="cri-o://60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695" gracePeriod=120
Apr 16 14:03:13.261943 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.261855 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="prom-label-proxy" containerID="cri-o://7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5" gracePeriod=120
Apr 16 14:03:13.261943 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.261908 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="kube-rbac-proxy" containerID="cri-o://9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083" gracePeriod=120
Apr 16 14:03:13.261943 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.261894 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="config-reloader" containerID="cri-o://78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad" gracePeriod=120
Apr 16 14:03:13.261943 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.261866 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="kube-rbac-proxy-web" containerID="cri-o://09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256" gracePeriod=120
Apr 16 14:03:13.532571 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.532488 2564 generic.go:358] "Generic (PLEG): container finished" podID="daf6c5cc-575a-4930-8027-5139ac16f669" containerID="7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5" exitCode=0
Apr 16 14:03:13.532571 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.532513 2564 generic.go:358] "Generic (PLEG): container finished" podID="daf6c5cc-575a-4930-8027-5139ac16f669" containerID="9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083" exitCode=0
Apr 16 14:03:13.532571 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.532519 2564 generic.go:358] "Generic (PLEG): container finished" podID="daf6c5cc-575a-4930-8027-5139ac16f669" containerID="78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad" exitCode=0
Apr 16 14:03:13.532571 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.532524 2564 generic.go:358] "Generic (PLEG): container finished" podID="daf6c5cc-575a-4930-8027-5139ac16f669" containerID="44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73" exitCode=0
Apr 16 14:03:13.532571 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.532548 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerDied","Data":"7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5"}
Apr 16 14:03:13.532571 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.532569 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerDied","Data":"9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083"}
Apr 16 14:03:13.532852 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.532579 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerDied","Data":"78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad"}
Apr 16 14:03:13.532852 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:13.532588 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerDied","Data":"44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73"}
Apr 16 14:03:14.516052 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.516028 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:14.537253 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.537224 2564 generic.go:358] "Generic (PLEG): container finished" podID="daf6c5cc-575a-4930-8027-5139ac16f669" containerID="60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695" exitCode=0
Apr 16 14:03:14.537253 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.537248 2564 generic.go:358] "Generic (PLEG): container finished" podID="daf6c5cc-575a-4930-8027-5139ac16f669" containerID="09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256" exitCode=0
Apr 16 14:03:14.537449 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.537307 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerDied","Data":"60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695"}
Apr 16 14:03:14.537449 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.537322 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:14.537449 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.537346 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerDied","Data":"09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256"}
Apr 16 14:03:14.537449 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.537358 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"daf6c5cc-575a-4930-8027-5139ac16f669","Type":"ContainerDied","Data":"e85fc83371edf30065e8fa6a4dec8ea4acac343c276e27febbcae48383d13bbd"}
Apr 16 14:03:14.537449 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.537373 2564 scope.go:117] "RemoveContainer" containerID="7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5"
Apr 16 14:03:14.545422 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.545382 2564 scope.go:117] "RemoveContainer" containerID="60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695"
Apr 16 14:03:14.555177 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.555160 2564 scope.go:117] "RemoveContainer" containerID="9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083"
Apr 16 14:03:14.564031 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.564009 2564 scope.go:117] "RemoveContainer" containerID="09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256"
Apr 16 14:03:14.570997 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.570977 2564 scope.go:117] "RemoveContainer" containerID="78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad"
Apr 16 14:03:14.576142 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.576119 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/daf6c5cc-575a-4930-8027-5139ac16f669-config-out\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.576268 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.576178 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-web-config\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.576268 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.576225 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-config-volume\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.576268 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.576250 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy-metric\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.576433 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.576276 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-main-tls\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.576433 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.576307 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdxkr\" (UniqueName: \"kubernetes.io/projected/daf6c5cc-575a-4930-8027-5139ac16f669-kube-api-access-qdxkr\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.576742 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/daf6c5cc-575a-4930-8027-5139ac16f669-metrics-client-ca\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.576786 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daf6c5cc-575a-4930-8027-5139ac16f669-alertmanager-trusted-ca-bundle\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.576844 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy-web\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.576882 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/daf6c5cc-575a-4930-8027-5139ac16f669-tls-assets\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.576918 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.576948 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-cluster-tls-config\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.577063 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/daf6c5cc-575a-4930-8027-5139ac16f669-alertmanager-main-db\") pod \"daf6c5cc-575a-4930-8027-5139ac16f669\" (UID: \"daf6c5cc-575a-4930-8027-5139ac16f669\") "
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.577272 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daf6c5cc-575a-4930-8027-5139ac16f669-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.577662 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daf6c5cc-575a-4930-8027-5139ac16f669-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.578434 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daf6c5cc-575a-4930-8027-5139ac16f669-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.578827 2564 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/daf6c5cc-575a-4930-8027-5139ac16f669-alertmanager-main-db\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.578860 2564 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/daf6c5cc-575a-4930-8027-5139ac16f669-metrics-client-ca\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:03:14.579660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.578879 2564 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daf6c5cc-575a-4930-8027-5139ac16f669-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:03:14.580587 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.580192 2564 scope.go:117] "RemoveContainer" containerID="44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73"
Apr 16 14:03:14.581532 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.581404 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daf6c5cc-575a-4930-8027-5139ac16f669-config-out" (OuterVolumeSpecName: "config-out") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:03:14.581532 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.581404 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:14.581532 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.581481 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-config-volume" (OuterVolumeSpecName: "config-volume") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:14.581532 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.581524 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:14.582250 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.582118 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:14.582250 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.582188 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:14.582408 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.582389 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daf6c5cc-575a-4930-8027-5139ac16f669-kube-api-access-qdxkr" (OuterVolumeSpecName: "kube-api-access-qdxkr") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "kube-api-access-qdxkr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:03:14.583048 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.583024 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daf6c5cc-575a-4930-8027-5139ac16f669-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:03:14.586716 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.586693 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:14.593227 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.593182 2564 scope.go:117] "RemoveContainer" containerID="c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397"
Apr 16 14:03:14.593929 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.593904 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-web-config" (OuterVolumeSpecName: "web-config") pod "daf6c5cc-575a-4930-8027-5139ac16f669" (UID: "daf6c5cc-575a-4930-8027-5139ac16f669"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:14.599947 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.599931 2564 scope.go:117] "RemoveContainer" containerID="7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5"
Apr 16 14:03:14.600175 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:14.600158 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5\": container with ID starting with 7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5 not found: ID does not exist" containerID="7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5"
Apr 16 14:03:14.600253 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.600184 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5"} err="failed to get container status \"7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5\": rpc error: code = NotFound desc = could not find container \"7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5\": container with ID starting with 7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5 not found: ID does not exist"
Apr 16 14:03:14.600253 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.600237 2564 scope.go:117] "RemoveContainer" containerID="60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695"
Apr 16 14:03:14.600485 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:14.600466 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695\": container with ID starting with 60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695 not found: ID does not exist" containerID="60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695"
Apr 16 14:03:14.600556 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.600491 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695"} err="failed to get container status \"60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695\": rpc error: code = NotFound desc = could not find container \"60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695\": container with ID starting with 60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695 not found: ID does not exist"
Apr 16 14:03:14.600556 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.600506 2564 scope.go:117] "RemoveContainer" containerID="9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083"
Apr 16 14:03:14.600754 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:14.600739 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083\": container with ID starting with 9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083 not found: ID does not exist" containerID="9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083"
Apr 16 14:03:14.600794 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.600758 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083"} err="failed to get container status \"9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083\": rpc error: code = NotFound desc = could not find container \"9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083\": container with ID starting with 9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083 not found: ID does not exist"
Apr 16 14:03:14.600794 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.600774 2564 scope.go:117] "RemoveContainer" containerID="09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256"
Apr 16 14:03:14.601011 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:14.600995 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256\": container with ID starting with 09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256 not found: ID does not exist" containerID="09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256"
Apr 16 14:03:14.601051 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.601018 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256"} err="failed to get container status \"09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256\": rpc error: code = NotFound desc = could not find container \"09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256\": container with ID starting with 09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256 not found: ID does not exist"
Apr 16 14:03:14.601051 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.601034 2564 scope.go:117] "RemoveContainer" containerID="78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad"
Apr 16 14:03:14.601285 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:14.601269 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad\": container with ID starting with 78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad not found: ID does not exist" containerID="78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad"
Apr 16 14:03:14.601332 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.601288 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad"} err="failed to get container status \"78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad\": rpc error: code = NotFound desc = could not find container \"78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad\": container with ID starting with 78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad not found: ID does not exist"
Apr 16 14:03:14.601332 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.601301 2564 scope.go:117] "RemoveContainer" containerID="44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73"
Apr 16 14:03:14.601533 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:14.601518 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73\": container with ID starting with 44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73 not found: ID does not exist" containerID="44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73"
Apr 16 14:03:14.601567 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.601538 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73"} err="failed to get container status \"44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73\": rpc error: code = NotFound desc = could not find container \"44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73\": container with ID starting with 44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73 not found: ID does not exist"
Apr 16 14:03:14.601567 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.601552 2564 scope.go:117] "RemoveContainer" containerID="c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397"
Apr 16 14:03:14.601791 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:14.601776 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397\": container with ID starting with c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397 not found: ID does not exist" containerID="c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397"
Apr 16 14:03:14.601830 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.601797 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397"} err="failed to get container status \"c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397\": rpc error: code = NotFound desc = could not find container \"c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397\": container with ID starting with c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397 not found: ID does not exist"
Apr 16 14:03:14.601830 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.601811 2564 scope.go:117] "RemoveContainer" containerID="7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5"
Apr 16 14:03:14.601993 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.601978 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5"} err="failed to get container status \"7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5\": rpc error: code = NotFound desc = could not find container \"7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5\": container with ID starting with 7093553f53edf38e4ee37089c25da273d44b26938d238d190df31dbf49eaa3e5 not found: ID does not
exist" Apr 16 14:03:14.602037 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.601995 2564 scope.go:117] "RemoveContainer" containerID="60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695" Apr 16 14:03:14.602197 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.602180 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695"} err="failed to get container status \"60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695\": rpc error: code = NotFound desc = could not find container \"60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695\": container with ID starting with 60b58130ad4a8fcbdcde599dec188fecbc210cec0179e717acbf4781effcb695 not found: ID does not exist" Apr 16 14:03:14.602246 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.602196 2564 scope.go:117] "RemoveContainer" containerID="9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083" Apr 16 14:03:14.602459 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.602438 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083"} err="failed to get container status \"9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083\": rpc error: code = NotFound desc = could not find container \"9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083\": container with ID starting with 9dcbb6ad35ffbfa59894e2ddb25eb30237376c132e9db5b5d64c52217c3c0083 not found: ID does not exist" Apr 16 14:03:14.602459 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.602459 2564 scope.go:117] "RemoveContainer" containerID="09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256" Apr 16 14:03:14.602691 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.602673 2564 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256"} err="failed to get container status \"09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256\": rpc error: code = NotFound desc = could not find container \"09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256\": container with ID starting with 09ccb5c20ad9cb6b1ec7bd4fa9bcb33a3f1a9e5af79af1a0c27b810d04ff7256 not found: ID does not exist" Apr 16 14:03:14.602754 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.602692 2564 scope.go:117] "RemoveContainer" containerID="78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad" Apr 16 14:03:14.602924 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.602906 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad"} err="failed to get container status \"78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad\": rpc error: code = NotFound desc = could not find container \"78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad\": container with ID starting with 78de9c0863612320402f798153a92641f7e08e634e62097dc85078ca48ac20ad not found: ID does not exist" Apr 16 14:03:14.602975 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.602923 2564 scope.go:117] "RemoveContainer" containerID="44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73" Apr 16 14:03:14.603143 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.603123 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73"} err="failed to get container status \"44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73\": rpc error: code = NotFound desc = could not find container \"44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73\": container with ID starting with 
44bb039586259c24d27088d89ba2d45e2f5c18b801528df422c68650e7052d73 not found: ID does not exist" Apr 16 14:03:14.603184 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.603145 2564 scope.go:117] "RemoveContainer" containerID="c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397" Apr 16 14:03:14.603365 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.603347 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397"} err="failed to get container status \"c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397\": rpc error: code = NotFound desc = could not find container \"c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397\": container with ID starting with c570bb25407a4ebb3cd28709c1f99016d27421fe5f96c30dc0d6feb4bf524397 not found: ID does not exist" Apr 16 14:03:14.680127 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.680079 2564 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-web-config\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:14.680127 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.680123 2564 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-config-volume\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:14.680127 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.680135 2564 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:14.680127 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.680145 2564 reconciler_common.go:299] "Volume 
detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-main-tls\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:14.680413 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.680157 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qdxkr\" (UniqueName: \"kubernetes.io/projected/daf6c5cc-575a-4930-8027-5139ac16f669-kube-api-access-qdxkr\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:14.680413 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.680167 2564 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:14.680413 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.680182 2564 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/daf6c5cc-575a-4930-8027-5139ac16f669-tls-assets\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:14.680413 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.680191 2564 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:14.680413 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.680225 2564 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/daf6c5cc-575a-4930-8027-5139ac16f669-cluster-tls-config\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:14.680413 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.680237 2564 reconciler_common.go:299] "Volume 
detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/daf6c5cc-575a-4930-8027-5139ac16f669-config-out\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:14.861092 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.861063 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:03:14.864955 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.864933 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:03:14.891782 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.891757 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:03:14.892091 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892075 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="alertmanager" Apr 16 14:03:14.892133 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892095 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="alertmanager" Apr 16 14:03:14.892133 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892106 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="kube-rbac-proxy" Apr 16 14:03:14.892133 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892113 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="kube-rbac-proxy" Apr 16 14:03:14.892133 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892120 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="prom-label-proxy" Apr 16 14:03:14.892133 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892126 2564 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="prom-label-proxy" Apr 16 14:03:14.892133 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892133 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="config-reloader" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892139 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="config-reloader" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892147 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="kube-rbac-proxy-web" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892152 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="kube-rbac-proxy-web" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892167 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="init-config-reloader" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892172 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="init-config-reloader" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892179 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="kube-rbac-proxy-metric" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892185 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="kube-rbac-proxy-metric" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892245 2564 
memory_manager.go:356] "RemoveStaleState removing state" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="prom-label-proxy" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892253 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="alertmanager" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892261 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="kube-rbac-proxy-metric" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892268 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="kube-rbac-proxy-web" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892274 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="config-reloader" Apr 16 14:03:14.892334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.892280 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" containerName="kube-rbac-proxy" Apr 16 14:03:14.898825 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.898808 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.901564 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.901542 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 14:03:14.901564 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.901552 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 14:03:14.901801 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.901547 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 14:03:14.901801 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.901598 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 14:03:14.901801 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.901611 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 14:03:14.901801 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.901680 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 14:03:14.901801 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.901789 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 14:03:14.902298 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.901789 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 14:03:14.902298 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.902174 2564 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-n8jll\"" Apr 16 14:03:14.908329 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.908308 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 14:03:14.908774 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.908755 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:03:14.982130 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982079 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5bb5adca-62c0-4503-a4ce-eac24a26da84-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.982130 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982137 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-web-config\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.982438 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982226 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.982438 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982280 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/5bb5adca-62c0-4503-a4ce-eac24a26da84-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.982438 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982319 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.982438 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982346 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.982438 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982369 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bb5adca-62c0-4503-a4ce-eac24a26da84-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.982438 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982397 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.982692 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982444 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5bb5adca-62c0-4503-a4ce-eac24a26da84-config-out\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.982692 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982466 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.982692 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982493 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5bb5adca-62c0-4503-a4ce-eac24a26da84-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.982692 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982556 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-config-volume\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:14.982692 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:14.982583 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5882\" (UniqueName: 
\"kubernetes.io/projected/5bb5adca-62c0-4503-a4ce-eac24a26da84-kube-api-access-p5882\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.083938 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.083889 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-web-config\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.083938 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.083943 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.084174 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.083964 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5bb5adca-62c0-4503-a4ce-eac24a26da84-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.084174 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.083988 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.084174 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.084005 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.084174 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.084125 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bb5adca-62c0-4503-a4ce-eac24a26da84-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.084174 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.084168 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.084582 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.084249 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5bb5adca-62c0-4503-a4ce-eac24a26da84-config-out\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.084582 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.084284 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.084582 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.084326 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5bb5adca-62c0-4503-a4ce-eac24a26da84-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.084582 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.084383 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-config-volume\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.084582 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.084405 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5882\" (UniqueName: \"kubernetes.io/projected/5bb5adca-62c0-4503-a4ce-eac24a26da84-kube-api-access-p5882\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.084582 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.084447 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5bb5adca-62c0-4503-a4ce-eac24a26da84-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.084582 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.084496 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5bb5adca-62c0-4503-a4ce-eac24a26da84-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.085059 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.085028 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bb5adca-62c0-4503-a4ce-eac24a26da84-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.085120 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.085095 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5bb5adca-62c0-4503-a4ce-eac24a26da84-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.087134 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.087103 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.087276 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.087148 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-web-config\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.087276 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.087183 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.087400 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.087280 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.087400 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.087329 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5bb5adca-62c0-4503-a4ce-eac24a26da84-config-out\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.087621 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.087600 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-config-volume\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.087727 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.087713 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5bb5adca-62c0-4503-a4ce-eac24a26da84-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.088116 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.088100 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.089132 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.089107 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5bb5adca-62c0-4503-a4ce-eac24a26da84-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.096341 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.096293 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5882\" (UniqueName: \"kubernetes.io/projected/5bb5adca-62c0-4503-a4ce-eac24a26da84-kube-api-access-p5882\") pod \"alertmanager-main-0\" (UID: \"5bb5adca-62c0-4503-a4ce-eac24a26da84\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.209860 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.209830 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:15.339251 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.339196 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:03:15.343156 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:03:15.343130 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bb5adca_62c0_4503_a4ce_eac24a26da84.slice/crio-8451a8011222fe48b9b1fbcaae3455793be34fdf39c62366799e51a66b8c47e6 WatchSource:0}: Error finding container 8451a8011222fe48b9b1fbcaae3455793be34fdf39c62366799e51a66b8c47e6: Status 404 returned error can't find the container with id 8451a8011222fe48b9b1fbcaae3455793be34fdf39c62366799e51a66b8c47e6 Apr 16 14:03:15.543006 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.542972 2564 generic.go:358] "Generic (PLEG): container finished" podID="5bb5adca-62c0-4503-a4ce-eac24a26da84" containerID="64c9d4071c5ac6da04dc2fdb79383e19e4b7abca81d042db3944613ba45584e6" exitCode=0 Apr 16 14:03:15.543374 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.543030 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5bb5adca-62c0-4503-a4ce-eac24a26da84","Type":"ContainerDied","Data":"64c9d4071c5ac6da04dc2fdb79383e19e4b7abca81d042db3944613ba45584e6"} Apr 16 14:03:15.543374 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.543053 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5bb5adca-62c0-4503-a4ce-eac24a26da84","Type":"ContainerStarted","Data":"8451a8011222fe48b9b1fbcaae3455793be34fdf39c62366799e51a66b8c47e6"} Apr 16 14:03:15.776001 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:15.775973 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daf6c5cc-575a-4930-8027-5139ac16f669" 
path="/var/lib/kubelet/pods/daf6c5cc-575a-4930-8027-5139ac16f669/volumes" Apr 16 14:03:16.551469 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:16.551436 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5bb5adca-62c0-4503-a4ce-eac24a26da84","Type":"ContainerStarted","Data":"ca91b0600a075b3731dce131e072651434cc1da1d0d984e5ec3db7c4adcff675"} Apr 16 14:03:16.551469 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:16.551468 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5bb5adca-62c0-4503-a4ce-eac24a26da84","Type":"ContainerStarted","Data":"f752ec54bf480a5e42bbe0876f28957509da84267c4fd673a6a283056a55e15b"} Apr 16 14:03:16.551469 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:16.551478 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5bb5adca-62c0-4503-a4ce-eac24a26da84","Type":"ContainerStarted","Data":"6de4e9c8495c8d0dfecd51a011e9cd0ea195bdbb8688d212b1a745cfabf8a368"} Apr 16 14:03:16.552038 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:16.551488 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5bb5adca-62c0-4503-a4ce-eac24a26da84","Type":"ContainerStarted","Data":"b3a53d756c0eb4eac01bd5b3d3d70770b4b4f6d1d255561c2bb424ecb4a1f13d"} Apr 16 14:03:16.552038 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:16.551498 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5bb5adca-62c0-4503-a4ce-eac24a26da84","Type":"ContainerStarted","Data":"903a2c358e5008debf9b59ea403d6e38b2077a7480cefd4a0ab233f5438fd3bd"} Apr 16 14:03:16.552038 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:16.551506 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"5bb5adca-62c0-4503-a4ce-eac24a26da84","Type":"ContainerStarted","Data":"9d6fec2d7d41e33bf59710fddaaeb82e2287654682c0233932494d751cc3441e"} Apr 16 14:03:16.579694 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:16.579645 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.579628765 podStartE2EDuration="2.579628765s" podCreationTimestamp="2026-04-16 14:03:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:03:16.578299941 +0000 UTC m=+233.397665312" watchObservedRunningTime="2026-04-16 14:03:16.579628765 +0000 UTC m=+233.398994118" Apr 16 14:03:17.519940 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:17.519904 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:03:17.520577 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:17.520506 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="prometheus" containerID="cri-o://9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9" gracePeriod=600 Apr 16 14:03:17.520577 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:17.520564 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="kube-rbac-proxy-thanos" containerID="cri-o://60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506" gracePeriod=600 Apr 16 14:03:17.520769 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:17.520680 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="thanos-sidecar" 
containerID="cri-o://dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b" gracePeriod=600 Apr 16 14:03:17.520769 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:17.520715 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="kube-rbac-proxy" containerID="cri-o://82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda" gracePeriod=600 Apr 16 14:03:17.520879 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:17.520780 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="kube-rbac-proxy-web" containerID="cri-o://2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4" gracePeriod=600 Apr 16 14:03:17.520930 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:17.520906 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="config-reloader" containerID="cri-o://7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97" gracePeriod=600 Apr 16 14:03:17.887356 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:17.887330 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.011792 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.011759 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-config\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.011959 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.011800 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-serving-certs-ca-bundle\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.011959 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.011855 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-k8s-rulefiles-0\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.011959 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.011884 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-trusted-ca-bundle\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.011959 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.011906 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-thanos-prometheus-http-client-file\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: 
\"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.011959 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.011934 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-metrics-client-ca\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012254 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.011964 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-grpc-tls\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012254 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.011998 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/57a9dd19-6abe-420f-9d15-7618336ffece-tls-assets\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012254 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012026 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-metrics-client-certs\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012254 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012058 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4r57\" (UniqueName: \"kubernetes.io/projected/57a9dd19-6abe-420f-9d15-7618336ffece-kube-api-access-b4r57\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012254 
ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012089 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-web-config\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012254 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012120 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-kube-rbac-proxy\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012254 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012183 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-kubelet-serving-ca-bundle\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012254 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012227 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012608 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012266 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-k8s-db\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012608 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:03:18.012297 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-tls\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012608 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012330 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/57a9dd19-6abe-420f-9d15-7618336ffece-config-out\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012608 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012369 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"57a9dd19-6abe-420f-9d15-7618336ffece\" (UID: \"57a9dd19-6abe-420f-9d15-7618336ffece\") " Apr 16 14:03:18.012608 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012396 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:18.012840 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012631 2564 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.012840 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012775 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:18.012840 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.012783 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:18.013442 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.013411 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:18.014762 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.014732 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:18.015222 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.015170 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:03:18.015320 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.015279 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-config" (OuterVolumeSpecName: "config") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:18.015660 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.015613 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:18.015730 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.015696 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:18.016112 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.016074 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:18.016426 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.016405 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:18.016512 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.016460 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a9dd19-6abe-420f-9d15-7618336ffece-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:03:18.016940 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.016811 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:18.016940 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.016906 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a9dd19-6abe-420f-9d15-7618336ffece-kube-api-access-b4r57" (OuterVolumeSpecName: "kube-api-access-b4r57") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "kube-api-access-b4r57". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:03:18.017605 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.017574 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:18.017804 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.017782 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). 
InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:18.018159 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.018139 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a9dd19-6abe-420f-9d15-7618336ffece-config-out" (OuterVolumeSpecName: "config-out") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:03:18.027009 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.026987 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-web-config" (OuterVolumeSpecName: "web-config") pod "57a9dd19-6abe-420f-9d15-7618336ffece" (UID: "57a9dd19-6abe-420f-9d15-7618336ffece"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:18.113361 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113322 2564 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-trusted-ca-bundle\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113361 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113356 2564 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-thanos-prometheus-http-client-file\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113361 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113369 2564 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-metrics-client-ca\") on node 
\"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113379 2564 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-grpc-tls\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113388 2564 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/57a9dd19-6abe-420f-9d15-7618336ffece-tls-assets\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113397 2564 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-metrics-client-certs\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113406 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b4r57\" (UniqueName: \"kubernetes.io/projected/57a9dd19-6abe-420f-9d15-7618336ffece-kube-api-access-b4r57\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113415 2564 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-web-config\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113424 2564 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-kube-rbac-proxy\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:03:18.113432 2564 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113442 2564 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113451 2564 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-k8s-db\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113460 2564 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-tls\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113469 2564 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/57a9dd19-6abe-420f-9d15-7618336ffece-config-out\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113477 2564 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 
14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113487 2564 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/57a9dd19-6abe-420f-9d15-7618336ffece-config\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.113583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.113498 2564 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/57a9dd19-6abe-420f-9d15-7618336ffece-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:03:18.562075 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562035 2564 generic.go:358] "Generic (PLEG): container finished" podID="57a9dd19-6abe-420f-9d15-7618336ffece" containerID="60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506" exitCode=0 Apr 16 14:03:18.562075 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562063 2564 generic.go:358] "Generic (PLEG): container finished" podID="57a9dd19-6abe-420f-9d15-7618336ffece" containerID="82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda" exitCode=0 Apr 16 14:03:18.562075 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562072 2564 generic.go:358] "Generic (PLEG): container finished" podID="57a9dd19-6abe-420f-9d15-7618336ffece" containerID="2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4" exitCode=0 Apr 16 14:03:18.562075 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562078 2564 generic.go:358] "Generic (PLEG): container finished" podID="57a9dd19-6abe-420f-9d15-7618336ffece" containerID="dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b" exitCode=0 Apr 16 14:03:18.562075 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562083 2564 generic.go:358] "Generic (PLEG): container finished" podID="57a9dd19-6abe-420f-9d15-7618336ffece" containerID="7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97" exitCode=0 
Apr 16 14:03:18.562075 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562088 2564 generic.go:358] "Generic (PLEG): container finished" podID="57a9dd19-6abe-420f-9d15-7618336ffece" containerID="9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9" exitCode=0 Apr 16 14:03:18.562400 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562124 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerDied","Data":"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506"} Apr 16 14:03:18.562400 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562156 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.562400 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562172 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerDied","Data":"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda"} Apr 16 14:03:18.562400 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562188 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerDied","Data":"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4"} Apr 16 14:03:18.562400 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562225 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerDied","Data":"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b"} Apr 16 14:03:18.562400 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562233 2564 scope.go:117] "RemoveContainer" 
containerID="60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506" Apr 16 14:03:18.562400 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562240 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerDied","Data":"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97"} Apr 16 14:03:18.562400 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562254 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerDied","Data":"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9"} Apr 16 14:03:18.562400 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.562267 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57a9dd19-6abe-420f-9d15-7618336ffece","Type":"ContainerDied","Data":"7679439f1fb0a968e0a78584647989eabcd3a147ce2788d74e0884c7fccfca43"} Apr 16 14:03:18.571896 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.571871 2564 scope.go:117] "RemoveContainer" containerID="82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda" Apr 16 14:03:18.579047 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.579030 2564 scope.go:117] "RemoveContainer" containerID="2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4" Apr 16 14:03:18.585614 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.585595 2564 scope.go:117] "RemoveContainer" containerID="dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b" Apr 16 14:03:18.589622 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.589596 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:03:18.593055 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.593037 2564 scope.go:117] "RemoveContainer" 
containerID="7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97" Apr 16 14:03:18.593166 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.593140 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:03:18.600072 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.600052 2564 scope.go:117] "RemoveContainer" containerID="9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9" Apr 16 14:03:18.607113 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.607084 2564 scope.go:117] "RemoveContainer" containerID="2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5" Apr 16 14:03:18.613933 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.613911 2564 scope.go:117] "RemoveContainer" containerID="60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506" Apr 16 14:03:18.614198 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:18.614179 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": container with ID starting with 60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506 not found: ID does not exist" containerID="60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506" Apr 16 14:03:18.614314 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.614292 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506"} err="failed to get container status \"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": rpc error: code = NotFound desc = could not find container \"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": container with ID starting with 60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506 not found: ID does not exist" Apr 16 14:03:18.614383 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:03:18.614316 2564 scope.go:117] "RemoveContainer" containerID="82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda" Apr 16 14:03:18.614569 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:18.614551 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": container with ID starting with 82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda not found: ID does not exist" containerID="82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda" Apr 16 14:03:18.614607 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.614576 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda"} err="failed to get container status \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": rpc error: code = NotFound desc = could not find container \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": container with ID starting with 82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda not found: ID does not exist" Apr 16 14:03:18.614607 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.614592 2564 scope.go:117] "RemoveContainer" containerID="2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4" Apr 16 14:03:18.614810 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:18.614786 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": container with ID starting with 2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4 not found: ID does not exist" containerID="2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4" Apr 16 14:03:18.614855 ip-10-0-140-244 kubenswrapper[2564]: 
I0416 14:03:18.614818 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4"} err="failed to get container status \"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": rpc error: code = NotFound desc = could not find container \"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": container with ID starting with 2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4 not found: ID does not exist" Apr 16 14:03:18.614855 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.614831 2564 scope.go:117] "RemoveContainer" containerID="dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b" Apr 16 14:03:18.615071 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:18.615050 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": container with ID starting with dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b not found: ID does not exist" containerID="dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b" Apr 16 14:03:18.615139 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.615081 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b"} err="failed to get container status \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": rpc error: code = NotFound desc = could not find container \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": container with ID starting with dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b not found: ID does not exist" Apr 16 14:03:18.615139 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.615104 2564 scope.go:117] "RemoveContainer" 
containerID="7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97" Apr 16 14:03:18.615509 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:18.615408 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": container with ID starting with 7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97 not found: ID does not exist" containerID="7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97" Apr 16 14:03:18.615509 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.615457 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97"} err="failed to get container status \"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": rpc error: code = NotFound desc = could not find container \"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": container with ID starting with 7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97 not found: ID does not exist" Apr 16 14:03:18.615509 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.615478 2564 scope.go:117] "RemoveContainer" containerID="9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9" Apr 16 14:03:18.615980 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:18.615901 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": container with ID starting with 9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9 not found: ID does not exist" containerID="9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9" Apr 16 14:03:18.615980 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.615938 2564 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9"} err="failed to get container status \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": rpc error: code = NotFound desc = could not find container \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": container with ID starting with 9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9 not found: ID does not exist" Apr 16 14:03:18.615980 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.615956 2564 scope.go:117] "RemoveContainer" containerID="2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5" Apr 16 14:03:18.616268 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:03:18.616219 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": container with ID starting with 2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5 not found: ID does not exist" containerID="2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5" Apr 16 14:03:18.616268 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.616260 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5"} err="failed to get container status \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": rpc error: code = NotFound desc = could not find container \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": container with ID starting with 2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5 not found: ID does not exist" Apr 16 14:03:18.616406 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.616281 2564 scope.go:117] "RemoveContainer" containerID="60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506" Apr 16 14:03:18.616574 
ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.616557 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506"} err="failed to get container status \"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": rpc error: code = NotFound desc = could not find container \"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": container with ID starting with 60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506 not found: ID does not exist" Apr 16 14:03:18.616638 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.616582 2564 scope.go:117] "RemoveContainer" containerID="82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda" Apr 16 14:03:18.616919 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.616870 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda"} err="failed to get container status \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": rpc error: code = NotFound desc = could not find container \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": container with ID starting with 82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda not found: ID does not exist" Apr 16 14:03:18.616919 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.616911 2564 scope.go:117] "RemoveContainer" containerID="2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4" Apr 16 14:03:18.617147 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.617125 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4"} err="failed to get container status \"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": rpc error: code = NotFound desc = could not 
find container \"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": container with ID starting with 2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4 not found: ID does not exist" Apr 16 14:03:18.617252 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.617148 2564 scope.go:117] "RemoveContainer" containerID="dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b" Apr 16 14:03:18.617434 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.617411 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b"} err="failed to get container status \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": rpc error: code = NotFound desc = could not find container \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": container with ID starting with dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b not found: ID does not exist" Apr 16 14:03:18.617489 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.617439 2564 scope.go:117] "RemoveContainer" containerID="7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97" Apr 16 14:03:18.617697 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.617672 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97"} err="failed to get container status \"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": rpc error: code = NotFound desc = could not find container \"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": container with ID starting with 7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97 not found: ID does not exist" Apr 16 14:03:18.617697 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.617697 2564 scope.go:117] "RemoveContainer" 
containerID="9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9" Apr 16 14:03:18.617865 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.617842 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:03:18.617942 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.617923 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9"} err="failed to get container status \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": rpc error: code = NotFound desc = could not find container \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": container with ID starting with 9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9 not found: ID does not exist" Apr 16 14:03:18.617987 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.617942 2564 scope.go:117] "RemoveContainer" containerID="2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5" Apr 16 14:03:18.618158 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618143 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5"} err="failed to get container status \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": rpc error: code = NotFound desc = could not find container \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": container with ID starting with 2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5 not found: ID does not exist" Apr 16 14:03:18.618158 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618158 2564 scope.go:117] "RemoveContainer" containerID="60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506" Apr 16 14:03:18.618268 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618216 2564 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="prometheus" Apr 16 14:03:18.618268 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618233 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="prometheus" Apr 16 14:03:18.618268 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618248 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="kube-rbac-proxy" Apr 16 14:03:18.618268 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618257 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="kube-rbac-proxy" Apr 16 14:03:18.618268 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618267 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="thanos-sidecar" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618275 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="thanos-sidecar" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618286 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="kube-rbac-proxy-web" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618294 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="kube-rbac-proxy-web" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618306 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="config-reloader" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 
14:03:18.618314 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="config-reloader" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618334 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="kube-rbac-proxy-thanos" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618343 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="kube-rbac-proxy-thanos" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618352 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="init-config-reloader" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618357 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="init-config-reloader" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618387 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506"} err="failed to get container status \"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": rpc error: code = NotFound desc = could not find container \"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": container with ID starting with 60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506 not found: ID does not exist" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618416 2564 scope.go:117] "RemoveContainer" containerID="82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618422 2564 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="config-reloader" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618435 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="kube-rbac-proxy-thanos" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618447 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="prometheus" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618459 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="kube-rbac-proxy" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618469 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="thanos-sidecar" Apr 16 14:03:18.618473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618479 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" containerName="kube-rbac-proxy-web" Apr 16 14:03:18.619123 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618633 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda"} err="failed to get container status \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": rpc error: code = NotFound desc = could not find container \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": container with ID starting with 82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda not found: ID does not exist" Apr 16 14:03:18.619123 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618652 2564 scope.go:117] "RemoveContainer" 
containerID="2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4" Apr 16 14:03:18.619123 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618828 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4"} err="failed to get container status \"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": rpc error: code = NotFound desc = could not find container \"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": container with ID starting with 2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4 not found: ID does not exist" Apr 16 14:03:18.619123 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.618842 2564 scope.go:117] "RemoveContainer" containerID="dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b" Apr 16 14:03:18.619123 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.619016 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b"} err="failed to get container status \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": rpc error: code = NotFound desc = could not find container \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": container with ID starting with dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b not found: ID does not exist" Apr 16 14:03:18.619123 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.619037 2564 scope.go:117] "RemoveContainer" containerID="7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97" Apr 16 14:03:18.619349 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.619249 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97"} err="failed to get container status 
\"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": rpc error: code = NotFound desc = could not find container \"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": container with ID starting with 7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97 not found: ID does not exist" Apr 16 14:03:18.619349 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.619261 2564 scope.go:117] "RemoveContainer" containerID="9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9" Apr 16 14:03:18.619757 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.619734 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9"} err="failed to get container status \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": rpc error: code = NotFound desc = could not find container \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": container with ID starting with 9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9 not found: ID does not exist" Apr 16 14:03:18.619757 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.619756 2564 scope.go:117] "RemoveContainer" containerID="2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5" Apr 16 14:03:18.620015 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.619992 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5"} err="failed to get container status \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": rpc error: code = NotFound desc = could not find container \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": container with ID starting with 2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5 not found: ID does not exist" Apr 16 14:03:18.620015 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:03:18.620014 2564 scope.go:117] "RemoveContainer" containerID="60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506" Apr 16 14:03:18.620192 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.620174 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506"} err="failed to get container status \"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": rpc error: code = NotFound desc = could not find container \"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": container with ID starting with 60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506 not found: ID does not exist" Apr 16 14:03:18.620192 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.620192 2564 scope.go:117] "RemoveContainer" containerID="82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda" Apr 16 14:03:18.620423 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.620406 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda"} err="failed to get container status \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": rpc error: code = NotFound desc = could not find container \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": container with ID starting with 82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda not found: ID does not exist" Apr 16 14:03:18.620423 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.620421 2564 scope.go:117] "RemoveContainer" containerID="2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4" Apr 16 14:03:18.620653 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.620620 2564 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4"} err="failed to get container status \"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": rpc error: code = NotFound desc = could not find container \"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": container with ID starting with 2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4 not found: ID does not exist" Apr 16 14:03:18.620653 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.620643 2564 scope.go:117] "RemoveContainer" containerID="dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b" Apr 16 14:03:18.620890 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.620870 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b"} err="failed to get container status \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": rpc error: code = NotFound desc = could not find container \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": container with ID starting with dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b not found: ID does not exist" Apr 16 14:03:18.620890 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.620888 2564 scope.go:117] "RemoveContainer" containerID="7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97" Apr 16 14:03:18.621196 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.621107 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97"} err="failed to get container status \"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": rpc error: code = NotFound desc = could not find container \"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": container with ID starting with 
7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97 not found: ID does not exist"
Apr 16 14:03:18.621196 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.621132 2564 scope.go:117] "RemoveContainer" containerID="9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9"
Apr 16 14:03:18.621681 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.621595 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9"} err="failed to get container status \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": rpc error: code = NotFound desc = could not find container \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": container with ID starting with 9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9 not found: ID does not exist"
Apr 16 14:03:18.621681 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.621619 2564 scope.go:117] "RemoveContainer" containerID="2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5"
Apr 16 14:03:18.622078 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.621994 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5"} err="failed to get container status \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": rpc error: code = NotFound desc = could not find container \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": container with ID starting with 2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5 not found: ID does not exist"
Apr 16 14:03:18.622078 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.622017 2564 scope.go:117] "RemoveContainer" containerID="60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506"
Apr 16 14:03:18.622545 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.622466 2564
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506"} err="failed to get container status \"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": rpc error: code = NotFound desc = could not find container \"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": container with ID starting with 60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506 not found: ID does not exist"
Apr 16 14:03:18.622545 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.622487 2564 scope.go:117] "RemoveContainer" containerID="82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda"
Apr 16 14:03:18.622920 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.622894 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda"} err="failed to get container status \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": rpc error: code = NotFound desc = could not find container \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": container with ID starting with 82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda not found: ID does not exist"
Apr 16 14:03:18.622920 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.622920 2564 scope.go:117] "RemoveContainer" containerID="2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4"
Apr 16 14:03:18.623239 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.623193 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4"} err="failed to get container status \"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": rpc error: code = NotFound desc = could not find container
\"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": container with ID starting with 2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4 not found: ID does not exist"
Apr 16 14:03:18.623239 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.623238 2564 scope.go:117] "RemoveContainer" containerID="dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b"
Apr 16 14:03:18.623548 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.623529 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b"} err="failed to get container status \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": rpc error: code = NotFound desc = could not find container \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": container with ID starting with dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b not found: ID does not exist"
Apr 16 14:03:18.623548 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.623547 2564 scope.go:117] "RemoveContainer" containerID="7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97"
Apr 16 14:03:18.623778 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.623760 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97"} err="failed to get container status \"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": rpc error: code = NotFound desc = could not find container \"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": container with ID starting with 7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97 not found: ID does not exist"
Apr 16 14:03:18.623847 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.623784 2564 scope.go:117] "RemoveContainer"
containerID="9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9"
Apr 16 14:03:18.624013 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.623991 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9"} err="failed to get container status \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": rpc error: code = NotFound desc = could not find container \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": container with ID starting with 9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9 not found: ID does not exist"
Apr 16 14:03:18.624075 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.624020 2564 scope.go:117] "RemoveContainer" containerID="2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5"
Apr 16 14:03:18.624230 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.624187 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5"} err="failed to get container status \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": rpc error: code = NotFound desc = could not find container \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": container with ID starting with 2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5 not found: ID does not exist"
Apr 16 14:03:18.624230 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.624228 2564 scope.go:117] "RemoveContainer" containerID="60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506"
Apr 16 14:03:18.624442 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.624426 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506"} err="failed to get container status
\"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": rpc error: code = NotFound desc = could not find container \"60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506\": container with ID starting with 60c2fc4bafaa31b8220f81512266eda9fc61bceda904d7c41357567df8889506 not found: ID does not exist"
Apr 16 14:03:18.624488 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.624443 2564 scope.go:117] "RemoveContainer" containerID="82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda"
Apr 16 14:03:18.624652 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.624634 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda"} err="failed to get container status \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": rpc error: code = NotFound desc = could not find container \"82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda\": container with ID starting with 82f344cb8331d255da10f768f73145e50aad54808f3b0633eefadf0211d5dcda not found: ID does not exist"
Apr 16 14:03:18.624702 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.624652 2564 scope.go:117] "RemoveContainer" containerID="2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4"
Apr 16 14:03:18.624909 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.624889 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4"} err="failed to get container status \"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": rpc error: code = NotFound desc = could not find container \"2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4\": container with ID starting with 2165ffbc4c38bf484c43ee4172266b524692ba6c1681568b8059364afbec95d4 not found: ID does not exist"
Apr 16 14:03:18.624956 ip-10-0-140-244
kubenswrapper[2564]: I0416 14:03:18.624910 2564 scope.go:117] "RemoveContainer" containerID="dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b"
Apr 16 14:03:18.625101 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.625086 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b"} err="failed to get container status \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": rpc error: code = NotFound desc = could not find container \"dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b\": container with ID starting with dccd2a0fe72f7a0e0722a5dd77eb81cd5661b3af130283e4a11f4eb981048a3b not found: ID does not exist"
Apr 16 14:03:18.625101 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.625099 2564 scope.go:117] "RemoveContainer" containerID="7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97"
Apr 16 14:03:18.625334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.625316 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97"} err="failed to get container status \"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": rpc error: code = NotFound desc = could not find container \"7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97\": container with ID starting with 7e638434c330f92959411c349a8040ac7186353fe53a19ccb554e3e7ac7f1c97 not found: ID does not exist"
Apr 16 14:03:18.625384 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.625335 2564 scope.go:117] "RemoveContainer" containerID="9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9"
Apr 16 14:03:18.625547 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.625530 2564 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9"} err="failed to get container status \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": rpc error: code = NotFound desc = could not find container \"9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9\": container with ID starting with 9414b162a46ba317f281a4694a4703bfd51c1ef02b0890780c7fb8a985a280a9 not found: ID does not exist"
Apr 16 14:03:18.625596 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.625548 2564 scope.go:117] "RemoveContainer" containerID="2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5"
Apr 16 14:03:18.625746 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.625731 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5"} err="failed to get container status \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": rpc error: code = NotFound desc = could not find container \"2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5\": container with ID starting with 2772fbac6c269b8e0bc4f2ea9b34fde31d9b63e859cc22da8de0d3173b1d44d5 not found: ID does not exist"
Apr 16 14:03:18.625917 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.625902 2564 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.628594 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.628574 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 14:03:18.628688 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.628581 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 14:03:18.628865 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.628840 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 14:03:18.628929 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.628862 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 14:03:18.628986 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.628923 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 14:03:18.629100 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.629082 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 14:03:18.629163 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.629134 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 14:03:18.629521 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.629503 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 14:03:18.629619 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.629573 2564 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-vwv82\""
Apr 16 14:03:18.629935 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.629801 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 14:03:18.629935 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.629825 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 14:03:18.629935 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.629837 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ff96megmomqat\""
Apr 16 14:03:18.631094 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.631076 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 14:03:18.635038 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.634860 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 14:03:18.636335 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.636314 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:03:18.717921 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.717888 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-config-out\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.717921 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.717925 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\"
(UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718186 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.717945 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718186 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.717971 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718186 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.717998 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-web-config\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718186 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.718027 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718186 ip-10-0-140-244
kubenswrapper[2564]: I0416 14:03:18.718047 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718186 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.718073 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718186 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.718100 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-config\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718186 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.718113 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718186 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.718133 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName:
\"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718478 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.718246 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718478 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.718274 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718478 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.718313 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718478 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.718336 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718478 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.718356 2564
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718478 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.718404 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.718478 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.718425 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzqb9\" (UniqueName: \"kubernetes.io/projected/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-kube-api-access-zzqb9\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819119 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819030 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819119 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819068 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID:
\"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819119 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819085 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819403 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819154 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-web-config\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819403 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819239 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819403 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819285 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819403 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819324 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName:
\"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819608 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819439 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-config\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819608 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819489 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819608 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819524 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819608 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819568 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819608 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819599 2564 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819882 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819640 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819882 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819664 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819882 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819692 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819882 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819741 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819882 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819765 2564
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzqb9\" (UniqueName: \"kubernetes.io/projected/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-kube-api-access-zzqb9\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.819882 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819824 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-config-out\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.820180 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.819922 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.820180 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.820092 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.821033 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.820986 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:18.821839 ip-10-0-140-244
kubenswrapper[2564]: I0416 14:03:18.821236 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.821839 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.821800 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.823252 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.822704 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-web-config\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.823252 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.823148 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.824184 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.823870 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.824184 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.824149 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.824363 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.824243 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.824363 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.824309 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.824472 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.824423 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.824696 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.824676 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-config\") pod 
\"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.824748 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.824738 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.825110 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.825092 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.825148 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.825098 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-config-out\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.825258 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.825242 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.828698 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.828680 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzqb9\" (UniqueName: \"kubernetes.io/projected/14b8d935-5c9b-43fb-8e82-9ae3fa73f51a-kube-api-access-zzqb9\") pod \"prometheus-k8s-0\" (UID: 
\"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:18.937431 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:18.937398 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:19.069441 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:19.069414 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:03:19.071628 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:03:19.071603 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14b8d935_5c9b_43fb_8e82_9ae3fa73f51a.slice/crio-ce7dc7bfffd7ad3f05e9e7cd64c2d3d28a3c9bd6fc127e6d3cd14904872e65ef WatchSource:0}: Error finding container ce7dc7bfffd7ad3f05e9e7cd64c2d3d28a3c9bd6fc127e6d3cd14904872e65ef: Status 404 returned error can't find the container with id ce7dc7bfffd7ad3f05e9e7cd64c2d3d28a3c9bd6fc127e6d3cd14904872e65ef Apr 16 14:03:19.567196 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:19.567159 2564 generic.go:358] "Generic (PLEG): container finished" podID="14b8d935-5c9b-43fb-8e82-9ae3fa73f51a" containerID="cca2f7ae0b7244f787211ac07f055ee962815d1b23e653ea9d7fc5eb9b3e7085" exitCode=0 Apr 16 14:03:19.567377 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:19.567225 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a","Type":"ContainerDied","Data":"cca2f7ae0b7244f787211ac07f055ee962815d1b23e653ea9d7fc5eb9b3e7085"} Apr 16 14:03:19.567377 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:19.567252 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a","Type":"ContainerStarted","Data":"ce7dc7bfffd7ad3f05e9e7cd64c2d3d28a3c9bd6fc127e6d3cd14904872e65ef"} Apr 16 14:03:19.777325 
ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:19.777264 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a9dd19-6abe-420f-9d15-7618336ffece" path="/var/lib/kubelet/pods/57a9dd19-6abe-420f-9d15-7618336ffece/volumes" Apr 16 14:03:20.573357 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:20.573323 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a","Type":"ContainerStarted","Data":"c2caaed01a59f2a53df9f3f9f677cb7c509e7f879df765e46b2b2aa6d1b56de9"} Apr 16 14:03:20.573357 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:20.573358 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a","Type":"ContainerStarted","Data":"80188247ee1f8bf8957c3fe0cb97f0cecd4e2cd03b3cbbe0112661100442581a"} Apr 16 14:03:20.573357 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:20.573367 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a","Type":"ContainerStarted","Data":"f5f003df71367e87ee4488d9cd09a6289aa95315a7bf7bb58d04bec8eecd50d3"} Apr 16 14:03:20.573775 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:20.573376 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a","Type":"ContainerStarted","Data":"8056128173fc6625b18ea7bf4ea32ab8551247b1be98cc5dcacd49f9e5930acf"} Apr 16 14:03:20.573775 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:20.573384 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a","Type":"ContainerStarted","Data":"002bc11043089c1f67c3d062cf3369d999a7105511c483e203b8910ea60986cc"} Apr 16 14:03:20.573775 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:20.573392 2564 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14b8d935-5c9b-43fb-8e82-9ae3fa73f51a","Type":"ContainerStarted","Data":"48dedbcc4206ed713fdfade79bd2b3f0d3da2f42d4dc9c6893e0a3f23e9c349a"} Apr 16 14:03:20.602518 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:20.602467 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.6024525560000002 podStartE2EDuration="2.602452556s" podCreationTimestamp="2026-04-16 14:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:03:20.601624784 +0000 UTC m=+237.420990163" watchObservedRunningTime="2026-04-16 14:03:20.602452556 +0000 UTC m=+237.421817905" Apr 16 14:03:23.938271 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:03:23.938139 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:04:18.937868 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:04:18.937819 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:04:18.953261 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:04:18.953237 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:04:19.760160 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:04:19.760134 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:04:23.688752 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:04:23.688724 2564 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:07:02.855065 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:02.855031 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-599gh"] Apr 16 
14:07:02.858507 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:02.858491 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-599gh" Apr 16 14:07:02.860902 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:02.860879 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 14:07:02.861753 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:02.861735 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-kp69x\"" Apr 16 14:07:02.861836 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:02.861758 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 14:07:02.866522 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:02.866502 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-599gh"] Apr 16 14:07:03.025319 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:03.025284 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cc7562fa-742c-4d2f-afe5-18ca9f069833-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-599gh\" (UID: \"cc7562fa-742c-4d2f-afe5-18ca9f069833\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-599gh" Apr 16 14:07:03.025491 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:03.025327 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k25w9\" (UniqueName: \"kubernetes.io/projected/cc7562fa-742c-4d2f-afe5-18ca9f069833-kube-api-access-k25w9\") pod \"cert-manager-cainjector-8966b78d4-599gh\" (UID: \"cc7562fa-742c-4d2f-afe5-18ca9f069833\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-599gh" Apr 16 14:07:03.126560 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:07:03.126464 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cc7562fa-742c-4d2f-afe5-18ca9f069833-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-599gh\" (UID: \"cc7562fa-742c-4d2f-afe5-18ca9f069833\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-599gh" Apr 16 14:07:03.126560 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:03.126517 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k25w9\" (UniqueName: \"kubernetes.io/projected/cc7562fa-742c-4d2f-afe5-18ca9f069833-kube-api-access-k25w9\") pod \"cert-manager-cainjector-8966b78d4-599gh\" (UID: \"cc7562fa-742c-4d2f-afe5-18ca9f069833\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-599gh" Apr 16 14:07:03.135005 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:03.134966 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cc7562fa-742c-4d2f-afe5-18ca9f069833-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-599gh\" (UID: \"cc7562fa-742c-4d2f-afe5-18ca9f069833\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-599gh" Apr 16 14:07:03.135121 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:03.135067 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k25w9\" (UniqueName: \"kubernetes.io/projected/cc7562fa-742c-4d2f-afe5-18ca9f069833-kube-api-access-k25w9\") pod \"cert-manager-cainjector-8966b78d4-599gh\" (UID: \"cc7562fa-742c-4d2f-afe5-18ca9f069833\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-599gh" Apr 16 14:07:03.178439 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:03.178414 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-599gh" Apr 16 14:07:03.293456 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:03.293350 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-599gh"] Apr 16 14:07:03.296533 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:07:03.296503 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc7562fa_742c_4d2f_afe5_18ca9f069833.slice/crio-09f4b1893286e6ebf21f33e064d336617f392e1d15202896fdf2277770a66d4a WatchSource:0}: Error finding container 09f4b1893286e6ebf21f33e064d336617f392e1d15202896fdf2277770a66d4a: Status 404 returned error can't find the container with id 09f4b1893286e6ebf21f33e064d336617f392e1d15202896fdf2277770a66d4a Apr 16 14:07:03.298385 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:03.298370 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:07:04.217624 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:04.217587 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-599gh" event={"ID":"cc7562fa-742c-4d2f-afe5-18ca9f069833","Type":"ContainerStarted","Data":"09f4b1893286e6ebf21f33e064d336617f392e1d15202896fdf2277770a66d4a"} Apr 16 14:07:07.227983 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:07.227939 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-599gh" event={"ID":"cc7562fa-742c-4d2f-afe5-18ca9f069833","Type":"ContainerStarted","Data":"035856d19dbf96f447f0f10a4461bd69fc245cf0e4520e0e9fc0e025d50194d4"} Apr 16 14:07:07.245000 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:07.244808 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-599gh" podStartSLOduration=1.851648746 podStartE2EDuration="5.244789595s" 
podCreationTimestamp="2026-04-16 14:07:02 +0000 UTC" firstStartedPulling="2026-04-16 14:07:03.298491749 +0000 UTC m=+460.117857082" lastFinishedPulling="2026-04-16 14:07:06.691632599 +0000 UTC m=+463.510997931" observedRunningTime="2026-04-16 14:07:07.244584406 +0000 UTC m=+464.063949760" watchObservedRunningTime="2026-04-16 14:07:07.244789595 +0000 UTC m=+464.064154948" Apr 16 14:07:45.955485 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:45.955418 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4"] Apr 16 14:07:45.959148 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:45.959124 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:45.962015 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:45.961995 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-9mh4m\"" Apr 16 14:07:45.962617 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:45.962590 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 14:07:45.962735 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:45.962623 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 14:07:45.962735 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:45.962638 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 14:07:45.962735 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:45.962678 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 14:07:45.962735 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:45.962624 2564 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:07:45.970175 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:45.970153 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4"] Apr 16 14:07:46.062139 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.062104 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r58kg\" (UniqueName: \"kubernetes.io/projected/80384fab-34a9-48a0-b240-612d519c2f86-kube-api-access-r58kg\") pod \"lws-controller-manager-5b8748f956-qntz4\" (UID: \"80384fab-34a9-48a0-b240-612d519c2f86\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.062310 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.062156 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/80384fab-34a9-48a0-b240-612d519c2f86-manager-config\") pod \"lws-controller-manager-5b8748f956-qntz4\" (UID: \"80384fab-34a9-48a0-b240-612d519c2f86\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.062310 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.062247 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/80384fab-34a9-48a0-b240-612d519c2f86-metrics-cert\") pod \"lws-controller-manager-5b8748f956-qntz4\" (UID: \"80384fab-34a9-48a0-b240-612d519c2f86\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.062381 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.062320 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80384fab-34a9-48a0-b240-612d519c2f86-cert\") pod 
\"lws-controller-manager-5b8748f956-qntz4\" (UID: \"80384fab-34a9-48a0-b240-612d519c2f86\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.162892 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.162854 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80384fab-34a9-48a0-b240-612d519c2f86-cert\") pod \"lws-controller-manager-5b8748f956-qntz4\" (UID: \"80384fab-34a9-48a0-b240-612d519c2f86\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.162892 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.162892 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r58kg\" (UniqueName: \"kubernetes.io/projected/80384fab-34a9-48a0-b240-612d519c2f86-kube-api-access-r58kg\") pod \"lws-controller-manager-5b8748f956-qntz4\" (UID: \"80384fab-34a9-48a0-b240-612d519c2f86\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.163098 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.162934 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/80384fab-34a9-48a0-b240-612d519c2f86-manager-config\") pod \"lws-controller-manager-5b8748f956-qntz4\" (UID: \"80384fab-34a9-48a0-b240-612d519c2f86\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.163098 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.162967 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/80384fab-34a9-48a0-b240-612d519c2f86-metrics-cert\") pod \"lws-controller-manager-5b8748f956-qntz4\" (UID: \"80384fab-34a9-48a0-b240-612d519c2f86\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.163675 ip-10-0-140-244 kubenswrapper[2564]: I0416 
14:07:46.163650 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/80384fab-34a9-48a0-b240-612d519c2f86-manager-config\") pod \"lws-controller-manager-5b8748f956-qntz4\" (UID: \"80384fab-34a9-48a0-b240-612d519c2f86\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.165358 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.165336 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80384fab-34a9-48a0-b240-612d519c2f86-cert\") pod \"lws-controller-manager-5b8748f956-qntz4\" (UID: \"80384fab-34a9-48a0-b240-612d519c2f86\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.165445 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.165411 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/80384fab-34a9-48a0-b240-612d519c2f86-metrics-cert\") pod \"lws-controller-manager-5b8748f956-qntz4\" (UID: \"80384fab-34a9-48a0-b240-612d519c2f86\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.175195 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.175163 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r58kg\" (UniqueName: \"kubernetes.io/projected/80384fab-34a9-48a0-b240-612d519c2f86-kube-api-access-r58kg\") pod \"lws-controller-manager-5b8748f956-qntz4\" (UID: \"80384fab-34a9-48a0-b240-612d519c2f86\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.269017 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.268934 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:46.403248 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:46.402955 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4"] Apr 16 14:07:46.404044 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:07:46.403648 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80384fab_34a9_48a0_b240_612d519c2f86.slice/crio-88042c2f52d3fba8a357fb54754b37c0cb7c2faac8eaf269676bad7dd8aa0114 WatchSource:0}: Error finding container 88042c2f52d3fba8a357fb54754b37c0cb7c2faac8eaf269676bad7dd8aa0114: Status 404 returned error can't find the container with id 88042c2f52d3fba8a357fb54754b37c0cb7c2faac8eaf269676bad7dd8aa0114 Apr 16 14:07:47.351347 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:47.351304 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" event={"ID":"80384fab-34a9-48a0-b240-612d519c2f86","Type":"ContainerStarted","Data":"88042c2f52d3fba8a357fb54754b37c0cb7c2faac8eaf269676bad7dd8aa0114"} Apr 16 14:07:49.358820 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:49.358785 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" event={"ID":"80384fab-34a9-48a0-b240-612d519c2f86","Type":"ContainerStarted","Data":"94132e9eb12806087820119ac3681b3aaff97282a6eb0feef41a09322b7d7e34"} Apr 16 14:07:49.359186 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:49.358847 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:07:49.378244 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:07:49.378168 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" podStartSLOduration=1.6267562789999999 podStartE2EDuration="4.378156001s" podCreationTimestamp="2026-04-16 14:07:45 +0000 UTC" firstStartedPulling="2026-04-16 14:07:46.405619997 +0000 UTC m=+503.224985330" lastFinishedPulling="2026-04-16 14:07:49.157019708 +0000 UTC m=+505.976385052" observedRunningTime="2026-04-16 14:07:49.377156743 +0000 UTC m=+506.196522096" watchObservedRunningTime="2026-04-16 14:07:49.378156001 +0000 UTC m=+506.197521353" Apr 16 14:08:00.364161 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:00.364126 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5b8748f956-qntz4" Apr 16 14:08:01.333154 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.333102 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7"] Apr 16 14:08:01.336742 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.336719 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.339305 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.339283 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-r6jp4\"" Apr 16 14:08:01.339305 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.339295 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 14:08:01.346975 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.346948 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7"] Apr 16 14:08:01.495478 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.495386 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3bb5710b-5550-4097-9967-7df2e8c5b723-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.495478 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.495424 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.495933 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.495493 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.495933 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.495560 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.495933 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.495602 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3bb5710b-5550-4097-9967-7df2e8c5b723-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.495933 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.495638 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.495933 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.495697 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/3bb5710b-5550-4097-9967-7df2e8c5b723-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.495933 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.495763 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.495933 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.495781 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc9kl\" (UniqueName: \"kubernetes.io/projected/3bb5710b-5550-4097-9967-7df2e8c5b723-kube-api-access-hc9kl\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.596561 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.596527 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.596743 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.596577 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.596743 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.596613 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.596743 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.596647 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3bb5710b-5550-4097-9967-7df2e8c5b723-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.596743 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.596682 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.596743 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.596708 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3bb5710b-5550-4097-9967-7df2e8c5b723-istio-podinfo\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.597010 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.596755 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.597010 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.596776 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hc9kl\" (UniqueName: \"kubernetes.io/projected/3bb5710b-5550-4097-9967-7df2e8c5b723-kube-api-access-hc9kl\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.597010 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.596818 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3bb5710b-5550-4097-9967-7df2e8c5b723-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.597156 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.597075 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: 
\"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.597156 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.597096 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.597295 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.597163 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.597295 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.597229 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.597402 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.597385 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3bb5710b-5550-4097-9967-7df2e8c5b723-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 
14:08:01.599056 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.599034 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3bb5710b-5550-4097-9967-7df2e8c5b723-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.599301 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.599283 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3bb5710b-5550-4097-9967-7df2e8c5b723-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.604539 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.604516 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc9kl\" (UniqueName: \"kubernetes.io/projected/3bb5710b-5550-4097-9967-7df2e8c5b723-kube-api-access-hc9kl\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.604617 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.604581 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3bb5710b-5550-4097-9967-7df2e8c5b723-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4npv7\" (UID: \"3bb5710b-5550-4097-9967-7df2e8c5b723\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.649145 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.649102 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:01.774542 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:01.774512 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7"] Apr 16 14:08:01.774647 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:08:01.774618 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb5710b_5550_4097_9967_7df2e8c5b723.slice/crio-2c2c3032b7b24b1a47c91b38acb6f084ec5b91fbbd9c070227702b5a8f26a1c7 WatchSource:0}: Error finding container 2c2c3032b7b24b1a47c91b38acb6f084ec5b91fbbd9c070227702b5a8f26a1c7: Status 404 returned error can't find the container with id 2c2c3032b7b24b1a47c91b38acb6f084ec5b91fbbd9c070227702b5a8f26a1c7 Apr 16 14:08:02.399916 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:02.399879 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" event={"ID":"3bb5710b-5550-4097-9967-7df2e8c5b723","Type":"ContainerStarted","Data":"2c2c3032b7b24b1a47c91b38acb6f084ec5b91fbbd9c070227702b5a8f26a1c7"} Apr 16 14:08:04.422792 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:04.422755 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 14:08:04.423079 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:04.422829 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 14:08:04.423079 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:04.422857 2564 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 14:08:05.410348 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:05.410312 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" event={"ID":"3bb5710b-5550-4097-9967-7df2e8c5b723","Type":"ContainerStarted","Data":"ae2471546032d8ab8282cb011c6fe9192a9dc74055a27a6ddbab754966a6edf2"} Apr 16 14:08:05.436098 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:05.436050 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" podStartSLOduration=1.790427648 podStartE2EDuration="4.436036513s" podCreationTimestamp="2026-04-16 14:08:01 +0000 UTC" firstStartedPulling="2026-04-16 14:08:01.776859791 +0000 UTC m=+518.596225134" lastFinishedPulling="2026-04-16 14:08:04.422468654 +0000 UTC m=+521.241833999" observedRunningTime="2026-04-16 14:08:05.435271412 +0000 UTC m=+522.254636758" watchObservedRunningTime="2026-04-16 14:08:05.436036513 +0000 UTC m=+522.255401866" Apr 16 14:08:05.649920 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:05.649878 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:05.654419 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:05.654396 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:06.413580 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:06.413506 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:06.414519 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:06.414499 2564 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4npv7" Apr 16 14:08:29.085295 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:29.085262 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-p4qgl"] Apr 16 14:08:29.096826 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:29.096797 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-p4qgl" Apr 16 14:08:29.101589 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:29.101567 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 14:08:29.101735 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:29.101591 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 14:08:29.101795 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:29.101733 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-xc2gj\"" Apr 16 14:08:29.111338 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:29.111312 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-p4qgl"] Apr 16 14:08:29.121985 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:29.121955 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh2bs\" (UniqueName: \"kubernetes.io/projected/fe191a4e-dcc4-4632-b3df-df170611b410-kube-api-access-mh2bs\") pod \"authorino-operator-7587b89b76-p4qgl\" (UID: \"fe191a4e-dcc4-4632-b3df-df170611b410\") " pod="kuadrant-system/authorino-operator-7587b89b76-p4qgl" Apr 16 14:08:29.222773 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:29.222739 2564 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-mh2bs\" (UniqueName: \"kubernetes.io/projected/fe191a4e-dcc4-4632-b3df-df170611b410-kube-api-access-mh2bs\") pod \"authorino-operator-7587b89b76-p4qgl\" (UID: \"fe191a4e-dcc4-4632-b3df-df170611b410\") " pod="kuadrant-system/authorino-operator-7587b89b76-p4qgl" Apr 16 14:08:29.244092 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:29.244059 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh2bs\" (UniqueName: \"kubernetes.io/projected/fe191a4e-dcc4-4632-b3df-df170611b410-kube-api-access-mh2bs\") pod \"authorino-operator-7587b89b76-p4qgl\" (UID: \"fe191a4e-dcc4-4632-b3df-df170611b410\") " pod="kuadrant-system/authorino-operator-7587b89b76-p4qgl" Apr 16 14:08:29.408194 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:29.408109 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-p4qgl" Apr 16 14:08:29.529057 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:29.529032 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-p4qgl"] Apr 16 14:08:29.531332 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:08:29.531304 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe191a4e_dcc4_4632_b3df_df170611b410.slice/crio-afc89c2d447e6dd85d6e00c320e2d91326a4911c468ca5bb06db38f2b41f5710 WatchSource:0}: Error finding container afc89c2d447e6dd85d6e00c320e2d91326a4911c468ca5bb06db38f2b41f5710: Status 404 returned error can't find the container with id afc89c2d447e6dd85d6e00c320e2d91326a4911c468ca5bb06db38f2b41f5710 Apr 16 14:08:30.493146 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:30.493108 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-p4qgl" 
event={"ID":"fe191a4e-dcc4-4632-b3df-df170611b410","Type":"ContainerStarted","Data":"afc89c2d447e6dd85d6e00c320e2d91326a4911c468ca5bb06db38f2b41f5710"} Apr 16 14:08:33.505159 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:33.505126 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-p4qgl" event={"ID":"fe191a4e-dcc4-4632-b3df-df170611b410","Type":"ContainerStarted","Data":"7d73ceb269ec96d0cf061b9f539cc6d6aea6ddba8a8783867b1dbf3d8988837a"} Apr 16 14:08:33.505560 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:33.505341 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-p4qgl" Apr 16 14:08:33.524253 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:33.524178 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-p4qgl" podStartSLOduration=1.362599468 podStartE2EDuration="4.524161635s" podCreationTimestamp="2026-04-16 14:08:29 +0000 UTC" firstStartedPulling="2026-04-16 14:08:29.533452665 +0000 UTC m=+546.352817998" lastFinishedPulling="2026-04-16 14:08:32.695014833 +0000 UTC m=+549.514380165" observedRunningTime="2026-04-16 14:08:33.522181864 +0000 UTC m=+550.341547216" watchObservedRunningTime="2026-04-16 14:08:33.524161635 +0000 UTC m=+550.343526988" Apr 16 14:08:44.511212 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:08:44.511166 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-p4qgl" Apr 16 14:09:15.055952 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:15.055910 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-w8m85"] Apr 16 14:09:15.058307 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:15.058291 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-w8m85" Apr 16 14:09:15.060731 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:15.060711 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-p96kd\"" Apr 16 14:09:15.063771 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:15.063748 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-w8m85"] Apr 16 14:09:15.225357 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:15.225324 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qfdm\" (UniqueName: \"kubernetes.io/projected/1496bb3c-ef39-4047-a017-88dd7918f7ab-kube-api-access-8qfdm\") pod \"authorino-79cbc94b89-w8m85\" (UID: \"1496bb3c-ef39-4047-a017-88dd7918f7ab\") " pod="kuadrant-system/authorino-79cbc94b89-w8m85" Apr 16 14:09:15.326632 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:15.326603 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qfdm\" (UniqueName: \"kubernetes.io/projected/1496bb3c-ef39-4047-a017-88dd7918f7ab-kube-api-access-8qfdm\") pod \"authorino-79cbc94b89-w8m85\" (UID: \"1496bb3c-ef39-4047-a017-88dd7918f7ab\") " pod="kuadrant-system/authorino-79cbc94b89-w8m85" Apr 16 14:09:15.334565 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:15.334542 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qfdm\" (UniqueName: \"kubernetes.io/projected/1496bb3c-ef39-4047-a017-88dd7918f7ab-kube-api-access-8qfdm\") pod \"authorino-79cbc94b89-w8m85\" (UID: \"1496bb3c-ef39-4047-a017-88dd7918f7ab\") " pod="kuadrant-system/authorino-79cbc94b89-w8m85" Apr 16 14:09:15.368470 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:15.368440 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-w8m85" Apr 16 14:09:15.489120 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:15.489044 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-w8m85"] Apr 16 14:09:15.490447 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:09:15.490417 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1496bb3c_ef39_4047_a017_88dd7918f7ab.slice/crio-0f634fb6c75e5e9b40392a31f75c17d5b30a969865c9f1ee57bff49a3258faf5 WatchSource:0}: Error finding container 0f634fb6c75e5e9b40392a31f75c17d5b30a969865c9f1ee57bff49a3258faf5: Status 404 returned error can't find the container with id 0f634fb6c75e5e9b40392a31f75c17d5b30a969865c9f1ee57bff49a3258faf5 Apr 16 14:09:15.656550 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:15.656466 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-w8m85" event={"ID":"1496bb3c-ef39-4047-a017-88dd7918f7ab","Type":"ContainerStarted","Data":"0f634fb6c75e5e9b40392a31f75c17d5b30a969865c9f1ee57bff49a3258faf5"} Apr 16 14:09:18.668676 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:18.668642 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-w8m85" event={"ID":"1496bb3c-ef39-4047-a017-88dd7918f7ab","Type":"ContainerStarted","Data":"022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1"} Apr 16 14:09:18.684639 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:18.684597 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-w8m85" podStartSLOduration=1.319347792 podStartE2EDuration="3.684581108s" podCreationTimestamp="2026-04-16 14:09:15 +0000 UTC" firstStartedPulling="2026-04-16 14:09:15.491859912 +0000 UTC m=+592.311225249" lastFinishedPulling="2026-04-16 14:09:17.857093234 +0000 UTC m=+594.676458565" 
observedRunningTime="2026-04-16 14:09:18.68377743 +0000 UTC m=+595.503142781" watchObservedRunningTime="2026-04-16 14:09:18.684581108 +0000 UTC m=+595.503946460" Apr 16 14:09:39.325116 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:39.325038 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-wc2qz"] Apr 16 14:09:39.327724 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:39.327703 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-wc2qz" Apr 16 14:09:39.330428 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:39.330397 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 14:09:39.335501 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:39.335479 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-wc2qz"] Apr 16 14:09:39.450617 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:39.450578 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gfqb\" (UniqueName: \"kubernetes.io/projected/d8bb742e-925e-4a94-8b9b-767c561d2f49-kube-api-access-6gfqb\") pod \"authorino-68bd676465-wc2qz\" (UID: \"d8bb742e-925e-4a94-8b9b-767c561d2f49\") " pod="kuadrant-system/authorino-68bd676465-wc2qz" Apr 16 14:09:39.450801 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:39.450651 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d8bb742e-925e-4a94-8b9b-767c561d2f49-tls-cert\") pod \"authorino-68bd676465-wc2qz\" (UID: \"d8bb742e-925e-4a94-8b9b-767c561d2f49\") " pod="kuadrant-system/authorino-68bd676465-wc2qz" Apr 16 14:09:39.551638 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:39.551595 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gfqb\" 
(UniqueName: \"kubernetes.io/projected/d8bb742e-925e-4a94-8b9b-767c561d2f49-kube-api-access-6gfqb\") pod \"authorino-68bd676465-wc2qz\" (UID: \"d8bb742e-925e-4a94-8b9b-767c561d2f49\") " pod="kuadrant-system/authorino-68bd676465-wc2qz" Apr 16 14:09:39.551813 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:39.551653 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d8bb742e-925e-4a94-8b9b-767c561d2f49-tls-cert\") pod \"authorino-68bd676465-wc2qz\" (UID: \"d8bb742e-925e-4a94-8b9b-767c561d2f49\") " pod="kuadrant-system/authorino-68bd676465-wc2qz" Apr 16 14:09:39.554128 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:39.554104 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d8bb742e-925e-4a94-8b9b-767c561d2f49-tls-cert\") pod \"authorino-68bd676465-wc2qz\" (UID: \"d8bb742e-925e-4a94-8b9b-767c561d2f49\") " pod="kuadrant-system/authorino-68bd676465-wc2qz" Apr 16 14:09:39.561506 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:39.561480 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gfqb\" (UniqueName: \"kubernetes.io/projected/d8bb742e-925e-4a94-8b9b-767c561d2f49-kube-api-access-6gfqb\") pod \"authorino-68bd676465-wc2qz\" (UID: \"d8bb742e-925e-4a94-8b9b-767c561d2f49\") " pod="kuadrant-system/authorino-68bd676465-wc2qz" Apr 16 14:09:39.637921 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:39.637832 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-wc2qz" Apr 16 14:09:39.761459 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:39.761436 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-wc2qz"] Apr 16 14:09:39.763219 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:09:39.763183 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8bb742e_925e_4a94_8b9b_767c561d2f49.slice/crio-151cfa7264d7f870627d11e184f1fc45d718736b13f9e8871bdff3bf89b283d4 WatchSource:0}: Error finding container 151cfa7264d7f870627d11e184f1fc45d718736b13f9e8871bdff3bf89b283d4: Status 404 returned error can't find the container with id 151cfa7264d7f870627d11e184f1fc45d718736b13f9e8871bdff3bf89b283d4 Apr 16 14:09:40.742635 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:40.742600 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-wc2qz" event={"ID":"d8bb742e-925e-4a94-8b9b-767c561d2f49","Type":"ContainerStarted","Data":"8fba24604f71c64fb34111ea57c019eb43068e72072c1b54f94e8b5fd5083372"} Apr 16 14:09:40.742635 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:40.742638 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-wc2qz" event={"ID":"d8bb742e-925e-4a94-8b9b-767c561d2f49","Type":"ContainerStarted","Data":"151cfa7264d7f870627d11e184f1fc45d718736b13f9e8871bdff3bf89b283d4"} Apr 16 14:09:40.757781 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:40.757732 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-wc2qz" podStartSLOduration=1.206270281 podStartE2EDuration="1.757716519s" podCreationTimestamp="2026-04-16 14:09:39 +0000 UTC" firstStartedPulling="2026-04-16 14:09:39.764458307 +0000 UTC m=+616.583823846" lastFinishedPulling="2026-04-16 14:09:40.315904754 +0000 UTC m=+617.135270084" 
observedRunningTime="2026-04-16 14:09:40.757586595 +0000 UTC m=+617.576951959" watchObservedRunningTime="2026-04-16 14:09:40.757716519 +0000 UTC m=+617.577081871" Apr 16 14:09:40.783156 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:40.783122 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-w8m85"] Apr 16 14:09:40.783394 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:40.783370 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-w8m85" podUID="1496bb3c-ef39-4047-a017-88dd7918f7ab" containerName="authorino" containerID="cri-o://022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1" gracePeriod=30 Apr 16 14:09:41.046266 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.046242 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-w8m85" Apr 16 14:09:41.165569 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.165538 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qfdm\" (UniqueName: \"kubernetes.io/projected/1496bb3c-ef39-4047-a017-88dd7918f7ab-kube-api-access-8qfdm\") pod \"1496bb3c-ef39-4047-a017-88dd7918f7ab\" (UID: \"1496bb3c-ef39-4047-a017-88dd7918f7ab\") " Apr 16 14:09:41.167670 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.167643 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1496bb3c-ef39-4047-a017-88dd7918f7ab-kube-api-access-8qfdm" (OuterVolumeSpecName: "kube-api-access-8qfdm") pod "1496bb3c-ef39-4047-a017-88dd7918f7ab" (UID: "1496bb3c-ef39-4047-a017-88dd7918f7ab"). InnerVolumeSpecName "kube-api-access-8qfdm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:09:41.266312 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.266277 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8qfdm\" (UniqueName: \"kubernetes.io/projected/1496bb3c-ef39-4047-a017-88dd7918f7ab-kube-api-access-8qfdm\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:09:41.747338 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.747305 2564 generic.go:358] "Generic (PLEG): container finished" podID="1496bb3c-ef39-4047-a017-88dd7918f7ab" containerID="022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1" exitCode=0 Apr 16 14:09:41.747749 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.747351 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-w8m85" Apr 16 14:09:41.747749 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.747381 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-w8m85" event={"ID":"1496bb3c-ef39-4047-a017-88dd7918f7ab","Type":"ContainerDied","Data":"022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1"} Apr 16 14:09:41.747749 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.747415 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-w8m85" event={"ID":"1496bb3c-ef39-4047-a017-88dd7918f7ab","Type":"ContainerDied","Data":"0f634fb6c75e5e9b40392a31f75c17d5b30a969865c9f1ee57bff49a3258faf5"} Apr 16 14:09:41.747749 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.747430 2564 scope.go:117] "RemoveContainer" containerID="022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1" Apr 16 14:09:41.755739 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.755724 2564 scope.go:117] "RemoveContainer" containerID="022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1" Apr 16 14:09:41.755990 ip-10-0-140-244 kubenswrapper[2564]: 
E0416 14:09:41.755970 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1\": container with ID starting with 022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1 not found: ID does not exist" containerID="022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1" Apr 16 14:09:41.756039 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.755998 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1"} err="failed to get container status \"022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1\": rpc error: code = NotFound desc = could not find container \"022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1\": container with ID starting with 022c928880133a11fdd191f0626a6a1aac953ce0b61f2dbf3b4831c931a0d0d1 not found: ID does not exist" Apr 16 14:09:41.768824 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.768803 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-w8m85"] Apr 16 14:09:41.775675 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:41.775651 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-w8m85"] Apr 16 14:09:43.774387 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:43.774352 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1496bb3c-ef39-4047-a017-88dd7918f7ab" path="/var/lib/kubelet/pods/1496bb3c-ef39-4047-a017-88dd7918f7ab/volumes" Apr 16 14:09:58.345016 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.344978 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-cr7n8"] Apr 16 14:09:58.345588 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.345546 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="1496bb3c-ef39-4047-a017-88dd7918f7ab" containerName="authorino" Apr 16 14:09:58.345588 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.345567 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1496bb3c-ef39-4047-a017-88dd7918f7ab" containerName="authorino" Apr 16 14:09:58.345699 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.345661 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="1496bb3c-ef39-4047-a017-88dd7918f7ab" containerName="authorino" Apr 16 14:09:58.351369 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.351347 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-cr7n8" Apr 16 14:09:58.354520 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.354498 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 14:09:58.354772 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.354749 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 14:09:58.355679 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.355659 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vk2km\"" Apr 16 14:09:58.355875 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.355856 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 14:09:58.357273 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.357106 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-cr7n8"] Apr 16 14:09:58.527952 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.527896 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/165d3b42-50b6-4867-9354-74220d038316-data\") pod \"seaweedfs-86cc847c5c-cr7n8\" 
(UID: \"165d3b42-50b6-4867-9354-74220d038316\") " pod="kserve/seaweedfs-86cc847c5c-cr7n8" Apr 16 14:09:58.528121 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.528008 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl8hd\" (UniqueName: \"kubernetes.io/projected/165d3b42-50b6-4867-9354-74220d038316-kube-api-access-dl8hd\") pod \"seaweedfs-86cc847c5c-cr7n8\" (UID: \"165d3b42-50b6-4867-9354-74220d038316\") " pod="kserve/seaweedfs-86cc847c5c-cr7n8" Apr 16 14:09:58.628735 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.628634 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/165d3b42-50b6-4867-9354-74220d038316-data\") pod \"seaweedfs-86cc847c5c-cr7n8\" (UID: \"165d3b42-50b6-4867-9354-74220d038316\") " pod="kserve/seaweedfs-86cc847c5c-cr7n8" Apr 16 14:09:58.628907 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.628760 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dl8hd\" (UniqueName: \"kubernetes.io/projected/165d3b42-50b6-4867-9354-74220d038316-kube-api-access-dl8hd\") pod \"seaweedfs-86cc847c5c-cr7n8\" (UID: \"165d3b42-50b6-4867-9354-74220d038316\") " pod="kserve/seaweedfs-86cc847c5c-cr7n8" Apr 16 14:09:58.629022 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.629001 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/165d3b42-50b6-4867-9354-74220d038316-data\") pod \"seaweedfs-86cc847c5c-cr7n8\" (UID: \"165d3b42-50b6-4867-9354-74220d038316\") " pod="kserve/seaweedfs-86cc847c5c-cr7n8" Apr 16 14:09:58.637370 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.637341 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl8hd\" (UniqueName: \"kubernetes.io/projected/165d3b42-50b6-4867-9354-74220d038316-kube-api-access-dl8hd\") pod 
\"seaweedfs-86cc847c5c-cr7n8\" (UID: \"165d3b42-50b6-4867-9354-74220d038316\") " pod="kserve/seaweedfs-86cc847c5c-cr7n8" Apr 16 14:09:58.663711 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.663682 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-cr7n8" Apr 16 14:09:58.785957 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.785919 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-cr7n8"] Apr 16 14:09:58.788157 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:09:58.788127 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod165d3b42_50b6_4867_9354_74220d038316.slice/crio-716dd8cfd79472a9663b94515ca434f7fbb85231e939df8a923c8c7dfad52a32 WatchSource:0}: Error finding container 716dd8cfd79472a9663b94515ca434f7fbb85231e939df8a923c8c7dfad52a32: Status 404 returned error can't find the container with id 716dd8cfd79472a9663b94515ca434f7fbb85231e939df8a923c8c7dfad52a32 Apr 16 14:09:58.802882 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:09:58.802852 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-cr7n8" event={"ID":"165d3b42-50b6-4867-9354-74220d038316","Type":"ContainerStarted","Data":"716dd8cfd79472a9663b94515ca434f7fbb85231e939df8a923c8c7dfad52a32"} Apr 16 14:10:01.815978 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:10:01.815945 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-cr7n8" event={"ID":"165d3b42-50b6-4867-9354-74220d038316","Type":"ContainerStarted","Data":"317ecff35be7be523c7051624199bd2184984e3dd6737a7cacdf4a50ede1821f"} Apr 16 14:10:01.816505 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:10:01.816037 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-cr7n8" Apr 16 14:10:01.832438 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:10:01.832384 2564 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-cr7n8" podStartSLOduration=1.348421889 podStartE2EDuration="3.832363079s" podCreationTimestamp="2026-04-16 14:09:58 +0000 UTC" firstStartedPulling="2026-04-16 14:09:58.789601298 +0000 UTC m=+635.608966634" lastFinishedPulling="2026-04-16 14:10:01.273542493 +0000 UTC m=+638.092907824" observedRunningTime="2026-04-16 14:10:01.831061737 +0000 UTC m=+638.650427093" watchObservedRunningTime="2026-04-16 14:10:01.832363079 +0000 UTC m=+638.651728432" Apr 16 14:10:07.820903 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:10:07.820873 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-cr7n8" Apr 16 14:11:08.877699 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.877610 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-mjzd8"] Apr 16 14:11:08.881140 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.881120 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-mjzd8" Apr 16 14:11:08.883684 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.883663 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-mrttz\"" Apr 16 14:11:08.884022 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.884005 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 14:11:08.889903 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.889880 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-mjzd8"] Apr 16 14:11:08.895593 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.895571 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-t2r65"] Apr 16 14:11:08.898612 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.898596 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-t2r65" Apr 16 14:11:08.901080 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.901061 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-9th9c\"" Apr 16 14:11:08.901330 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.901301 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 14:11:08.910651 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.910626 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-t2r65"] Apr 16 14:11:08.958638 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.958607 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5s4c\" (UniqueName: \"kubernetes.io/projected/564df1a3-ceb0-4eae-a10f-03ab54f88c68-kube-api-access-n5s4c\") pod 
\"odh-model-controller-696fc77849-t2r65\" (UID: \"564df1a3-ceb0-4eae-a10f-03ab54f88c68\") " pod="kserve/odh-model-controller-696fc77849-t2r65" Apr 16 14:11:08.958815 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.958664 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/564df1a3-ceb0-4eae-a10f-03ab54f88c68-cert\") pod \"odh-model-controller-696fc77849-t2r65\" (UID: \"564df1a3-ceb0-4eae-a10f-03ab54f88c68\") " pod="kserve/odh-model-controller-696fc77849-t2r65" Apr 16 14:11:08.958815 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.958725 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qkxl\" (UniqueName: \"kubernetes.io/projected/74fe564b-8b0e-4c62-9c21-414f01c31a52-kube-api-access-6qkxl\") pod \"model-serving-api-86f7b4b499-mjzd8\" (UID: \"74fe564b-8b0e-4c62-9c21-414f01c31a52\") " pod="kserve/model-serving-api-86f7b4b499-mjzd8" Apr 16 14:11:08.958909 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:08.958847 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/74fe564b-8b0e-4c62-9c21-414f01c31a52-tls-certs\") pod \"model-serving-api-86f7b4b499-mjzd8\" (UID: \"74fe564b-8b0e-4c62-9c21-414f01c31a52\") " pod="kserve/model-serving-api-86f7b4b499-mjzd8" Apr 16 14:11:09.060256 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:09.060227 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5s4c\" (UniqueName: \"kubernetes.io/projected/564df1a3-ceb0-4eae-a10f-03ab54f88c68-kube-api-access-n5s4c\") pod \"odh-model-controller-696fc77849-t2r65\" (UID: \"564df1a3-ceb0-4eae-a10f-03ab54f88c68\") " pod="kserve/odh-model-controller-696fc77849-t2r65" Apr 16 14:11:09.060423 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:09.060280 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/564df1a3-ceb0-4eae-a10f-03ab54f88c68-cert\") pod \"odh-model-controller-696fc77849-t2r65\" (UID: \"564df1a3-ceb0-4eae-a10f-03ab54f88c68\") " pod="kserve/odh-model-controller-696fc77849-t2r65" Apr 16 14:11:09.060423 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:09.060310 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qkxl\" (UniqueName: \"kubernetes.io/projected/74fe564b-8b0e-4c62-9c21-414f01c31a52-kube-api-access-6qkxl\") pod \"model-serving-api-86f7b4b499-mjzd8\" (UID: \"74fe564b-8b0e-4c62-9c21-414f01c31a52\") " pod="kserve/model-serving-api-86f7b4b499-mjzd8" Apr 16 14:11:09.060423 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:09.060357 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/74fe564b-8b0e-4c62-9c21-414f01c31a52-tls-certs\") pod \"model-serving-api-86f7b4b499-mjzd8\" (UID: \"74fe564b-8b0e-4c62-9c21-414f01c31a52\") " pod="kserve/model-serving-api-86f7b4b499-mjzd8" Apr 16 14:11:09.062740 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:09.062711 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/74fe564b-8b0e-4c62-9c21-414f01c31a52-tls-certs\") pod \"model-serving-api-86f7b4b499-mjzd8\" (UID: \"74fe564b-8b0e-4c62-9c21-414f01c31a52\") " pod="kserve/model-serving-api-86f7b4b499-mjzd8" Apr 16 14:11:09.062870 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:09.062718 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/564df1a3-ceb0-4eae-a10f-03ab54f88c68-cert\") pod \"odh-model-controller-696fc77849-t2r65\" (UID: \"564df1a3-ceb0-4eae-a10f-03ab54f88c68\") " pod="kserve/odh-model-controller-696fc77849-t2r65" Apr 16 14:11:09.068046 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:09.068022 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5s4c\" (UniqueName: \"kubernetes.io/projected/564df1a3-ceb0-4eae-a10f-03ab54f88c68-kube-api-access-n5s4c\") pod \"odh-model-controller-696fc77849-t2r65\" (UID: \"564df1a3-ceb0-4eae-a10f-03ab54f88c68\") " pod="kserve/odh-model-controller-696fc77849-t2r65" Apr 16 14:11:09.068166 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:09.068067 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qkxl\" (UniqueName: \"kubernetes.io/projected/74fe564b-8b0e-4c62-9c21-414f01c31a52-kube-api-access-6qkxl\") pod \"model-serving-api-86f7b4b499-mjzd8\" (UID: \"74fe564b-8b0e-4c62-9c21-414f01c31a52\") " pod="kserve/model-serving-api-86f7b4b499-mjzd8" Apr 16 14:11:09.193987 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:09.193884 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-mjzd8" Apr 16 14:11:09.211801 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:09.211772 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-t2r65" Apr 16 14:11:09.330876 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:09.330847 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-mjzd8"] Apr 16 14:11:09.333326 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:11:09.333291 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74fe564b_8b0e_4c62_9c21_414f01c31a52.slice/crio-f1be0a8ae12e57725bea9ccc2c19328482761f690f576cfd7f6c960694c0c65c WatchSource:0}: Error finding container f1be0a8ae12e57725bea9ccc2c19328482761f690f576cfd7f6c960694c0c65c: Status 404 returned error can't find the container with id f1be0a8ae12e57725bea9ccc2c19328482761f690f576cfd7f6c960694c0c65c Apr 16 14:11:09.348598 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:09.348568 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-t2r65"] Apr 16 14:11:09.351577 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:11:09.351551 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod564df1a3_ceb0_4eae_a10f_03ab54f88c68.slice/crio-d40687b932356e48a2178931e443d4bd5a715bf331a5780d2b230024e73e983b WatchSource:0}: Error finding container d40687b932356e48a2178931e443d4bd5a715bf331a5780d2b230024e73e983b: Status 404 returned error can't find the container with id d40687b932356e48a2178931e443d4bd5a715bf331a5780d2b230024e73e983b Apr 16 14:11:10.042070 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:10.042030 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-t2r65" event={"ID":"564df1a3-ceb0-4eae-a10f-03ab54f88c68","Type":"ContainerStarted","Data":"d40687b932356e48a2178931e443d4bd5a715bf331a5780d2b230024e73e983b"} Apr 16 14:11:10.043578 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:10.043547 2564 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-mjzd8" event={"ID":"74fe564b-8b0e-4c62-9c21-414f01c31a52","Type":"ContainerStarted","Data":"f1be0a8ae12e57725bea9ccc2c19328482761f690f576cfd7f6c960694c0c65c"} Apr 16 14:11:14.062120 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:14.062080 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-t2r65" event={"ID":"564df1a3-ceb0-4eae-a10f-03ab54f88c68","Type":"ContainerStarted","Data":"9f3b8122b282efb61ec74e322a4b78b822bf33d2975a72628841671e2480b1f7"} Apr 16 14:11:14.062634 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:14.062304 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-t2r65" Apr 16 14:11:14.063678 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:14.063654 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-mjzd8" event={"ID":"74fe564b-8b0e-4c62-9c21-414f01c31a52","Type":"ContainerStarted","Data":"add30377474d11b3133109ce6c374ddc099b6e89878ceb95b2f81ebd60e9e7da"} Apr 16 14:11:14.063808 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:14.063745 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-mjzd8" Apr 16 14:11:14.078046 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:14.077983 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-t2r65" podStartSLOduration=2.210223966 podStartE2EDuration="6.077963385s" podCreationTimestamp="2026-04-16 14:11:08 +0000 UTC" firstStartedPulling="2026-04-16 14:11:09.352938756 +0000 UTC m=+706.172304089" lastFinishedPulling="2026-04-16 14:11:13.220678175 +0000 UTC m=+710.040043508" observedRunningTime="2026-04-16 14:11:14.076903306 +0000 UTC m=+710.896268659" watchObservedRunningTime="2026-04-16 14:11:14.077963385 +0000 UTC m=+710.897328739" Apr 16 
14:11:14.094444 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:14.094394 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-mjzd8" podStartSLOduration=2.212969774 podStartE2EDuration="6.094377258s" podCreationTimestamp="2026-04-16 14:11:08 +0000 UTC" firstStartedPulling="2026-04-16 14:11:09.335065971 +0000 UTC m=+706.154431305" lastFinishedPulling="2026-04-16 14:11:13.216473445 +0000 UTC m=+710.035838789" observedRunningTime="2026-04-16 14:11:14.092855547 +0000 UTC m=+710.912220901" watchObservedRunningTime="2026-04-16 14:11:14.094377258 +0000 UTC m=+710.913742610" Apr 16 14:11:25.070798 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:25.070764 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-t2r65" Apr 16 14:11:25.072610 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:25.072590 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-mjzd8" Apr 16 14:11:47.101185 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.101146 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"] Apr 16 14:11:47.104363 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.104343 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.108545 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.108518 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:11:47.109056 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.109035 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 14:11:47.109300 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.109069 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-9vzwr\"" Apr 16 14:11:47.109637 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.109540 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:11:47.124159 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.124130 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"] Apr 16 14:11:47.209021 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.208990 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.209021 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.209030 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istio-token\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.209291 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.209066 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kgpj\" (UniqueName: \"kubernetes.io/projected/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-kube-api-access-7kgpj\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.209291 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.209120 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.209291 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.209161 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.209291 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.209267 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: 
\"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.209465 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.209324 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.209465 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.209351 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.209465 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.209425 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.310197 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.310162 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.310409 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.310264 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.310409 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.310306 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.310409 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.310332 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.310409 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.310381 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" Apr 16 14:11:47.310641 ip-10-0-140-244 kubenswrapper[2564]: I0416 
14:11:47.310428 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.310641 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.310459 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.310641 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.310504 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kgpj\" (UniqueName: \"kubernetes.io/projected/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-kube-api-access-7kgpj\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.310641 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.310533 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.310855 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.310829 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.311037 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.310917 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.311196 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.311174 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.311406 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.311382 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.311757 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.311736 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.313154 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.313132 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.313271 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.313261 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.318750 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.318729 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.318961 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.318943 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kgpj\" (UniqueName: \"kubernetes.io/projected/d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324-kube-api-access-7kgpj\") pod \"router-gateway-1-openshift-default-6c59fbf55c-p5f4j\" (UID: \"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.418479 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.418387 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:47.559011 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.558979 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"]
Apr 16 14:11:47.561081 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:11:47.561053 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f3bbe7_bfa8_4d5e_8897_fc29dc36e324.slice/crio-3e3f0e247362d2ff5b205c9009cc584483e7a80643db346acb3f7ec8947d7200 WatchSource:0}: Error finding container 3e3f0e247362d2ff5b205c9009cc584483e7a80643db346acb3f7ec8947d7200: Status 404 returned error can't find the container with id 3e3f0e247362d2ff5b205c9009cc584483e7a80643db346acb3f7ec8947d7200
Apr 16 14:11:47.563049 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.563019 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 14:11:47.563149 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.563077 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 14:11:47.563149 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:47.563103 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 14:11:48.196872 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:48.196840 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" event={"ID":"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324","Type":"ContainerStarted","Data":"fd2be4a5a4d96e239edcb35eeb9c33a5368798073bf8fb3ab462871d788edf07"}
Apr 16 14:11:48.196872 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:48.196876 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" event={"ID":"d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324","Type":"ContainerStarted","Data":"3e3f0e247362d2ff5b205c9009cc584483e7a80643db346acb3f7ec8947d7200"}
Apr 16 14:11:48.219629 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:48.219580 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j" podStartSLOduration=1.219563944 podStartE2EDuration="1.219563944s" podCreationTimestamp="2026-04-16 14:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:11:48.218076026 +0000 UTC m=+745.037441400" watchObservedRunningTime="2026-04-16 14:11:48.219563944 +0000 UTC m=+745.038929299"
Apr 16 14:11:48.419179 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:48.419142 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:48.424425 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:48.424401 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:49.200333 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:49.200299 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:11:49.201639 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:11:49.201618 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-p5f4j"
Apr 16 14:12:07.995453 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:07.995418 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"]
Apr 16 14:12:07.998450 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:07.998420 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.004084 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.004062 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\""
Apr 16 14:12:08.005536 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.005512 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 16 14:12:08.014604 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.014579 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"]
Apr 16 14:12:08.107562 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.107527 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdbs8\" (UniqueName: \"kubernetes.io/projected/64760a0f-f989-4333-b107-deef31591918-kube-api-access-kdbs8\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.107562 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.107563 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-model-cache\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.107771 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.107589 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64760a0f-f989-4333-b107-deef31591918-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.107771 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.107719 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-dshm\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.107771 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.107753 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-home\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.107877 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.107772 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.209290 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.209259 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdbs8\" (UniqueName: \"kubernetes.io/projected/64760a0f-f989-4333-b107-deef31591918-kube-api-access-kdbs8\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.209290 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.209293 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-model-cache\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.209495 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.209321 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64760a0f-f989-4333-b107-deef31591918-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.209495 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.209363 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-dshm\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.209495 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.209383 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-home\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.209495 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.209399 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.209775 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.209742 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-model-cache\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.209909 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.209812 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.209909 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.209851 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-home\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.211524 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.211496 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-dshm\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.211844 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.211823 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64760a0f-f989-4333-b107-deef31591918-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.219406 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.219381 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdbs8\" (UniqueName: \"kubernetes.io/projected/64760a0f-f989-4333-b107-deef31591918-kube-api-access-kdbs8\") pod \"scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.309710 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.309620 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:08.441104 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.441069 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"]
Apr 16 14:12:08.444182 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:12:08.444150 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64760a0f_f989_4333_b107_deef31591918.slice/crio-2018df43d53455b7f7fd45b867bb88a7ff3213f55149937a53fac0b53cc18b80 WatchSource:0}: Error finding container 2018df43d53455b7f7fd45b867bb88a7ff3213f55149937a53fac0b53cc18b80: Status 404 returned error can't find the container with id 2018df43d53455b7f7fd45b867bb88a7ff3213f55149937a53fac0b53cc18b80
Apr 16 14:12:08.446240 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:08.446198 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:12:09.269156 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:09.269124 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg" event={"ID":"64760a0f-f989-4333-b107-deef31591918","Type":"ContainerStarted","Data":"2018df43d53455b7f7fd45b867bb88a7ff3213f55149937a53fac0b53cc18b80"}
Apr 16 14:12:13.311711 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:13.311672 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg" event={"ID":"64760a0f-f989-4333-b107-deef31591918","Type":"ContainerStarted","Data":"a4122b173d725036d234f3460a88265efe3cbfbfac5503e5589509948fb083ec"}
Apr 16 14:12:17.330457 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:17.330419 2564 generic.go:358] "Generic (PLEG): container finished" podID="64760a0f-f989-4333-b107-deef31591918" containerID="a4122b173d725036d234f3460a88265efe3cbfbfac5503e5589509948fb083ec" exitCode=0
Apr 16 14:12:17.330887 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:17.330494 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg" event={"ID":"64760a0f-f989-4333-b107-deef31591918","Type":"ContainerDied","Data":"a4122b173d725036d234f3460a88265efe3cbfbfac5503e5589509948fb083ec"}
Apr 16 14:12:19.339348 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:19.339312 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg" event={"ID":"64760a0f-f989-4333-b107-deef31591918","Type":"ContainerStarted","Data":"223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d"}
Apr 16 14:12:19.357809 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:19.357755 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg" podStartSLOduration=2.177891556 podStartE2EDuration="12.357738216s" podCreationTimestamp="2026-04-16 14:12:07 +0000 UTC" firstStartedPulling="2026-04-16 14:12:08.446348179 +0000 UTC m=+765.265713513" lastFinishedPulling="2026-04-16 14:12:18.626194839 +0000 UTC m=+775.445560173" observedRunningTime="2026-04-16 14:12:19.356614604 +0000 UTC m=+776.175979957" watchObservedRunningTime="2026-04-16 14:12:19.357738216 +0000 UTC m=+776.177103568"
Apr 16 14:12:28.310132 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:28.310095 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:28.310132 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:28.310139 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:28.322524 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:28.322493 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:12:28.378722 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:12:28.378691 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:13:11.351037 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.351005 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"]
Apr 16 14:13:11.355001 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.354976 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.357945 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.357919 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-ks2v2\""
Apr 16 14:13:11.358087 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.358022 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 16 14:13:11.366331 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.366311 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"]
Apr 16 14:13:11.373058 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.373033 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.373166 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.373066 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.373166 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.373086 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.373166 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.373107 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.373166 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.373157 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.373338 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.373245 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq6c9\" (UniqueName: \"kubernetes.io/projected/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-kube-api-access-xq6c9\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.474110 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.474073 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xq6c9\" (UniqueName: \"kubernetes.io/projected/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-kube-api-access-xq6c9\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.474344 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.474184 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.474344 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.474243 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.474344 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.474269 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.474344 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.474302 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.474595 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.474355 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.474751 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.474725 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.474751 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.474741 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.474868 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.474783 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.474868 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.474803 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.476735 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.476714 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.483074 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.483051 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq6c9\" (UniqueName: \"kubernetes.io/projected/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-kube-api-access-xq6c9\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.666950 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.666852 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:13:11.795162 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:11.795138 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"]
Apr 16 14:13:11.796884 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:13:11.796856 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bd44b2d_c9ed_4d61_bbda_2ac489e02d39.slice/crio-4f267a1e4ebbc2f705671b711bc1cd0174f2002c25dbd524ea31ff46a27dfb0b WatchSource:0}: Error finding container 4f267a1e4ebbc2f705671b711bc1cd0174f2002c25dbd524ea31ff46a27dfb0b: Status 404 returned error can't find the container with id 4f267a1e4ebbc2f705671b711bc1cd0174f2002c25dbd524ea31ff46a27dfb0b
Apr 16 14:13:12.517759 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:12.517728 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" event={"ID":"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39","Type":"ContainerStarted","Data":"b64da00f3bbe0db5841b9fa1302a9dcbfe3ce91427fbaf721d7ef4f81643aa28"}
Apr 16 14:13:12.517759 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:12.517763 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" event={"ID":"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39","Type":"ContainerStarted","Data":"4f267a1e4ebbc2f705671b711bc1cd0174f2002c25dbd524ea31ff46a27dfb0b"}
Apr 16 14:13:12.818603 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:12.818509 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"]
Apr 16 14:13:12.818903 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:12.818855 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg" podUID="64760a0f-f989-4333-b107-deef31591918" containerName="main" containerID="cri-o://223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d" gracePeriod=30
Apr 16 14:13:13.071907 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.071874 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"
Apr 16 14:13:13.088952 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.088919 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64760a0f-f989-4333-b107-deef31591918-tls-certs\") pod \"64760a0f-f989-4333-b107-deef31591918\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") "
Apr 16 14:13:13.089146 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.088960 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-dshm\") pod \"64760a0f-f989-4333-b107-deef31591918\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") "
Apr 16 14:13:13.089146 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.089049 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdbs8\" (UniqueName: \"kubernetes.io/projected/64760a0f-f989-4333-b107-deef31591918-kube-api-access-kdbs8\") pod \"64760a0f-f989-4333-b107-deef31591918\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") "
Apr 16 14:13:13.089146 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.089085 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-model-cache\") pod \"64760a0f-f989-4333-b107-deef31591918\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") "
Apr 16 14:13:13.089146 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.089128 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-kserve-provision-location\") pod \"64760a0f-f989-4333-b107-deef31591918\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") "
Apr 16 14:13:13.089406 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.089166 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-home\") pod \"64760a0f-f989-4333-b107-deef31591918\" (UID: \"64760a0f-f989-4333-b107-deef31591918\") "
Apr 16 14:13:13.089483 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.089456 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-model-cache" (OuterVolumeSpecName: "model-cache") pod "64760a0f-f989-4333-b107-deef31591918" (UID: "64760a0f-f989-4333-b107-deef31591918"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:13:13.089604 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.089577 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-home" (OuterVolumeSpecName: "home") pod "64760a0f-f989-4333-b107-deef31591918" (UID: "64760a0f-f989-4333-b107-deef31591918"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:13:13.091824 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.091788 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64760a0f-f989-4333-b107-deef31591918-kube-api-access-kdbs8" (OuterVolumeSpecName: "kube-api-access-kdbs8") pod "64760a0f-f989-4333-b107-deef31591918" (UID: "64760a0f-f989-4333-b107-deef31591918"). InnerVolumeSpecName "kube-api-access-kdbs8".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:13:13.091944 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.091819 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64760a0f-f989-4333-b107-deef31591918-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "64760a0f-f989-4333-b107-deef31591918" (UID: "64760a0f-f989-4333-b107-deef31591918"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:13:13.092050 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.091862 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-dshm" (OuterVolumeSpecName: "dshm") pod "64760a0f-f989-4333-b107-deef31591918" (UID: "64760a0f-f989-4333-b107-deef31591918"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:13:13.144943 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.144897 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "64760a0f-f989-4333-b107-deef31591918" (UID: "64760a0f-f989-4333-b107-deef31591918"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:13:13.190096 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.190056 2564 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-home\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:13:13.190096 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.190090 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64760a0f-f989-4333-b107-deef31591918-tls-certs\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:13:13.190096 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.190101 2564 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-dshm\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:13:13.190366 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.190112 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kdbs8\" (UniqueName: \"kubernetes.io/projected/64760a0f-f989-4333-b107-deef31591918-kube-api-access-kdbs8\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:13:13.190366 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.190121 2564 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-model-cache\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:13:13.190366 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.190129 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64760a0f-f989-4333-b107-deef31591918-kserve-provision-location\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:13:13.522128 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.522085 2564 
generic.go:358] "Generic (PLEG): container finished" podID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerID="b64da00f3bbe0db5841b9fa1302a9dcbfe3ce91427fbaf721d7ef4f81643aa28" exitCode=0 Apr 16 14:13:13.522584 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.522171 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" event={"ID":"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39","Type":"ContainerDied","Data":"b64da00f3bbe0db5841b9fa1302a9dcbfe3ce91427fbaf721d7ef4f81643aa28"} Apr 16 14:13:13.523550 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.523527 2564 generic.go:358] "Generic (PLEG): container finished" podID="64760a0f-f989-4333-b107-deef31591918" containerID="223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d" exitCode=0 Apr 16 14:13:13.523658 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.523584 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg" Apr 16 14:13:13.523658 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.523635 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg" event={"ID":"64760a0f-f989-4333-b107-deef31591918","Type":"ContainerDied","Data":"223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d"} Apr 16 14:13:13.523783 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.523665 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg" event={"ID":"64760a0f-f989-4333-b107-deef31591918","Type":"ContainerDied","Data":"2018df43d53455b7f7fd45b867bb88a7ff3213f55149937a53fac0b53cc18b80"} Apr 16 14:13:13.523783 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.523682 2564 scope.go:117] "RemoveContainer" 
containerID="223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d" Apr 16 14:13:13.532649 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.532633 2564 scope.go:117] "RemoveContainer" containerID="a4122b173d725036d234f3460a88265efe3cbfbfac5503e5589509948fb083ec" Apr 16 14:13:13.550010 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.549993 2564 scope.go:117] "RemoveContainer" containerID="223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d" Apr 16 14:13:13.550290 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:13:13.550273 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d\": container with ID starting with 223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d not found: ID does not exist" containerID="223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d" Apr 16 14:13:13.550350 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.550298 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d"} err="failed to get container status \"223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d\": rpc error: code = NotFound desc = could not find container \"223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d\": container with ID starting with 223ed5116856a055a23e11b5e4a1da0f86c7193f2867411ee7b057853e144c6d not found: ID does not exist" Apr 16 14:13:13.550350 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.550316 2564 scope.go:117] "RemoveContainer" containerID="a4122b173d725036d234f3460a88265efe3cbfbfac5503e5589509948fb083ec" Apr 16 14:13:13.550577 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:13:13.550559 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a4122b173d725036d234f3460a88265efe3cbfbfac5503e5589509948fb083ec\": container with ID starting with a4122b173d725036d234f3460a88265efe3cbfbfac5503e5589509948fb083ec not found: ID does not exist" containerID="a4122b173d725036d234f3460a88265efe3cbfbfac5503e5589509948fb083ec" Apr 16 14:13:13.550621 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.550584 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4122b173d725036d234f3460a88265efe3cbfbfac5503e5589509948fb083ec"} err="failed to get container status \"a4122b173d725036d234f3460a88265efe3cbfbfac5503e5589509948fb083ec\": rpc error: code = NotFound desc = could not find container \"a4122b173d725036d234f3460a88265efe3cbfbfac5503e5589509948fb083ec\": container with ID starting with a4122b173d725036d234f3460a88265efe3cbfbfac5503e5589509948fb083ec not found: ID does not exist" Apr 16 14:13:13.559670 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.559644 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"] Apr 16 14:13:13.564846 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.564820 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6c6f5c6d89-l22fg"] Apr 16 14:13:13.774859 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:13.774759 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64760a0f-f989-4333-b107-deef31591918" path="/var/lib/kubelet/pods/64760a0f-f989-4333-b107-deef31591918/volumes" Apr 16 14:13:15.534155 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:15.534116 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" event={"ID":"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39","Type":"ContainerStarted","Data":"4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3"} Apr 16 14:13:44.652581 
ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:44.652545 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" event={"ID":"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39","Type":"ContainerStarted","Data":"26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f"} Apr 16 14:13:44.653042 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:44.652733 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" Apr 16 14:13:44.655329 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:44.655308 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" Apr 16 14:13:44.673419 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:44.673359 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" podStartSLOduration=3.552044445 podStartE2EDuration="33.673342123s" podCreationTimestamp="2026-04-16 14:13:11 +0000 UTC" firstStartedPulling="2026-04-16 14:13:13.523354525 +0000 UTC m=+830.342719858" lastFinishedPulling="2026-04-16 14:13:43.644652194 +0000 UTC m=+860.464017536" observedRunningTime="2026-04-16 14:13:44.671594512 +0000 UTC m=+861.490959864" watchObservedRunningTime="2026-04-16 14:13:44.673342123 +0000 UTC m=+861.492707481" Apr 16 14:13:51.667989 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:51.667951 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" Apr 16 14:13:51.668411 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:51.668100 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" Apr 16 14:13:51.669719 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:51.669694 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" Apr 16 14:13:51.679336 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:13:51.679313 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" Apr 16 14:14:11.734266 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.734174 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv"] Apr 16 14:14:11.734764 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.734744 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64760a0f-f989-4333-b107-deef31591918" containerName="storage-initializer" Apr 16 14:14:11.734858 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.734766 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="64760a0f-f989-4333-b107-deef31591918" containerName="storage-initializer" Apr 16 14:14:11.734858 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.734789 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64760a0f-f989-4333-b107-deef31591918" containerName="main" Apr 16 14:14:11.734858 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.734799 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="64760a0f-f989-4333-b107-deef31591918" containerName="main" Apr 16 14:14:11.735001 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.734931 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="64760a0f-f989-4333-b107-deef31591918" containerName="main" Apr 16 14:14:11.738110 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.738091 2564 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.740905 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.740887 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 14:14:11.748198 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.748175 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv"] Apr 16 14:14:11.842218 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.842170 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-home\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.842334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.842245 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/048227d5-ee20-47eb-bfed-417e81da831c-tls-certs\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.842334 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.842307 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nthq9\" (UniqueName: \"kubernetes.io/projected/048227d5-ee20-47eb-bfed-417e81da831c-kube-api-access-nthq9\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 
14:14:11.842409 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.842393 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-model-cache\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.842449 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.842426 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.842482 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.842455 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-dshm\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.943518 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.943476 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.943714 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.943533 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-dshm\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.943714 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.943595 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-home\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.943714 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.943624 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/048227d5-ee20-47eb-bfed-417e81da831c-tls-certs\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.943714 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.943674 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nthq9\" (UniqueName: \"kubernetes.io/projected/048227d5-ee20-47eb-bfed-417e81da831c-kube-api-access-nthq9\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.943948 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.943754 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-model-cache\") pod 
\"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.943948 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.943899 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.944059 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.943987 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-home\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.944118 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.944075 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-model-cache\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.945970 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.945950 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-dshm\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.946114 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:14:11.946087 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/048227d5-ee20-47eb-bfed-417e81da831c-tls-certs\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:11.951482 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:11.951462 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nthq9\" (UniqueName: \"kubernetes.io/projected/048227d5-ee20-47eb-bfed-417e81da831c-kube-api-access-nthq9\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-rd4hv\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:12.073106 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:12.073069 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" Apr 16 14:14:12.196011 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:12.195888 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv"] Apr 16 14:14:12.198568 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:14:12.198539 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod048227d5_ee20_47eb_bfed_417e81da831c.slice/crio-356a3c4e0dd541cd96f1dce2304293b7654372278f87a3cc119dad8c3473f9ee WatchSource:0}: Error finding container 356a3c4e0dd541cd96f1dce2304293b7654372278f87a3cc119dad8c3473f9ee: Status 404 returned error can't find the container with id 356a3c4e0dd541cd96f1dce2304293b7654372278f87a3cc119dad8c3473f9ee Apr 16 14:14:12.748716 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:12.748681 2564 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" event={"ID":"048227d5-ee20-47eb-bfed-417e81da831c","Type":"ContainerStarted","Data":"31d7ff11a9b66d8f02c223bf03109cd0b7880608639c9228ae181d8c0e1a2969"} Apr 16 14:14:12.748716 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:12.748719 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" event={"ID":"048227d5-ee20-47eb-bfed-417e81da831c","Type":"ContainerStarted","Data":"356a3c4e0dd541cd96f1dce2304293b7654372278f87a3cc119dad8c3473f9ee"} Apr 16 14:14:16.764602 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:16.764515 2564 generic.go:358] "Generic (PLEG): container finished" podID="048227d5-ee20-47eb-bfed-417e81da831c" containerID="31d7ff11a9b66d8f02c223bf03109cd0b7880608639c9228ae181d8c0e1a2969" exitCode=0 Apr 16 14:14:16.764602 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:16.764574 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" event={"ID":"048227d5-ee20-47eb-bfed-417e81da831c","Type":"ContainerDied","Data":"31d7ff11a9b66d8f02c223bf03109cd0b7880608639c9228ae181d8c0e1a2969"} Apr 16 14:14:17.773782 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:17.773749 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" event={"ID":"048227d5-ee20-47eb-bfed-417e81da831c","Type":"ContainerStarted","Data":"b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f"} Apr 16 14:14:17.792832 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:17.792787 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" podStartSLOduration=6.792772102 podStartE2EDuration="6.792772102s" podCreationTimestamp="2026-04-16 14:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:14:17.789949466 +0000 UTC m=+894.609314817" watchObservedRunningTime="2026-04-16 14:14:17.792772102 +0000 UTC m=+894.612137454"
Apr 16 14:14:22.074069 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:22.074033 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv"
Apr 16 14:14:22.074069 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:22.074074 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv"
Apr 16 14:14:22.086494 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:22.086464 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv"
Apr 16 14:14:22.798467 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:22.798443 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv"
Apr 16 14:14:44.840058 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:44.840023 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv"]
Apr 16 14:14:44.840993 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:44.840938 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" podUID="048227d5-ee20-47eb-bfed-417e81da831c" containerName="main" containerID="cri-o://b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f" gracePeriod=30
Apr 16 14:14:45.096153 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.096094 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv"
Apr 16 14:14:45.157716 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.157687 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-dshm\") pod \"048227d5-ee20-47eb-bfed-417e81da831c\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") "
Apr 16 14:14:45.157875 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.157738 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-kserve-provision-location\") pod \"048227d5-ee20-47eb-bfed-417e81da831c\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") "
Apr 16 14:14:45.157875 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.157764 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-model-cache\") pod \"048227d5-ee20-47eb-bfed-417e81da831c\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") "
Apr 16 14:14:45.157875 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.157808 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nthq9\" (UniqueName: \"kubernetes.io/projected/048227d5-ee20-47eb-bfed-417e81da831c-kube-api-access-nthq9\") pod \"048227d5-ee20-47eb-bfed-417e81da831c\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") "
Apr 16 14:14:45.157875 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.157846 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/048227d5-ee20-47eb-bfed-417e81da831c-tls-certs\") pod \"048227d5-ee20-47eb-bfed-417e81da831c\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") "
Apr 16 14:14:45.158098 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.157877 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-home\") pod \"048227d5-ee20-47eb-bfed-417e81da831c\" (UID: \"048227d5-ee20-47eb-bfed-417e81da831c\") "
Apr 16 14:14:45.158098 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.158075 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-model-cache" (OuterVolumeSpecName: "model-cache") pod "048227d5-ee20-47eb-bfed-417e81da831c" (UID: "048227d5-ee20-47eb-bfed-417e81da831c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:14:45.158226 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.158167 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-home" (OuterVolumeSpecName: "home") pod "048227d5-ee20-47eb-bfed-417e81da831c" (UID: "048227d5-ee20-47eb-bfed-417e81da831c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:14:45.158372 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.158345 2564 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-model-cache\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:14:45.158372 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.158372 2564 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-home\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:14:45.159848 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.159826 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-dshm" (OuterVolumeSpecName: "dshm") pod "048227d5-ee20-47eb-bfed-417e81da831c" (UID: "048227d5-ee20-47eb-bfed-417e81da831c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:14:45.159929 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.159888 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/048227d5-ee20-47eb-bfed-417e81da831c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "048227d5-ee20-47eb-bfed-417e81da831c" (UID: "048227d5-ee20-47eb-bfed-417e81da831c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:14:45.160079 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.160057 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/048227d5-ee20-47eb-bfed-417e81da831c-kube-api-access-nthq9" (OuterVolumeSpecName: "kube-api-access-nthq9") pod "048227d5-ee20-47eb-bfed-417e81da831c" (UID: "048227d5-ee20-47eb-bfed-417e81da831c"). InnerVolumeSpecName "kube-api-access-nthq9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:14:45.212646 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.212604 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "048227d5-ee20-47eb-bfed-417e81da831c" (UID: "048227d5-ee20-47eb-bfed-417e81da831c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:14:45.259629 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.259596 2564 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-dshm\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:14:45.259629 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.259622 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/048227d5-ee20-47eb-bfed-417e81da831c-kserve-provision-location\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:14:45.259629 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.259632 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nthq9\" (UniqueName: \"kubernetes.io/projected/048227d5-ee20-47eb-bfed-417e81da831c-kube-api-access-nthq9\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:14:45.259629 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.259641 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/048227d5-ee20-47eb-bfed-417e81da831c-tls-certs\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:14:45.884391 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.884354 2564 generic.go:358] "Generic (PLEG): container finished" podID="048227d5-ee20-47eb-bfed-417e81da831c" containerID="b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f" exitCode=0
Apr 16 14:14:45.884759 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.884440 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" event={"ID":"048227d5-ee20-47eb-bfed-417e81da831c","Type":"ContainerDied","Data":"b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f"}
Apr 16 14:14:45.884759 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.884481 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv" event={"ID":"048227d5-ee20-47eb-bfed-417e81da831c","Type":"ContainerDied","Data":"356a3c4e0dd541cd96f1dce2304293b7654372278f87a3cc119dad8c3473f9ee"}
Apr 16 14:14:45.884759 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.884450 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv"
Apr 16 14:14:45.884759 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.884496 2564 scope.go:117] "RemoveContainer" containerID="b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f"
Apr 16 14:14:45.893063 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.893047 2564 scope.go:117] "RemoveContainer" containerID="31d7ff11a9b66d8f02c223bf03109cd0b7880608639c9228ae181d8c0e1a2969"
Apr 16 14:14:45.904168 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.904135 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv"]
Apr 16 14:14:45.907015 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.906991 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-rd4hv"]
Apr 16 14:14:45.955477 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.955458 2564 scope.go:117] "RemoveContainer" containerID="b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f"
Apr 16 14:14:45.955825 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:14:45.955802 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f\": container with ID starting with b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f not found: ID does not exist" containerID="b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f"
Apr 16 14:14:45.955876 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.955835 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f"} err="failed to get container status \"b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f\": rpc error: code = NotFound desc = could not find container \"b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f\": container with ID starting with b7feed98344299d46682b42ee1f21f62b239ce9d70d01bfec60663d4577d547f not found: ID does not exist"
Apr 16 14:14:45.955876 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.955856 2564 scope.go:117] "RemoveContainer" containerID="31d7ff11a9b66d8f02c223bf03109cd0b7880608639c9228ae181d8c0e1a2969"
Apr 16 14:14:45.956118 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:14:45.956099 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31d7ff11a9b66d8f02c223bf03109cd0b7880608639c9228ae181d8c0e1a2969\": container with ID starting with 31d7ff11a9b66d8f02c223bf03109cd0b7880608639c9228ae181d8c0e1a2969 not found: ID does not exist" containerID="31d7ff11a9b66d8f02c223bf03109cd0b7880608639c9228ae181d8c0e1a2969"
Apr 16 14:14:45.956180 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:45.956129 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31d7ff11a9b66d8f02c223bf03109cd0b7880608639c9228ae181d8c0e1a2969"} err="failed to get container status \"31d7ff11a9b66d8f02c223bf03109cd0b7880608639c9228ae181d8c0e1a2969\": rpc error: code = NotFound desc = could not find container \"31d7ff11a9b66d8f02c223bf03109cd0b7880608639c9228ae181d8c0e1a2969\": container with ID starting with 31d7ff11a9b66d8f02c223bf03109cd0b7880608639c9228ae181d8c0e1a2969 not found: ID does not exist"
Apr 16 14:14:47.774095 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:14:47.774062 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="048227d5-ee20-47eb-bfed-417e81da831c" path="/var/lib/kubelet/pods/048227d5-ee20-47eb-bfed-417e81da831c/volumes"
Apr 16 14:15:58.606407 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:58.606324 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"]
Apr 16 14:15:58.607161 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:58.606653 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" podUID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerName="main" containerID="cri-o://4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3" gracePeriod=30
Apr 16 14:15:58.607161 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:58.606696 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" podUID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerName="tokenizer" containerID="cri-o://26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f" gracePeriod=30
Apr 16 14:15:59.140625 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.140537 2564 generic.go:358] "Generic (PLEG): container finished" podID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerID="4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3" exitCode=0
Apr 16 14:15:59.140795 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.140584 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" event={"ID":"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39","Type":"ContainerDied","Data":"4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3"}
Apr 16 14:15:59.851070 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.851048 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:15:59.875488 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.875456 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-tmp\") pod \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") "
Apr 16 14:15:59.875656 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.875522 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-uds\") pod \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") "
Apr 16 14:15:59.875656 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.875563 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq6c9\" (UniqueName: \"kubernetes.io/projected/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-kube-api-access-xq6c9\") pod \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") "
Apr 16 14:15:59.875656 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.875608 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-kserve-provision-location\") pod \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") "
Apr 16 14:15:59.875656 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.875651 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tls-certs\") pod \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") "
Apr 16 14:15:59.875870 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.875681 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-cache\") pod \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\" (UID: \"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39\") "
Apr 16 14:15:59.875870 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.875818 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" (UID: "6bd44b2d-c9ed-4d61-bbda-2ac489e02d39"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:15:59.876047 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.876026 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-tmp\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:15:59.876141 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.876122 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" (UID: "6bd44b2d-c9ed-4d61-bbda-2ac489e02d39"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:15:59.876218 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.876135 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" (UID: "6bd44b2d-c9ed-4d61-bbda-2ac489e02d39"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:15:59.876722 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.876696 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" (UID: "6bd44b2d-c9ed-4d61-bbda-2ac489e02d39"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:15:59.877844 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.877823 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" (UID: "6bd44b2d-c9ed-4d61-bbda-2ac489e02d39"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:15:59.878307 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.878286 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-kube-api-access-xq6c9" (OuterVolumeSpecName: "kube-api-access-xq6c9") pod "6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" (UID: "6bd44b2d-c9ed-4d61-bbda-2ac489e02d39"). InnerVolumeSpecName "kube-api-access-xq6c9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:15:59.977545 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.977462 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-uds\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:15:59.977545 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.977491 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xq6c9\" (UniqueName: \"kubernetes.io/projected/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-kube-api-access-xq6c9\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:15:59.977545 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.977501 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-kserve-provision-location\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:15:59.977545 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.977512 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tls-certs\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:15:59.977545 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:15:59.977521 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39-tokenizer-cache\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:16:00.145504 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.145463 2564 generic.go:358] "Generic (PLEG): container finished" podID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerID="26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f" exitCode=0
Apr 16 14:16:00.145694 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.145534 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"
Apr 16 14:16:00.145694 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.145551 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" event={"ID":"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39","Type":"ContainerDied","Data":"26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f"}
Apr 16 14:16:00.145694 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.145596 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc" event={"ID":"6bd44b2d-c9ed-4d61-bbda-2ac489e02d39","Type":"ContainerDied","Data":"4f267a1e4ebbc2f705671b711bc1cd0174f2002c25dbd524ea31ff46a27dfb0b"}
Apr 16 14:16:00.145694 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.145616 2564 scope.go:117] "RemoveContainer" containerID="26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f"
Apr 16 14:16:00.154961 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.154945 2564 scope.go:117] "RemoveContainer" containerID="4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3"
Apr 16 14:16:00.164671 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.164653 2564 scope.go:117] "RemoveContainer" containerID="b64da00f3bbe0db5841b9fa1302a9dcbfe3ce91427fbaf721d7ef4f81643aa28"
Apr 16 14:16:00.169721 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.169698 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"]
Apr 16 14:16:00.172820 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.172799 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche5bvhc"]
Apr 16 14:16:00.173062 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.173045 2564 scope.go:117] "RemoveContainer" containerID="26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f"
Apr 16 14:16:00.173343 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:16:00.173325 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f\": container with ID starting with 26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f not found: ID does not exist" containerID="26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f"
Apr 16 14:16:00.173409 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.173350 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f"} err="failed to get container status \"26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f\": rpc error: code = NotFound desc = could not find container \"26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f\": container with ID starting with 26640034850d50ac0c20fab6c1f6063de669d7c053c0cef45e8a50ff8ba5fe2f not found: ID does not exist"
Apr 16 14:16:00.173409 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.173368 2564 scope.go:117] "RemoveContainer" containerID="4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3"
Apr 16 14:16:00.179104 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:16:00.174038 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3\": container with ID starting with 4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3 not found: ID does not exist" containerID="4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3"
Apr 16 14:16:00.179104 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.174144 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3"} err="failed to get container status \"4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3\": rpc error: code = NotFound desc = could not find container \"4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3\": container with ID starting with 4677be72cbff748ec93c7a9c77c85deaec90949c14e6a1ff01060d8001cd60b3 not found: ID does not exist"
Apr 16 14:16:00.179104 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.174197 2564 scope.go:117] "RemoveContainer" containerID="b64da00f3bbe0db5841b9fa1302a9dcbfe3ce91427fbaf721d7ef4f81643aa28"
Apr 16 14:16:00.179104 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:16:00.175539 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b64da00f3bbe0db5841b9fa1302a9dcbfe3ce91427fbaf721d7ef4f81643aa28\": container with ID starting with b64da00f3bbe0db5841b9fa1302a9dcbfe3ce91427fbaf721d7ef4f81643aa28 not found: ID does not exist" containerID="b64da00f3bbe0db5841b9fa1302a9dcbfe3ce91427fbaf721d7ef4f81643aa28"
Apr 16 14:16:00.179104 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:00.175583 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b64da00f3bbe0db5841b9fa1302a9dcbfe3ce91427fbaf721d7ef4f81643aa28"} err="failed to get container status \"b64da00f3bbe0db5841b9fa1302a9dcbfe3ce91427fbaf721d7ef4f81643aa28\": rpc error: code = NotFound desc = could not find container \"b64da00f3bbe0db5841b9fa1302a9dcbfe3ce91427fbaf721d7ef4f81643aa28\": container with ID starting with b64da00f3bbe0db5841b9fa1302a9dcbfe3ce91427fbaf721d7ef4f81643aa28 not found: ID does not exist"
Apr 16 14:16:01.774659 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:01.774623 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" path="/var/lib/kubelet/pods/6bd44b2d-c9ed-4d61-bbda-2ac489e02d39/volumes"
Apr 16 14:16:10.518974 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.518940 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"]
Apr 16 14:16:10.519367 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519350 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerName="main"
Apr 16 14:16:10.519367 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519363 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerName="main"
Apr 16 14:16:10.519437 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519380 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="048227d5-ee20-47eb-bfed-417e81da831c" containerName="main"
Apr 16 14:16:10.519437 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519386 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="048227d5-ee20-47eb-bfed-417e81da831c" containerName="main"
Apr 16 14:16:10.519437 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519397 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="048227d5-ee20-47eb-bfed-417e81da831c" containerName="storage-initializer"
Apr 16 14:16:10.519437 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519403 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="048227d5-ee20-47eb-bfed-417e81da831c" containerName="storage-initializer"
Apr 16 14:16:10.519437 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519411 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerName="storage-initializer"
Apr 16 14:16:10.519437 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519416 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerName="storage-initializer"
Apr 16 14:16:10.519437 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519425 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerName="tokenizer"
Apr 16 14:16:10.519437 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519430 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerName="tokenizer"
Apr 16 14:16:10.519667 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519487 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerName="main"
Apr 16 14:16:10.519667 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519496 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="048227d5-ee20-47eb-bfed-417e81da831c" containerName="main"
Apr 16 14:16:10.519667 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.519505 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bd44b2d-c9ed-4d61-bbda-2ac489e02d39" containerName="tokenizer"
Apr 16 14:16:10.524440 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.524421 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"
Apr 16 14:16:10.527110 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.527074 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-fwxlv\""
Apr 16 14:16:10.528110 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.528089 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\""
Apr 16 14:16:10.528241 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.528108 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 16 14:16:10.535364 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.535338 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"]
Apr 16 14:16:10.570724 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.570690 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"
Apr 16 14:16:10.570890 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.570802 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-777m9\" (UniqueName: \"kubernetes.io/projected/89cfb8a3-3641-49de-a917-8c133c6b82e2-kube-api-access-777m9\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"
Apr 16 14:16:10.570890 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.570861 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"
Apr 16 14:16:10.570982 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.570894 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89cfb8a3-3641-49de-a917-8c133c6b82e2-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"
Apr 16 14:16:10.570982 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.570925 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"
Apr 16 14:16:10.570982 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.570953 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"
Apr 16 14:16:10.672117 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.672081 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"
Apr 16 14:16:10.672316 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.672140 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-777m9\" (UniqueName: \"kubernetes.io/projected/89cfb8a3-3641-49de-a917-8c133c6b82e2-kube-api-access-777m9\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"
Apr 16 14:16:10.672316 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.672177 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"
Apr 16 14:16:10.672316 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.672228 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89cfb8a3-3641-49de-a917-8c133c6b82e2-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:10.672456 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.672417 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:10.672525 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.672472 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:10.672525 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.672487 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:10.672525 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.672512 2564 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:10.672719 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.672699 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:10.672793 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.672772 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:10.674613 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.674597 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89cfb8a3-3641-49de-a917-8c133c6b82e2-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:10.679499 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.679478 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-777m9\" (UniqueName: 
\"kubernetes.io/projected/89cfb8a3-3641-49de-a917-8c133c6b82e2-kube-api-access-777m9\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:10.836464 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.836425 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:10.973436 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:10.973414 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"] Apr 16 14:16:10.975646 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:16:10.975612 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89cfb8a3_3641_49de_a917_8c133c6b82e2.slice/crio-b8b66bfb31a745b1af5821c67a578d856945a3a750c7013bd338da0d8d83d9c0 WatchSource:0}: Error finding container b8b66bfb31a745b1af5821c67a578d856945a3a750c7013bd338da0d8d83d9c0: Status 404 returned error can't find the container with id b8b66bfb31a745b1af5821c67a578d856945a3a750c7013bd338da0d8d83d9c0 Apr 16 14:16:11.187468 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:11.187382 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" event={"ID":"89cfb8a3-3641-49de-a917-8c133c6b82e2","Type":"ContainerStarted","Data":"02d95c942d39a2a8dfb85b4668cc446018e14c508f6eacbb386f1a67b9a60317"} Apr 16 14:16:11.187468 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:11.187422 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" 
event={"ID":"89cfb8a3-3641-49de-a917-8c133c6b82e2","Type":"ContainerStarted","Data":"b8b66bfb31a745b1af5821c67a578d856945a3a750c7013bd338da0d8d83d9c0"} Apr 16 14:16:12.192497 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:12.192460 2564 generic.go:358] "Generic (PLEG): container finished" podID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerID="02d95c942d39a2a8dfb85b4668cc446018e14c508f6eacbb386f1a67b9a60317" exitCode=0 Apr 16 14:16:12.192969 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:12.192537 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" event={"ID":"89cfb8a3-3641-49de-a917-8c133c6b82e2","Type":"ContainerDied","Data":"02d95c942d39a2a8dfb85b4668cc446018e14c508f6eacbb386f1a67b9a60317"} Apr 16 14:16:13.197873 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:13.197834 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" event={"ID":"89cfb8a3-3641-49de-a917-8c133c6b82e2","Type":"ContainerStarted","Data":"3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a"} Apr 16 14:16:13.197873 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:13.197878 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" event={"ID":"89cfb8a3-3641-49de-a917-8c133c6b82e2","Type":"ContainerStarted","Data":"7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226"} Apr 16 14:16:13.198306 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:13.197970 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:13.220607 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:13.220547 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" podStartSLOduration=3.220527592 podStartE2EDuration="3.220527592s" podCreationTimestamp="2026-04-16 14:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:16:13.217486492 +0000 UTC m=+1010.036851845" watchObservedRunningTime="2026-04-16 14:16:13.220527592 +0000 UTC m=+1010.039892948" Apr 16 14:16:20.837037 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:20.836999 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:20.837509 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:20.837165 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:20.839886 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:20.839864 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:21.228639 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:21.228562 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:16:43.235831 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:16:43.235800 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:18:10.622215 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:10.622162 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"] Apr 16 14:18:10.622763 ip-10-0-140-244 kubenswrapper[2564]: 
I0416 14:18:10.622600 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" podUID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerName="main" containerID="cri-o://7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226" gracePeriod=30 Apr 16 14:18:10.622763 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:10.622642 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" podUID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerName="tokenizer" containerID="cri-o://3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a" gracePeriod=30 Apr 16 14:18:11.227664 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.227624 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" podUID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.38:8082/healthz\": dial tcp 10.133.0.38:8082: connect: connection refused" Apr 16 14:18:11.628189 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.628152 2564 generic.go:358] "Generic (PLEG): container finished" podID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerID="7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226" exitCode=0 Apr 16 14:18:11.628672 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.628231 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" event={"ID":"89cfb8a3-3641-49de-a917-8c133c6b82e2","Type":"ContainerDied","Data":"7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226"} Apr 16 14:18:11.877000 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.876978 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" Apr 16 14:18:11.907251 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.907164 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-kserve-provision-location\") pod \"89cfb8a3-3641-49de-a917-8c133c6b82e2\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " Apr 16 14:18:11.907251 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.907238 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-uds\") pod \"89cfb8a3-3641-49de-a917-8c133c6b82e2\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " Apr 16 14:18:11.907433 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.907330 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89cfb8a3-3641-49de-a917-8c133c6b82e2-tls-certs\") pod \"89cfb8a3-3641-49de-a917-8c133c6b82e2\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " Apr 16 14:18:11.907433 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.907409 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-777m9\" (UniqueName: \"kubernetes.io/projected/89cfb8a3-3641-49de-a917-8c133c6b82e2-kube-api-access-777m9\") pod \"89cfb8a3-3641-49de-a917-8c133c6b82e2\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " Apr 16 14:18:11.907516 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.907433 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-tmp\") pod \"89cfb8a3-3641-49de-a917-8c133c6b82e2\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") " 
Apr 16 14:18:11.907516 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.907476 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-cache\") pod \"89cfb8a3-3641-49de-a917-8c133c6b82e2\" (UID: \"89cfb8a3-3641-49de-a917-8c133c6b82e2\") "
Apr 16 14:18:11.907516 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.907472 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "89cfb8a3-3641-49de-a917-8c133c6b82e2" (UID: "89cfb8a3-3641-49de-a917-8c133c6b82e2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:18:11.907859 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.907765 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "89cfb8a3-3641-49de-a917-8c133c6b82e2" (UID: "89cfb8a3-3641-49de-a917-8c133c6b82e2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:18:11.907859 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.907781 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-uds\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:18:11.907859 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.907822 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "89cfb8a3-3641-49de-a917-8c133c6b82e2" (UID: "89cfb8a3-3641-49de-a917-8c133c6b82e2"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:18:11.908147 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.908124 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "89cfb8a3-3641-49de-a917-8c133c6b82e2" (UID: "89cfb8a3-3641-49de-a917-8c133c6b82e2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:18:11.909465 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.909441 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89cfb8a3-3641-49de-a917-8c133c6b82e2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "89cfb8a3-3641-49de-a917-8c133c6b82e2" (UID: "89cfb8a3-3641-49de-a917-8c133c6b82e2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:18:11.909546 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:11.909491 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89cfb8a3-3641-49de-a917-8c133c6b82e2-kube-api-access-777m9" (OuterVolumeSpecName: "kube-api-access-777m9") pod "89cfb8a3-3641-49de-a917-8c133c6b82e2" (UID: "89cfb8a3-3641-49de-a917-8c133c6b82e2"). InnerVolumeSpecName "kube-api-access-777m9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:18:12.009157 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.009124 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-777m9\" (UniqueName: \"kubernetes.io/projected/89cfb8a3-3641-49de-a917-8c133c6b82e2-kube-api-access-777m9\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:18:12.009157 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.009153 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-tmp\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:18:12.009157 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.009163 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-tokenizer-cache\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:18:12.009405 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.009173 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89cfb8a3-3641-49de-a917-8c133c6b82e2-kserve-provision-location\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:18:12.009405 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.009184 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89cfb8a3-3641-49de-a917-8c133c6b82e2-tls-certs\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\""
Apr 16 14:18:12.633797 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.633764 2564 generic.go:358] "Generic (PLEG): container finished" podID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerID="3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a" exitCode=0
Apr 16 14:18:12.634225 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.633843 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"
Apr 16 14:18:12.634225 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.633849 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" event={"ID":"89cfb8a3-3641-49de-a917-8c133c6b82e2","Type":"ContainerDied","Data":"3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a"}
Apr 16 14:18:12.634225 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.633891 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm" event={"ID":"89cfb8a3-3641-49de-a917-8c133c6b82e2","Type":"ContainerDied","Data":"b8b66bfb31a745b1af5821c67a578d856945a3a750c7013bd338da0d8d83d9c0"}
Apr 16 14:18:12.634225 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.633912 2564 scope.go:117] "RemoveContainer" containerID="3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a"
Apr 16 14:18:12.642889 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.642866 2564 scope.go:117] "RemoveContainer" containerID="7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226"
Apr 16 14:18:12.657270 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.657246 2564 scope.go:117] "RemoveContainer" containerID="02d95c942d39a2a8dfb85b4668cc446018e14c508f6eacbb386f1a67b9a60317"
Apr 16 14:18:12.657592 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.657568 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"]
Apr 16 14:18:12.661491 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.661469 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fb6bc9fkkfkm"]
Apr 16 14:18:12.665071 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.665051 2564 scope.go:117] "RemoveContainer" containerID="3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a"
Apr 16 14:18:12.665350 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:18:12.665332 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a\": container with ID starting with 3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a not found: ID does not exist" containerID="3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a"
Apr 16 14:18:12.665406 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.665359 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a"} err="failed to get container status \"3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a\": rpc error: code = NotFound desc = could not find container \"3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a\": container with ID starting with 3ff864b3f1f5ae402a5931350104e5d50be906ce472b1f985b93e4075a1ccf6a not found: ID does not exist"
Apr 16 14:18:12.665406 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.665379 2564 scope.go:117] "RemoveContainer" containerID="7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226"
Apr 16 14:18:12.665592 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:18:12.665576 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226\": container with ID starting with 7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226 not found: ID does not exist" containerID="7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226"
Apr 16 14:18:12.665642 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.665596 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226"} err="failed to get container status \"7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226\": rpc error: code = NotFound desc = could not find container \"7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226\": container with ID starting with 7ae1f189a867481a118f2e24140dc73bbeb579bae5f69b314f131d5abed84226 not found: ID does not exist"
Apr 16 14:18:12.665642 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.665626 2564 scope.go:117] "RemoveContainer" containerID="02d95c942d39a2a8dfb85b4668cc446018e14c508f6eacbb386f1a67b9a60317"
Apr 16 14:18:12.665828 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:18:12.665810 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02d95c942d39a2a8dfb85b4668cc446018e14c508f6eacbb386f1a67b9a60317\": container with ID starting with 02d95c942d39a2a8dfb85b4668cc446018e14c508f6eacbb386f1a67b9a60317 not found: ID does not exist" containerID="02d95c942d39a2a8dfb85b4668cc446018e14c508f6eacbb386f1a67b9a60317"
Apr 16 14:18:12.665873 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:12.665833 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d95c942d39a2a8dfb85b4668cc446018e14c508f6eacbb386f1a67b9a60317"} err="failed to get container status \"02d95c942d39a2a8dfb85b4668cc446018e14c508f6eacbb386f1a67b9a60317\": rpc error: code = NotFound desc = could not find container \"02d95c942d39a2a8dfb85b4668cc446018e14c508f6eacbb386f1a67b9a60317\": container with ID starting with 02d95c942d39a2a8dfb85b4668cc446018e14c508f6eacbb386f1a67b9a60317 not found: ID does not exist"
Apr 16 14:18:13.776226 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:13.776172 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89cfb8a3-3641-49de-a917-8c133c6b82e2" path="/var/lib/kubelet/pods/89cfb8a3-3641-49de-a917-8c133c6b82e2/volumes"
Apr 16 14:18:17.085844 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.085806 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"]
Apr 16 14:18:17.086325 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.086185 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerName="storage-initializer"
Apr 16 14:18:17.086325 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.086198 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerName="storage-initializer"
Apr 16 14:18:17.086325 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.086222 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerName="main"
Apr 16 14:18:17.086325 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.086230 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerName="main"
Apr 16 14:18:17.086325 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.086237 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerName="tokenizer"
Apr 16 14:18:17.086325 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.086245 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerName="tokenizer"
Apr 16 14:18:17.086520 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.086342 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerName="main"
Apr 16 14:18:17.086520 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.086350 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="89cfb8a3-3641-49de-a917-8c133c6b82e2" containerName="tokenizer"
Apr 16 14:18:17.089604 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.089588 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.093666 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.093639 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-ww7q5\""
Apr 16 14:18:17.093952 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.093922 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\""
Apr 16 14:18:17.094327 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.094311 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 16 14:18:17.101690 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.101667 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"]
Apr 16 14:18:17.168876 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.168840 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.169028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.168906 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.169028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.168975 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.169028 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.169006 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.169132 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.169088 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssnhj\" (UniqueName: \"kubernetes.io/projected/e6f7f55f-da9f-433a-990a-28c1a70c5e18-kube-api-access-ssnhj\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.169166 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.169142 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.270091 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.270055 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.270307 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.270113 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.270307 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.270143 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.270307 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.270171 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.270307 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.270189 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.270307 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.270256 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssnhj\" (UniqueName: \"kubernetes.io/projected/e6f7f55f-da9f-433a-990a-28c1a70c5e18-kube-api-access-ssnhj\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.270564 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.270535 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"
Apr 16 14:18:17.270564 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.270552 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName:
\"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:18:17.270665 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.270601 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:18:17.270665 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.270631 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:18:17.272959 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.272936 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:18:17.282316 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.282294 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssnhj\" (UniqueName: 
\"kubernetes.io/projected/e6f7f55f-da9f-433a-990a-28c1a70c5e18-kube-api-access-ssnhj\") pod \"router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:18:17.399910 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.399790 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:18:17.540149 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.540123 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"] Apr 16 14:18:17.542468 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:18:17.542428 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6f7f55f_da9f_433a_990a_28c1a70c5e18.slice/crio-c1cbc628ee34330da6164d12ecf83089d5db2b3b5571800c3e433f21ddea2826 WatchSource:0}: Error finding container c1cbc628ee34330da6164d12ecf83089d5db2b3b5571800c3e433f21ddea2826: Status 404 returned error can't find the container with id c1cbc628ee34330da6164d12ecf83089d5db2b3b5571800c3e433f21ddea2826 Apr 16 14:18:17.544261 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.544245 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:18:17.653935 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.653847 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" event={"ID":"e6f7f55f-da9f-433a-990a-28c1a70c5e18","Type":"ContainerStarted","Data":"a0dbac024deddc15c8f8156985ab79896ce012929d4d8a3f7371ebed07fdd453"} Apr 16 14:18:17.653935 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:17.653888 2564 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" event={"ID":"e6f7f55f-da9f-433a-990a-28c1a70c5e18","Type":"ContainerStarted","Data":"c1cbc628ee34330da6164d12ecf83089d5db2b3b5571800c3e433f21ddea2826"} Apr 16 14:18:18.658272 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:18.658159 2564 generic.go:358] "Generic (PLEG): container finished" podID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerID="a0dbac024deddc15c8f8156985ab79896ce012929d4d8a3f7371ebed07fdd453" exitCode=0 Apr 16 14:18:18.658272 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:18.658239 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" event={"ID":"e6f7f55f-da9f-433a-990a-28c1a70c5e18","Type":"ContainerDied","Data":"a0dbac024deddc15c8f8156985ab79896ce012929d4d8a3f7371ebed07fdd453"} Apr 16 14:18:19.664311 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:19.664275 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" event={"ID":"e6f7f55f-da9f-433a-990a-28c1a70c5e18","Type":"ContainerStarted","Data":"3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff"} Apr 16 14:18:19.664311 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:19.664313 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" event={"ID":"e6f7f55f-da9f-433a-990a-28c1a70c5e18","Type":"ContainerStarted","Data":"b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854"} Apr 16 14:18:19.664719 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:19.664412 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:18:19.685766 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:19.685710 
2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" podStartSLOduration=2.685690719 podStartE2EDuration="2.685690719s" podCreationTimestamp="2026-04-16 14:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:18:19.683899186 +0000 UTC m=+1136.503264539" watchObservedRunningTime="2026-04-16 14:18:19.685690719 +0000 UTC m=+1136.505056074" Apr 16 14:18:27.400330 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:27.400292 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:18:27.400330 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:27.400334 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:18:27.403246 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:27.403219 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:18:27.695155 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:27.695076 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:18:48.699103 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:18:48.699019 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:20:37.180916 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:37.180878 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"] Apr 16 14:20:37.183453 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:37.181306 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" podUID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerName="main" containerID="cri-o://b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854" gracePeriod=30 Apr 16 14:20:37.183453 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:37.181344 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" podUID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerName="tokenizer" containerID="cri-o://3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff" gracePeriod=30 Apr 16 14:20:37.694826 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:37.694776 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" podUID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.39:8082/healthz\": dial tcp 10.133.0.39:8082: connect: connection refused" Apr 16 14:20:38.139550 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.139509 2564 generic.go:358] "Generic (PLEG): container finished" podID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerID="b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854" exitCode=0 Apr 16 14:20:38.139731 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.139576 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" event={"ID":"e6f7f55f-da9f-433a-990a-28c1a70c5e18","Type":"ContainerDied","Data":"b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854"} 
Apr 16 14:20:38.425777 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.425755 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:20:38.484426 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.484398 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-uds\") pod \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " Apr 16 14:20:38.484591 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.484486 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-kserve-provision-location\") pod \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " Apr 16 14:20:38.484591 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.484557 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tls-certs\") pod \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " Apr 16 14:20:38.484713 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.484596 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-cache\") pod \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " Apr 16 14:20:38.484713 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.484623 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-tmp\") pod \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " Apr 16 14:20:38.484713 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.484638 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssnhj\" (UniqueName: \"kubernetes.io/projected/e6f7f55f-da9f-433a-990a-28c1a70c5e18-kube-api-access-ssnhj\") pod \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\" (UID: \"e6f7f55f-da9f-433a-990a-28c1a70c5e18\") " Apr 16 14:20:38.484713 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.484640 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e6f7f55f-da9f-433a-990a-28c1a70c5e18" (UID: "e6f7f55f-da9f-433a-990a-28c1a70c5e18"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:20:38.484928 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.484849 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-uds\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:20:38.484928 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.484848 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e6f7f55f-da9f-433a-990a-28c1a70c5e18" (UID: "e6f7f55f-da9f-433a-990a-28c1a70c5e18"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:20:38.485041 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.484934 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e6f7f55f-da9f-433a-990a-28c1a70c5e18" (UID: "e6f7f55f-da9f-433a-990a-28c1a70c5e18"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:20:38.485272 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.485249 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e6f7f55f-da9f-433a-990a-28c1a70c5e18" (UID: "e6f7f55f-da9f-433a-990a-28c1a70c5e18"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:20:38.486650 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.486625 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e6f7f55f-da9f-433a-990a-28c1a70c5e18" (UID: "e6f7f55f-da9f-433a-990a-28c1a70c5e18"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:20:38.486737 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.486674 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f7f55f-da9f-433a-990a-28c1a70c5e18-kube-api-access-ssnhj" (OuterVolumeSpecName: "kube-api-access-ssnhj") pod "e6f7f55f-da9f-433a-990a-28c1a70c5e18" (UID: "e6f7f55f-da9f-433a-990a-28c1a70c5e18"). InnerVolumeSpecName "kube-api-access-ssnhj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:20:38.585376 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.585341 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-kserve-provision-location\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:20:38.585376 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.585371 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tls-certs\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:20:38.585376 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.585383 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-cache\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:20:38.585612 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.585392 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6f7f55f-da9f-433a-990a-28c1a70c5e18-tokenizer-tmp\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:20:38.585612 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:38.585401 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ssnhj\" (UniqueName: \"kubernetes.io/projected/e6f7f55f-da9f-433a-990a-28c1a70c5e18-kube-api-access-ssnhj\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:20:39.146047 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.146012 2564 generic.go:358] "Generic (PLEG): container finished" podID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerID="3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff" exitCode=0 Apr 16 14:20:39.146242 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.146088 
2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" Apr 16 14:20:39.146242 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.146099 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" event={"ID":"e6f7f55f-da9f-433a-990a-28c1a70c5e18","Type":"ContainerDied","Data":"3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff"} Apr 16 14:20:39.146242 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.146142 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz" event={"ID":"e6f7f55f-da9f-433a-990a-28c1a70c5e18","Type":"ContainerDied","Data":"c1cbc628ee34330da6164d12ecf83089d5db2b3b5571800c3e433f21ddea2826"} Apr 16 14:20:39.146242 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.146161 2564 scope.go:117] "RemoveContainer" containerID="3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff" Apr 16 14:20:39.155681 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.155659 2564 scope.go:117] "RemoveContainer" containerID="b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854" Apr 16 14:20:39.163278 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.163261 2564 scope.go:117] "RemoveContainer" containerID="a0dbac024deddc15c8f8156985ab79896ce012929d4d8a3f7371ebed07fdd453" Apr 16 14:20:39.169155 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.169132 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"] Apr 16 14:20:39.171165 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.171150 2564 scope.go:117] "RemoveContainer" containerID="3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff" Apr 16 14:20:39.171438 ip-10-0-140-244 kubenswrapper[2564]: 
E0416 14:20:39.171422 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff\": container with ID starting with 3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff not found: ID does not exist" containerID="3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff" Apr 16 14:20:39.171501 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.171447 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff"} err="failed to get container status \"3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff\": rpc error: code = NotFound desc = could not find container \"3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff\": container with ID starting with 3bf1e504fcb00681acd471f7e81e365479d754708eac563adf3b3d9373adb4ff not found: ID does not exist" Apr 16 14:20:39.171501 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.171464 2564 scope.go:117] "RemoveContainer" containerID="b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854" Apr 16 14:20:39.171692 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:20:39.171675 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854\": container with ID starting with b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854 not found: ID does not exist" containerID="b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854" Apr 16 14:20:39.171729 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.171703 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854"} err="failed to get container 
status \"b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854\": rpc error: code = NotFound desc = could not find container \"b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854\": container with ID starting with b549e72a2d0b1be09cd4afce95c8ef4dcab25eb93ca1167da90c4d2f67a04854 not found: ID does not exist" Apr 16 14:20:39.171768 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.171733 2564 scope.go:117] "RemoveContainer" containerID="a0dbac024deddc15c8f8156985ab79896ce012929d4d8a3f7371ebed07fdd453" Apr 16 14:20:39.171978 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:20:39.171960 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0dbac024deddc15c8f8156985ab79896ce012929d4d8a3f7371ebed07fdd453\": container with ID starting with a0dbac024deddc15c8f8156985ab79896ce012929d4d8a3f7371ebed07fdd453 not found: ID does not exist" containerID="a0dbac024deddc15c8f8156985ab79896ce012929d4d8a3f7371ebed07fdd453" Apr 16 14:20:39.172040 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.171983 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0dbac024deddc15c8f8156985ab79896ce012929d4d8a3f7371ebed07fdd453"} err="failed to get container status \"a0dbac024deddc15c8f8156985ab79896ce012929d4d8a3f7371ebed07fdd453\": rpc error: code = NotFound desc = could not find container \"a0dbac024deddc15c8f8156985ab79896ce012929d4d8a3f7371ebed07fdd453\": container with ID starting with a0dbac024deddc15c8f8156985ab79896ce012929d4d8a3f7371ebed07fdd453 not found: ID does not exist" Apr 16 14:20:39.175352 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.175333 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6d5b779858-t5rvz"] Apr 16 14:20:39.774835 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:20:39.774798 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" path="/var/lib/kubelet/pods/e6f7f55f-da9f-433a-990a-28c1a70c5e18/volumes" Apr 16 14:22:14.496272 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.496238 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88"] Apr 16 14:22:14.496843 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.496822 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerName="main" Apr 16 14:22:14.496921 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.496846 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerName="main" Apr 16 14:22:14.496921 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.496869 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerName="tokenizer" Apr 16 14:22:14.496921 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.496878 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerName="tokenizer" Apr 16 14:22:14.496921 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.496911 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerName="storage-initializer" Apr 16 14:22:14.496921 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.496921 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerName="storage-initializer" Apr 16 14:22:14.497173 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.497027 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerName="tokenizer" Apr 16 14:22:14.497173 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.497048 2564 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="e6f7f55f-da9f-433a-990a-28c1a70c5e18" containerName="main" Apr 16 14:22:14.500496 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.500469 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.503344 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.503321 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\"" Apr 16 14:22:14.503613 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.503596 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-pct8g\"" Apr 16 14:22:14.504329 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.504299 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 14:22:14.514431 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.514409 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88"] Apr 16 14:22:14.673036 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.672999 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.673248 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.673057 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.673248 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.673086 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.673248 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.673128 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.673248 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.673197 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkckt\" (UniqueName: \"kubernetes.io/projected/872faa42-6559-4bcb-8aaf-769cfa17ebd7-kube-api-access-gkckt\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.673248 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.673237 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.773757 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.773678 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkckt\" (UniqueName: \"kubernetes.io/projected/872faa42-6559-4bcb-8aaf-769cfa17ebd7-kube-api-access-gkckt\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.773757 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.773712 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.773757 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.773751 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.774045 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.773781 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.774045 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.773802 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.774045 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.773835 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.774255 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.774231 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.774255 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.774245 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.774358 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.774278 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.774358 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.774320 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.776270 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.776254 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.783120 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.783086 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkckt\" (UniqueName: 
\"kubernetes.io/projected/872faa42-6559-4bcb-8aaf-769cfa17ebd7-kube-api-access-gkckt\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.812163 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.812126 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:14.938545 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:14.938512 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88"] Apr 16 14:22:14.941910 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:22:14.941881 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod872faa42_6559_4bcb_8aaf_769cfa17ebd7.slice/crio-1bc16996d2a5d6b46008981a7c15ab31d6e558d41b539387060c1cfe3714a97f WatchSource:0}: Error finding container 1bc16996d2a5d6b46008981a7c15ab31d6e558d41b539387060c1cfe3714a97f: Status 404 returned error can't find the container with id 1bc16996d2a5d6b46008981a7c15ab31d6e558d41b539387060c1cfe3714a97f Apr 16 14:22:15.481023 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:15.480982 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" event={"ID":"872faa42-6559-4bcb-8aaf-769cfa17ebd7","Type":"ContainerStarted","Data":"51c954de8baa6939edd3e3c40633c180a3cd9d06797967e570d430615f6419e8"} Apr 16 14:22:15.481023 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:15.481019 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" 
event={"ID":"872faa42-6559-4bcb-8aaf-769cfa17ebd7","Type":"ContainerStarted","Data":"1bc16996d2a5d6b46008981a7c15ab31d6e558d41b539387060c1cfe3714a97f"} Apr 16 14:22:16.485474 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:16.485439 2564 generic.go:358] "Generic (PLEG): container finished" podID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerID="51c954de8baa6939edd3e3c40633c180a3cd9d06797967e570d430615f6419e8" exitCode=0 Apr 16 14:22:16.485877 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:16.485528 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" event={"ID":"872faa42-6559-4bcb-8aaf-769cfa17ebd7","Type":"ContainerDied","Data":"51c954de8baa6939edd3e3c40633c180a3cd9d06797967e570d430615f6419e8"} Apr 16 14:22:17.490870 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:17.490828 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" event={"ID":"872faa42-6559-4bcb-8aaf-769cfa17ebd7","Type":"ContainerStarted","Data":"51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e"} Apr 16 14:22:17.491269 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:17.490881 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" event={"ID":"872faa42-6559-4bcb-8aaf-769cfa17ebd7","Type":"ContainerStarted","Data":"9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6"} Apr 16 14:22:17.491269 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:17.490926 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:17.513305 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:17.513254 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" podStartSLOduration=3.513237254 podStartE2EDuration="3.513237254s" podCreationTimestamp="2026-04-16 14:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:22:17.511070709 +0000 UTC m=+1374.330436060" watchObservedRunningTime="2026-04-16 14:22:17.513237254 +0000 UTC m=+1374.332602609" Apr 16 14:22:24.813076 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:24.813035 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:24.813076 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:24.813083 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:24.815774 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:24.815753 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:25.522078 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:25.522048 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:22:46.525836 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:22:46.525803 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:24:01.669822 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.669790 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d"] Apr 16 14:24:01.673805 ip-10-0-140-244 kubenswrapper[2564]: I0416 
14:24:01.673784 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.676620 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.676596 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-z6zjf\"" Apr 16 14:24:01.676775 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.676753 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 14:24:01.684307 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.684283 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d"] Apr 16 14:24:01.764563 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.764523 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.764747 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.764595 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.764747 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.764690 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.764747 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.764727 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj2jh\" (UniqueName: \"kubernetes.io/projected/70f51460-4c5b-4ec7-89f7-adbbb235ef50-kube-api-access-rj2jh\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.764876 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.764762 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.764876 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.764786 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.866068 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:24:01.866038 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rj2jh\" (UniqueName: \"kubernetes.io/projected/70f51460-4c5b-4ec7-89f7-adbbb235ef50-kube-api-access-rj2jh\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.866260 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.866086 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.866345 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.866252 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.866345 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.866322 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.866473 ip-10-0-140-244 kubenswrapper[2564]: I0416 
14:24:01.866395 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.866524 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.866481 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.866524 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.866510 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.866691 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.866668 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.866768 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.866716 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.866861 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.866841 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.869049 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.869027 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.873782 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.873762 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj2jh\" (UniqueName: \"kubernetes.io/projected/70f51460-4c5b-4ec7-89f7-adbbb235ef50-kube-api-access-rj2jh\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:01.983411 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:01.983325 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:02.113845 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:02.113817 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d"] Apr 16 14:24:02.115703 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:24:02.115679 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70f51460_4c5b_4ec7_89f7_adbbb235ef50.slice/crio-9f1c2377cfc2a24183b6a0b21e9d98f676369e22a4db3b30e9874ccb2f6092c0 WatchSource:0}: Error finding container 9f1c2377cfc2a24183b6a0b21e9d98f676369e22a4db3b30e9874ccb2f6092c0: Status 404 returned error can't find the container with id 9f1c2377cfc2a24183b6a0b21e9d98f676369e22a4db3b30e9874ccb2f6092c0 Apr 16 14:24:02.117619 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:02.117601 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:24:02.859491 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:02.859456 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" event={"ID":"70f51460-4c5b-4ec7-89f7-adbbb235ef50","Type":"ContainerStarted","Data":"73bc4a37fd176abd8c63610d2d7e94b1f8e63974bff9f30ed47d40f363623d64"} Apr 16 14:24:02.859491 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:02.859494 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" event={"ID":"70f51460-4c5b-4ec7-89f7-adbbb235ef50","Type":"ContainerStarted","Data":"9f1c2377cfc2a24183b6a0b21e9d98f676369e22a4db3b30e9874ccb2f6092c0"} Apr 16 14:24:03.864413 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:03.864379 2564 generic.go:358] "Generic (PLEG): container finished" 
podID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerID="73bc4a37fd176abd8c63610d2d7e94b1f8e63974bff9f30ed47d40f363623d64" exitCode=0 Apr 16 14:24:03.864807 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:03.864468 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" event={"ID":"70f51460-4c5b-4ec7-89f7-adbbb235ef50","Type":"ContainerDied","Data":"73bc4a37fd176abd8c63610d2d7e94b1f8e63974bff9f30ed47d40f363623d64"} Apr 16 14:24:04.869713 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:04.869676 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" event={"ID":"70f51460-4c5b-4ec7-89f7-adbbb235ef50","Type":"ContainerStarted","Data":"53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe"} Apr 16 14:24:04.869713 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:04.869713 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" event={"ID":"70f51460-4c5b-4ec7-89f7-adbbb235ef50","Type":"ContainerStarted","Data":"f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6"} Apr 16 14:24:04.870133 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:04.869827 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:04.891225 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:04.891154 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" podStartSLOduration=3.891140688 podStartE2EDuration="3.891140688s" podCreationTimestamp="2026-04-16 14:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 14:24:04.888046908 +0000 UTC m=+1481.707412261" watchObservedRunningTime="2026-04-16 14:24:04.891140688 +0000 UTC m=+1481.710506039" Apr 16 14:24:11.983750 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:11.983715 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:11.984621 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:11.983763 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:11.986752 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:11.986724 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:12.900473 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:12.900443 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:33.903688 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:33.903658 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:24:47.558150 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:47.558073 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88"] Apr 16 14:24:47.558637 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:47.558398 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" podUID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerName="main" 
containerID="cri-o://9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6" gracePeriod=30 Apr 16 14:24:47.558637 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:47.558450 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" podUID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerName="tokenizer" containerID="cri-o://51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e" gracePeriod=30 Apr 16 14:24:48.031608 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.031561 2564 generic.go:358] "Generic (PLEG): container finished" podID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerID="9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6" exitCode=0 Apr 16 14:24:48.031814 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.031627 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" event={"ID":"872faa42-6559-4bcb-8aaf-769cfa17ebd7","Type":"ContainerDied","Data":"9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6"} Apr 16 14:24:48.810296 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.810274 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:24:48.916424 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.916348 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-tmp\") pod \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " Apr 16 14:24:48.916564 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.916463 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-cache\") pod \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " Apr 16 14:24:48.916564 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.916503 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkckt\" (UniqueName: \"kubernetes.io/projected/872faa42-6559-4bcb-8aaf-769cfa17ebd7-kube-api-access-gkckt\") pod \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " Apr 16 14:24:48.916564 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.916538 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tls-certs\") pod \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " Apr 16 14:24:48.916705 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.916603 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-uds\") pod \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " Apr 16 
14:24:48.916771 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.916716 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "872faa42-6559-4bcb-8aaf-769cfa17ebd7" (UID: "872faa42-6559-4bcb-8aaf-769cfa17ebd7"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:24:48.916771 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.916742 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-kserve-provision-location\") pod \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\" (UID: \"872faa42-6559-4bcb-8aaf-769cfa17ebd7\") " Apr 16 14:24:48.916877 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.916749 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "872faa42-6559-4bcb-8aaf-769cfa17ebd7" (UID: "872faa42-6559-4bcb-8aaf-769cfa17ebd7"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:24:48.916877 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.916818 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "872faa42-6559-4bcb-8aaf-769cfa17ebd7" (UID: "872faa42-6559-4bcb-8aaf-769cfa17ebd7"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:24:48.917072 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.917056 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-uds\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:24:48.917139 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.917076 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-tmp\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:24:48.917139 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.917085 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tokenizer-cache\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:24:48.917366 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.917342 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "872faa42-6559-4bcb-8aaf-769cfa17ebd7" (UID: "872faa42-6559-4bcb-8aaf-769cfa17ebd7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:24:48.918731 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.918709 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/872faa42-6559-4bcb-8aaf-769cfa17ebd7-kube-api-access-gkckt" (OuterVolumeSpecName: "kube-api-access-gkckt") pod "872faa42-6559-4bcb-8aaf-769cfa17ebd7" (UID: "872faa42-6559-4bcb-8aaf-769cfa17ebd7"). InnerVolumeSpecName "kube-api-access-gkckt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:24:48.918785 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:48.918726 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "872faa42-6559-4bcb-8aaf-769cfa17ebd7" (UID: "872faa42-6559-4bcb-8aaf-769cfa17ebd7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:24:49.018044 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.018011 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gkckt\" (UniqueName: \"kubernetes.io/projected/872faa42-6559-4bcb-8aaf-769cfa17ebd7-kube-api-access-gkckt\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:24:49.018044 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.018040 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/872faa42-6559-4bcb-8aaf-769cfa17ebd7-tls-certs\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:24:49.018044 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.018051 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/872faa42-6559-4bcb-8aaf-769cfa17ebd7-kserve-provision-location\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:24:49.037070 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.037035 2564 generic.go:358] "Generic (PLEG): container finished" podID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerID="51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e" exitCode=0 Apr 16 14:24:49.037266 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.037110 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" Apr 16 14:24:49.037266 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.037107 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" event={"ID":"872faa42-6559-4bcb-8aaf-769cfa17ebd7","Type":"ContainerDied","Data":"51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e"} Apr 16 14:24:49.037266 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.037239 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88" event={"ID":"872faa42-6559-4bcb-8aaf-769cfa17ebd7","Type":"ContainerDied","Data":"1bc16996d2a5d6b46008981a7c15ab31d6e558d41b539387060c1cfe3714a97f"} Apr 16 14:24:49.037266 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.037260 2564 scope.go:117] "RemoveContainer" containerID="51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e" Apr 16 14:24:49.046021 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.046004 2564 scope.go:117] "RemoveContainer" containerID="9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6" Apr 16 14:24:49.053794 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.053776 2564 scope.go:117] "RemoveContainer" containerID="51c954de8baa6939edd3e3c40633c180a3cd9d06797967e570d430615f6419e8" Apr 16 14:24:49.059316 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.059292 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88"] Apr 16 14:24:49.062115 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.062097 2564 scope.go:117] "RemoveContainer" containerID="51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e" Apr 16 14:24:49.062446 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:24:49.062423 2564 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e\": container with ID starting with 51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e not found: ID does not exist" containerID="51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e" Apr 16 14:24:49.062558 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.062452 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e"} err="failed to get container status \"51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e\": rpc error: code = NotFound desc = could not find container \"51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e\": container with ID starting with 51bedc6e8aa927c13859594d7253b9ac52393907345d6986d1b73a38a6c0964e not found: ID does not exist" Apr 16 14:24:49.062558 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.062474 2564 scope.go:117] "RemoveContainer" containerID="9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6" Apr 16 14:24:49.062725 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:24:49.062706 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6\": container with ID starting with 9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6 not found: ID does not exist" containerID="9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6" Apr 16 14:24:49.062841 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.062732 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6"} err="failed to get container status 
\"9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6\": rpc error: code = NotFound desc = could not find container \"9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6\": container with ID starting with 9160e0f170a490764de31d8d412a58f2ba50e7ffe12fd866b83e9580efa043a6 not found: ID does not exist" Apr 16 14:24:49.062841 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.062752 2564 scope.go:117] "RemoveContainer" containerID="51c954de8baa6939edd3e3c40633c180a3cd9d06797967e570d430615f6419e8" Apr 16 14:24:49.063002 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:24:49.062986 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c954de8baa6939edd3e3c40633c180a3cd9d06797967e570d430615f6419e8\": container with ID starting with 51c954de8baa6939edd3e3c40633c180a3cd9d06797967e570d430615f6419e8 not found: ID does not exist" containerID="51c954de8baa6939edd3e3c40633c180a3cd9d06797967e570d430615f6419e8" Apr 16 14:24:49.063055 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.063010 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c954de8baa6939edd3e3c40633c180a3cd9d06797967e570d430615f6419e8"} err="failed to get container status \"51c954de8baa6939edd3e3c40633c180a3cd9d06797967e570d430615f6419e8\": rpc error: code = NotFound desc = could not find container \"51c954de8baa6939edd3e3c40633c180a3cd9d06797967e570d430615f6419e8\": container with ID starting with 51c954de8baa6939edd3e3c40633c180a3cd9d06797967e570d430615f6419e8 not found: ID does not exist" Apr 16 14:24:49.064072 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.064052 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schecdp88"] Apr 16 14:24:49.775317 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:24:49.775287 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" path="/var/lib/kubelet/pods/872faa42-6559-4bcb-8aaf-769cfa17ebd7/volumes" Apr 16 14:26:04.378926 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:04.378893 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d"] Apr 16 14:26:04.379371 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:04.379239 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" podUID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerName="main" containerID="cri-o://f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6" gracePeriod=30 Apr 16 14:26:04.379371 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:04.379297 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" podUID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerName="tokenizer" containerID="cri-o://53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe" gracePeriod=30 Apr 16 14:26:05.312021 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.311987 2564 generic.go:358] "Generic (PLEG): container finished" podID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerID="f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6" exitCode=0 Apr 16 14:26:05.312227 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.312038 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" event={"ID":"70f51460-4c5b-4ec7-89f7-adbbb235ef50","Type":"ContainerDied","Data":"f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6"} Apr 16 14:26:05.635656 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.635633 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:26:05.720732 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.720695 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-cache\") pod \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " Apr 16 14:26:05.720732 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.720735 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-uds\") pod \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " Apr 16 14:26:05.721035 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.720753 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tls-certs\") pod \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " Apr 16 14:26:05.721035 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.720777 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-tmp\") pod \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " Apr 16 14:26:05.721035 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.720817 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-kserve-provision-location\") pod \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " Apr 16 
14:26:05.721035 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.720879 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj2jh\" (UniqueName: \"kubernetes.io/projected/70f51460-4c5b-4ec7-89f7-adbbb235ef50-kube-api-access-rj2jh\") pod \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\" (UID: \"70f51460-4c5b-4ec7-89f7-adbbb235ef50\") " Apr 16 14:26:05.721262 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.721026 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "70f51460-4c5b-4ec7-89f7-adbbb235ef50" (UID: "70f51460-4c5b-4ec7-89f7-adbbb235ef50"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:05.721262 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.721043 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "70f51460-4c5b-4ec7-89f7-adbbb235ef50" (UID: "70f51460-4c5b-4ec7-89f7-adbbb235ef50"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:05.721262 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.721151 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "70f51460-4c5b-4ec7-89f7-adbbb235ef50" (UID: "70f51460-4c5b-4ec7-89f7-adbbb235ef50"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:05.721440 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.721283 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-cache\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:26:05.721440 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.721296 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-uds\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:26:05.721440 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.721305 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tokenizer-tmp\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:26:05.721620 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.721597 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "70f51460-4c5b-4ec7-89f7-adbbb235ef50" (UID: "70f51460-4c5b-4ec7-89f7-adbbb235ef50"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:05.722921 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.722892 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "70f51460-4c5b-4ec7-89f7-adbbb235ef50" (UID: "70f51460-4c5b-4ec7-89f7-adbbb235ef50"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:26:05.723033 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.722970 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f51460-4c5b-4ec7-89f7-adbbb235ef50-kube-api-access-rj2jh" (OuterVolumeSpecName: "kube-api-access-rj2jh") pod "70f51460-4c5b-4ec7-89f7-adbbb235ef50" (UID: "70f51460-4c5b-4ec7-89f7-adbbb235ef50"). InnerVolumeSpecName "kube-api-access-rj2jh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:26:05.821951 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.821912 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rj2jh\" (UniqueName: \"kubernetes.io/projected/70f51460-4c5b-4ec7-89f7-adbbb235ef50-kube-api-access-rj2jh\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:26:05.821951 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.821945 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/70f51460-4c5b-4ec7-89f7-adbbb235ef50-tls-certs\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:26:05.821951 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:05.821955 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70f51460-4c5b-4ec7-89f7-adbbb235ef50-kserve-provision-location\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:26:06.317703 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.317629 2564 generic.go:358] "Generic (PLEG): container finished" podID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerID="53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe" exitCode=0 Apr 16 14:26:06.317703 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.317668 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" 
event={"ID":"70f51460-4c5b-4ec7-89f7-adbbb235ef50","Type":"ContainerDied","Data":"53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe"} Apr 16 14:26:06.317913 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.317704 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" Apr 16 14:26:06.317913 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.317719 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d" event={"ID":"70f51460-4c5b-4ec7-89f7-adbbb235ef50","Type":"ContainerDied","Data":"9f1c2377cfc2a24183b6a0b21e9d98f676369e22a4db3b30e9874ccb2f6092c0"} Apr 16 14:26:06.317913 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.317739 2564 scope.go:117] "RemoveContainer" containerID="53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe" Apr 16 14:26:06.329545 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.329524 2564 scope.go:117] "RemoveContainer" containerID="f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6" Apr 16 14:26:06.336285 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.336260 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d"] Apr 16 14:26:06.337785 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.337750 2564 scope.go:117] "RemoveContainer" containerID="73bc4a37fd176abd8c63610d2d7e94b1f8e63974bff9f30ed47d40f363623d64" Apr 16 14:26:06.339974 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.339955 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d554ddz4d"] Apr 16 14:26:06.345360 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.345272 2564 scope.go:117] "RemoveContainer" 
containerID="53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe" Apr 16 14:26:06.345560 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:26:06.345541 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe\": container with ID starting with 53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe not found: ID does not exist" containerID="53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe" Apr 16 14:26:06.345610 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.345567 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe"} err="failed to get container status \"53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe\": rpc error: code = NotFound desc = could not find container \"53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe\": container with ID starting with 53f6019bfe582b1f6d67976af5c2ac03cc9f3a9f3a82aeda64be5857d9ec2efe not found: ID does not exist" Apr 16 14:26:06.345610 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.345585 2564 scope.go:117] "RemoveContainer" containerID="f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6" Apr 16 14:26:06.345797 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:26:06.345782 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6\": container with ID starting with f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6 not found: ID does not exist" containerID="f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6" Apr 16 14:26:06.345845 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.345802 2564 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6"} err="failed to get container status \"f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6\": rpc error: code = NotFound desc = could not find container \"f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6\": container with ID starting with f0dfb926c0a678f6fc46bc9e1fecfb30b418b6061e400cfe4cc9470202d367d6 not found: ID does not exist" Apr 16 14:26:06.345845 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.345816 2564 scope.go:117] "RemoveContainer" containerID="73bc4a37fd176abd8c63610d2d7e94b1f8e63974bff9f30ed47d40f363623d64" Apr 16 14:26:06.345980 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:26:06.345962 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73bc4a37fd176abd8c63610d2d7e94b1f8e63974bff9f30ed47d40f363623d64\": container with ID starting with 73bc4a37fd176abd8c63610d2d7e94b1f8e63974bff9f30ed47d40f363623d64 not found: ID does not exist" containerID="73bc4a37fd176abd8c63610d2d7e94b1f8e63974bff9f30ed47d40f363623d64" Apr 16 14:26:06.346048 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:06.345985 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73bc4a37fd176abd8c63610d2d7e94b1f8e63974bff9f30ed47d40f363623d64"} err="failed to get container status \"73bc4a37fd176abd8c63610d2d7e94b1f8e63974bff9f30ed47d40f363623d64\": rpc error: code = NotFound desc = could not find container \"73bc4a37fd176abd8c63610d2d7e94b1f8e63974bff9f30ed47d40f363623d64\": container with ID starting with 73bc4a37fd176abd8c63610d2d7e94b1f8e63974bff9f30ed47d40f363623d64 not found: ID does not exist" Apr 16 14:26:07.774382 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:07.774349 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" 
path="/var/lib/kubelet/pods/70f51460-4c5b-4ec7-89f7-adbbb235ef50/volumes" Apr 16 14:26:27.892894 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.892860 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h"] Apr 16 14:26:27.893449 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893431 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerName="main" Apr 16 14:26:27.893543 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893452 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerName="main" Apr 16 14:26:27.893543 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893475 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerName="storage-initializer" Apr 16 14:26:27.893543 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893483 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerName="storage-initializer" Apr 16 14:26:27.893543 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893492 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerName="storage-initializer" Apr 16 14:26:27.893543 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893502 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerName="storage-initializer" Apr 16 14:26:27.893543 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893523 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerName="tokenizer" Apr 16 14:26:27.893543 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893531 2564 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerName="tokenizer" Apr 16 14:26:27.893543 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893542 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerName="main" Apr 16 14:26:27.893945 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893550 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerName="main" Apr 16 14:26:27.893945 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893567 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerName="tokenizer" Apr 16 14:26:27.893945 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893576 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerName="tokenizer" Apr 16 14:26:27.893945 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893669 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerName="tokenizer" Apr 16 14:26:27.893945 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893682 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerName="main" Apr 16 14:26:27.893945 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893694 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="70f51460-4c5b-4ec7-89f7-adbbb235ef50" containerName="tokenizer" Apr 16 14:26:27.893945 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.893705 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="872faa42-6559-4bcb-8aaf-769cfa17ebd7" containerName="main" Apr 16 14:26:27.898272 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.898253 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:27.900869 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.900851 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 14:26:27.901327 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.901310 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-pqtcv\"" Apr 16 14:26:27.901855 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.901831 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\"" Apr 16 14:26:27.909783 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:27.909761 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h"] Apr 16 14:26:28.025275 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.025240 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqlsn\" (UniqueName: \"kubernetes.io/projected/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-kube-api-access-jqlsn\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.025448 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.025328 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.025448 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.025370 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.025448 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.025400 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.025448 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.025440 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.025585 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.025505 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: 
\"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.126830 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.126792 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.126985 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.126853 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqlsn\" (UniqueName: \"kubernetes.io/projected/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-kube-api-access-jqlsn\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.126985 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.126886 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.126985 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.126904 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: 
\"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.126985 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.126926 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.126985 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.126956 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.127313 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.127288 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.127385 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.127297 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.127385 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.127356 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.127478 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.127398 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.129400 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.129382 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.135408 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.135386 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqlsn\" (UniqueName: \"kubernetes.io/projected/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-kube-api-access-jqlsn\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.207644 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.207568 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:28.339000 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.338977 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h"] Apr 16 14:26:28.340997 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:26:28.340964 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd7ac30_7b36_46f6_9a02_bd9f4ae9946a.slice/crio-42d5e72ca764bcf401ec194029062e9e630aea5619d921e65a4a7414a5ff2175 WatchSource:0}: Error finding container 42d5e72ca764bcf401ec194029062e9e630aea5619d921e65a4a7414a5ff2175: Status 404 returned error can't find the container with id 42d5e72ca764bcf401ec194029062e9e630aea5619d921e65a4a7414a5ff2175 Apr 16 14:26:28.392517 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:28.392484 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" event={"ID":"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a","Type":"ContainerStarted","Data":"42d5e72ca764bcf401ec194029062e9e630aea5619d921e65a4a7414a5ff2175"} Apr 16 14:26:29.397918 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:29.397885 2564 generic.go:358] "Generic (PLEG): container finished" podID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerID="08d575cda386fcd128e687fa643ef8a5aa29aef823973d35c94d3f5dacd1d5f3" exitCode=0 Apr 16 14:26:29.398332 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:29.397978 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" 
event={"ID":"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a","Type":"ContainerDied","Data":"08d575cda386fcd128e687fa643ef8a5aa29aef823973d35c94d3f5dacd1d5f3"} Apr 16 14:26:30.410690 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:30.410656 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" event={"ID":"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a","Type":"ContainerStarted","Data":"3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848"} Apr 16 14:26:30.410690 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:30.410692 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" event={"ID":"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a","Type":"ContainerStarted","Data":"01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e"} Apr 16 14:26:30.411112 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:30.410859 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:30.432927 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:30.432883 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" podStartSLOduration=3.432870443 podStartE2EDuration="3.432870443s" podCreationTimestamp="2026-04-16 14:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:26:30.431153217 +0000 UTC m=+1627.250518568" watchObservedRunningTime="2026-04-16 14:26:30.432870443 +0000 UTC m=+1627.252235794" Apr 16 14:26:38.208347 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:38.208304 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:38.208841 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:38.208361 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:38.211061 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:38.211034 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:38.442969 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:38.442943 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:26:59.447612 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:26:59.447583 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:29:20.465914 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:20.465873 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h"] Apr 16 14:29:20.466366 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:20.466198 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" podUID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerName="main" containerID="cri-o://01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e" gracePeriod=30 Apr 16 14:29:20.466366 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:20.466249 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" 
podUID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerName="tokenizer" containerID="cri-o://3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848" gracePeriod=30 Apr 16 14:29:21.013990 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.013952 2564 generic.go:358] "Generic (PLEG): container finished" podID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerID="01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e" exitCode=0 Apr 16 14:29:21.014197 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.014027 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" event={"ID":"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a","Type":"ContainerDied","Data":"01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e"} Apr 16 14:29:21.725775 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.725753 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:29:21.832641 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.832613 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-cache\") pod \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " Apr 16 14:29:21.832790 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.832647 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-tmp\") pod \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " Apr 16 14:29:21.832790 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.832676 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-uds\") pod \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " Apr 16 14:29:21.832790 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.832709 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqlsn\" (UniqueName: \"kubernetes.io/projected/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-kube-api-access-jqlsn\") pod \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " Apr 16 14:29:21.832790 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.832747 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tls-certs\") pod \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " Apr 16 14:29:21.833022 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.832799 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-kserve-provision-location\") pod \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\" (UID: \"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a\") " Apr 16 14:29:21.833022 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.832910 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" (UID: "3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:21.833022 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.832933 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" (UID: "3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:21.833175 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.833026 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" (UID: "3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:21.833266 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.833250 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-cache\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:29:21.833311 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.833267 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-tmp\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:29:21.833311 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.833279 2564 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tokenizer-uds\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:29:21.833514 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.833493 2564 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" (UID: "3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:21.835001 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.834968 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" (UID: "3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:29:21.835001 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.834967 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-kube-api-access-jqlsn" (OuterVolumeSpecName: "kube-api-access-jqlsn") pod "3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" (UID: "3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a"). InnerVolumeSpecName "kube-api-access-jqlsn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:29:21.934528 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.934490 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jqlsn\" (UniqueName: \"kubernetes.io/projected/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-kube-api-access-jqlsn\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:29:21.934528 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.934524 2564 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-tls-certs\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:29:21.934528 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:21.934535 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a-kserve-provision-location\") on node \"ip-10-0-140-244.ec2.internal\" DevicePath \"\"" Apr 16 14:29:22.019432 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.019400 2564 generic.go:358] "Generic (PLEG): container finished" podID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerID="3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848" exitCode=0 Apr 16 14:29:22.019597 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.019467 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" Apr 16 14:29:22.019597 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.019480 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" event={"ID":"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a","Type":"ContainerDied","Data":"3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848"} Apr 16 14:29:22.019597 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.019523 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h" event={"ID":"3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a","Type":"ContainerDied","Data":"42d5e72ca764bcf401ec194029062e9e630aea5619d921e65a4a7414a5ff2175"} Apr 16 14:29:22.019597 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.019540 2564 scope.go:117] "RemoveContainer" containerID="3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848" Apr 16 14:29:22.027996 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.027975 2564 scope.go:117] "RemoveContainer" containerID="01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e" Apr 16 14:29:22.035294 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.035275 2564 scope.go:117] "RemoveContainer" containerID="08d575cda386fcd128e687fa643ef8a5aa29aef823973d35c94d3f5dacd1d5f3" Apr 16 14:29:22.041553 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.041527 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h"] Apr 16 14:29:22.043350 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.043333 2564 scope.go:117] "RemoveContainer" containerID="3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848" Apr 16 14:29:22.043409 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.043394 2564 kubelet.go:2547] "SyncLoop 
REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cc687bf4b889h"] Apr 16 14:29:22.043608 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:29:22.043590 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848\": container with ID starting with 3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848 not found: ID does not exist" containerID="3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848" Apr 16 14:29:22.043650 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.043618 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848"} err="failed to get container status \"3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848\": rpc error: code = NotFound desc = could not find container \"3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848\": container with ID starting with 3a515ccce9b4ff4893d720406df1fbc821d41b8d057e5c8482d63eb7f5bf5848 not found: ID does not exist" Apr 16 14:29:22.043650 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.043636 2564 scope.go:117] "RemoveContainer" containerID="01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e" Apr 16 14:29:22.043876 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:29:22.043859 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e\": container with ID starting with 01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e not found: ID does not exist" containerID="01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e" Apr 16 14:29:22.043927 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.043879 2564 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e"} err="failed to get container status \"01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e\": rpc error: code = NotFound desc = could not find container \"01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e\": container with ID starting with 01bae1d1559a09543c162423a0be5d71820d1ff21e2e611e2c1552171e53836e not found: ID does not exist" Apr 16 14:29:22.043927 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.043892 2564 scope.go:117] "RemoveContainer" containerID="08d575cda386fcd128e687fa643ef8a5aa29aef823973d35c94d3f5dacd1d5f3" Apr 16 14:29:22.044097 ip-10-0-140-244 kubenswrapper[2564]: E0416 14:29:22.044082 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08d575cda386fcd128e687fa643ef8a5aa29aef823973d35c94d3f5dacd1d5f3\": container with ID starting with 08d575cda386fcd128e687fa643ef8a5aa29aef823973d35c94d3f5dacd1d5f3 not found: ID does not exist" containerID="08d575cda386fcd128e687fa643ef8a5aa29aef823973d35c94d3f5dacd1d5f3" Apr 16 14:29:22.044136 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:22.044101 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08d575cda386fcd128e687fa643ef8a5aa29aef823973d35c94d3f5dacd1d5f3"} err="failed to get container status \"08d575cda386fcd128e687fa643ef8a5aa29aef823973d35c94d3f5dacd1d5f3\": rpc error: code = NotFound desc = could not find container \"08d575cda386fcd128e687fa643ef8a5aa29aef823973d35c94d3f5dacd1d5f3\": container with ID starting with 08d575cda386fcd128e687fa643ef8a5aa29aef823973d35c94d3f5dacd1d5f3 not found: ID does not exist" Apr 16 14:29:23.775033 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:23.775002 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" path="/var/lib/kubelet/pods/3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a/volumes" Apr 16 14:29:36.704189 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:36.704161 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:37.822861 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:37.822832 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:38.897632 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:38.897605 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:39.939501 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:39.939472 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:40.971533 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:40.971502 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:42.007727 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:42.007699 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:43.132696 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:43.132665 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:44.202429 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:44.202402 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:45.326808 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:45.326778 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:46.358368 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:46.358338 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:47.391511 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:47.391487 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:48.474719 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:48.474690 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:49.581449 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:49.581417 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:50.627425 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:50.627393 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-p5f4j_d8f3bbe7-bfa8-4d5e-8897-fc29dc36e324/istio-proxy/0.log" Apr 16 14:29:51.749484 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:51.749454 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-4npv7_3bb5710b-5550-4097-9967-7df2e8c5b723/istio-proxy/0.log" Apr 16 14:29:51.765632 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:51.765606 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-54469996f6-6h8nf_776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1/router/0.log" Apr 16 14:29:52.627309 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:52.627279 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-4npv7_3bb5710b-5550-4097-9967-7df2e8c5b723/istio-proxy/0.log" Apr 16 14:29:52.645171 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:52.645144 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-54469996f6-6h8nf_776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1/router/0.log" Apr 16 14:29:53.434469 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:53.434440 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-wc2qz_d8bb742e-925e-4a94-8b9b-767c561d2f49/authorino/0.log" Apr 16 14:29:53.451915 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:29:53.451875 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-p4qgl_fe191a4e-dcc4-4632-b3df-df170611b410/manager/0.log" Apr 16 14:30:01.198854 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:01.198823 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5dpwb_4f657495-2777-434f-9e5e-c076ada48605/global-pull-secret-syncer/0.log" Apr 16 14:30:01.299269 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:30:01.299236 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rnkfc_9fbd7ea5-6aaf-4f3c-a34c-596936befdcf/konnectivity-agent/0.log" Apr 16 14:30:01.422797 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:01.422718 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-244.ec2.internal_0ef025c6bf68cc466efdd3ff573ac22e/haproxy/0.log" Apr 16 14:30:05.401837 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:05.401805 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-wc2qz_d8bb742e-925e-4a94-8b9b-767c561d2f49/authorino/0.log" Apr 16 14:30:05.433830 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:05.433802 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-p4qgl_fe191a4e-dcc4-4632-b3df-df170611b410/manager/0.log" Apr 16 14:30:06.654736 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:06.654710 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5bb5adca-62c0-4503-a4ce-eac24a26da84/alertmanager/0.log" Apr 16 14:30:06.679124 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:06.679095 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5bb5adca-62c0-4503-a4ce-eac24a26da84/config-reloader/0.log" Apr 16 14:30:06.702040 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:06.702019 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5bb5adca-62c0-4503-a4ce-eac24a26da84/kube-rbac-proxy-web/0.log" Apr 16 14:30:06.727520 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:06.727502 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5bb5adca-62c0-4503-a4ce-eac24a26da84/kube-rbac-proxy/0.log" Apr 16 14:30:06.753858 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:30:06.753837 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5bb5adca-62c0-4503-a4ce-eac24a26da84/kube-rbac-proxy-metric/0.log" Apr 16 14:30:06.775467 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:06.775447 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5bb5adca-62c0-4503-a4ce-eac24a26da84/prom-label-proxy/0.log" Apr 16 14:30:06.798584 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:06.798561 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5bb5adca-62c0-4503-a4ce-eac24a26da84/init-config-reloader/0.log" Apr 16 14:30:06.873623 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:06.873542 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-2vv24_7df76db3-990c-4985-99c1-43587fe5e0c6/kube-state-metrics/0.log" Apr 16 14:30:06.894282 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:06.894259 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-2vv24_7df76db3-990c-4985-99c1-43587fe5e0c6/kube-rbac-proxy-main/0.log" Apr 16 14:30:06.917342 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:06.917316 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-2vv24_7df76db3-990c-4985-99c1-43587fe5e0c6/kube-rbac-proxy-self/0.log" Apr 16 14:30:06.942891 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:06.942870 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7bd8c88865-rst2t_7536256b-802a-4d61-bdb9-1dbff314c031/metrics-server/0.log" Apr 16 14:30:06.966565 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:06.966543 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-fnms7_ae97fff3-1d01-497a-b991-d35c0280495f/monitoring-plugin/0.log" Apr 16 14:30:06.993972 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:06.993952 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l8v8h_406c2eef-481c-4ba1-8a6d-dd2cf2607a8d/node-exporter/0.log" Apr 16 14:30:07.015848 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.015824 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l8v8h_406c2eef-481c-4ba1-8a6d-dd2cf2607a8d/kube-rbac-proxy/0.log" Apr 16 14:30:07.047649 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.047626 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l8v8h_406c2eef-481c-4ba1-8a6d-dd2cf2607a8d/init-textfile/0.log" Apr 16 14:30:07.334438 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.334412 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_14b8d935-5c9b-43fb-8e82-9ae3fa73f51a/prometheus/0.log" Apr 16 14:30:07.355758 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.355732 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_14b8d935-5c9b-43fb-8e82-9ae3fa73f51a/config-reloader/0.log" Apr 16 14:30:07.378626 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.378599 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_14b8d935-5c9b-43fb-8e82-9ae3fa73f51a/thanos-sidecar/0.log" Apr 16 14:30:07.405327 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.405213 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_14b8d935-5c9b-43fb-8e82-9ae3fa73f51a/kube-rbac-proxy-web/0.log" Apr 16 14:30:07.428391 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.428365 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_14b8d935-5c9b-43fb-8e82-9ae3fa73f51a/kube-rbac-proxy/0.log" Apr 16 14:30:07.453811 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.453786 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_14b8d935-5c9b-43fb-8e82-9ae3fa73f51a/kube-rbac-proxy-thanos/0.log" Apr 16 14:30:07.476290 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.476268 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_14b8d935-5c9b-43fb-8e82-9ae3fa73f51a/init-config-reloader/0.log" Apr 16 14:30:07.506056 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.506031 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-cvgtl_6bc5cd09-359e-4c1f-b044-6dda8378ff82/prometheus-operator/0.log" Apr 16 14:30:07.526720 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.526694 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-cvgtl_6bc5cd09-359e-4c1f-b044-6dda8378ff82/kube-rbac-proxy/0.log" Apr 16 14:30:07.556495 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.556456 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-mjv5m_bd0785b3-a6d7-4f47-a8ba-7668d4a2655d/prometheus-operator-admission-webhook/0.log" Apr 16 14:30:07.587787 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.587761 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b956596f6-2b254_6b883d92-fa45-47b1-a3aa-7690646c7936/telemeter-client/0.log" Apr 16 14:30:07.611159 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.611141 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b956596f6-2b254_6b883d92-fa45-47b1-a3aa-7690646c7936/reload/0.log" Apr 16 14:30:07.636554 
ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:07.636533 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b956596f6-2b254_6b883d92-fa45-47b1-a3aa-7690646c7936/kube-rbac-proxy/0.log" Apr 16 14:30:09.210944 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:09.210915 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-s2wmw_61cf3e35-c894-4e37-8617-ecc7b775e788/networking-console-plugin/0.log" Apr 16 14:30:10.372860 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.372827 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w"] Apr 16 14:30:10.373258 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.373195 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerName="tokenizer" Apr 16 14:30:10.373258 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.373224 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerName="tokenizer" Apr 16 14:30:10.373258 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.373242 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerName="storage-initializer" Apr 16 14:30:10.373258 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.373252 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerName="storage-initializer" Apr 16 14:30:10.373403 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.373274 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerName="main" Apr 16 14:30:10.373403 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.373280 2564 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerName="main" Apr 16 14:30:10.373403 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.373351 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerName="tokenizer" Apr 16 14:30:10.373403 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.373363 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dd7ac30-7b36-46f6-9a02-bd9f4ae9946a" containerName="main" Apr 16 14:30:10.376854 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.376832 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.379375 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.379358 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8g7hp\"/\"kube-root-ca.crt\"" Apr 16 14:30:10.380348 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.380329 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8g7hp\"/\"openshift-service-ca.crt\"" Apr 16 14:30:10.380395 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.380371 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8g7hp\"/\"default-dockercfg-9xm4f\"" Apr 16 14:30:10.386149 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.386124 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w"] Apr 16 14:30:10.495792 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.495758 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d5869d07-c201-43bc-bdb4-ad62594c793e-podres\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " 
pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.495792 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.495795 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5869d07-c201-43bc-bdb4-ad62594c793e-sys\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.496077 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.495946 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5869d07-c201-43bc-bdb4-ad62594c793e-lib-modules\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.496077 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.496034 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbx27\" (UniqueName: \"kubernetes.io/projected/d5869d07-c201-43bc-bdb4-ad62594c793e-kube-api-access-mbx27\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.496077 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.496071 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d5869d07-c201-43bc-bdb4-ad62594c793e-proc\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.597376 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.597341 2564 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mbx27\" (UniqueName: \"kubernetes.io/projected/d5869d07-c201-43bc-bdb4-ad62594c793e-kube-api-access-mbx27\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.597376 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.597383 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d5869d07-c201-43bc-bdb4-ad62594c793e-proc\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.597628 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.597414 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d5869d07-c201-43bc-bdb4-ad62594c793e-podres\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.597628 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.597432 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5869d07-c201-43bc-bdb4-ad62594c793e-sys\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.597628 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.597491 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d5869d07-c201-43bc-bdb4-ad62594c793e-proc\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.597628 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:30:10.597505 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5869d07-c201-43bc-bdb4-ad62594c793e-lib-modules\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.597628 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.597549 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d5869d07-c201-43bc-bdb4-ad62594c793e-podres\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.597628 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.597571 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5869d07-c201-43bc-bdb4-ad62594c793e-sys\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.597628 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.597593 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5869d07-c201-43bc-bdb4-ad62594c793e-lib-modules\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.606000 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.605972 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbx27\" (UniqueName: \"kubernetes.io/projected/d5869d07-c201-43bc-bdb4-ad62594c793e-kube-api-access-mbx27\") pod \"perf-node-gather-daemonset-5ck4w\" (UID: \"d5869d07-c201-43bc-bdb4-ad62594c793e\") " 
pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.686968 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.686879 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:10.687335 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.687309 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-stk8w_2ef571a0-26fd-4b7c-a9c2-36cadb5a6ce3/volume-data-source-validator/0.log" Apr 16 14:30:10.810503 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.810462 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w"] Apr 16 14:30:10.813799 ip-10-0-140-244 kubenswrapper[2564]: W0416 14:30:10.813769 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd5869d07_c201_43bc_bdb4_ad62594c793e.slice/crio-de7bde4ccfa6b1d14ba74b482b2e88822da2f601e806cc083a1066fc5b35cd7a WatchSource:0}: Error finding container de7bde4ccfa6b1d14ba74b482b2e88822da2f601e806cc083a1066fc5b35cd7a: Status 404 returned error can't find the container with id de7bde4ccfa6b1d14ba74b482b2e88822da2f601e806cc083a1066fc5b35cd7a Apr 16 14:30:10.815583 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:10.815567 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:30:11.189393 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:11.189358 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" event={"ID":"d5869d07-c201-43bc-bdb4-ad62594c793e","Type":"ContainerStarted","Data":"4c62c7651e35b919dec0692ebfb06d122e6992d8363e6171983b16ba0a216507"} Apr 16 14:30:11.189600 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:11.189404 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" event={"ID":"d5869d07-c201-43bc-bdb4-ad62594c793e","Type":"ContainerStarted","Data":"de7bde4ccfa6b1d14ba74b482b2e88822da2f601e806cc083a1066fc5b35cd7a"} Apr 16 14:30:11.189600 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:11.189432 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:11.206911 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:11.206870 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" podStartSLOduration=1.206855697 podStartE2EDuration="1.206855697s" podCreationTimestamp="2026-04-16 14:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:11.205616471 +0000 UTC m=+1848.024981824" watchObservedRunningTime="2026-04-16 14:30:11.206855697 +0000 UTC m=+1848.026221046" Apr 16 14:30:11.446092 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:11.446013 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gmqgq_99722c79-e7f0-4687-85aa-06ca3ad840a2/dns/0.log" Apr 16 14:30:11.468057 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:11.468030 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gmqgq_99722c79-e7f0-4687-85aa-06ca3ad840a2/kube-rbac-proxy/0.log" Apr 16 14:30:11.700604 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:11.700526 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qzz48_69e2944f-6e8b-40c7-be64-0a3d77f6c3fd/dns-node-resolver/0.log" Apr 16 14:30:12.236127 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:12.236102 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-m6p2z_1e3a02e8-43e4-4f89-a584-53fbf95d94cf/node-ca/0.log" 
Apr 16 14:30:13.174243 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:13.174218 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-4npv7_3bb5710b-5550-4097-9967-7df2e8c5b723/istio-proxy/0.log" Apr 16 14:30:13.200248 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:13.200221 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-54469996f6-6h8nf_776799ab-6fb3-4f6c-b80f-a5c4a9a7b6d1/router/0.log" Apr 16 14:30:13.696406 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:13.696377 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-t7d2w_68dcca7e-5f78-4645-b5f2-2e8fe47395cc/serve-healthcheck-canary/0.log" Apr 16 14:30:14.289133 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:14.289101 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n7cgt_dc720c25-0498-47d4-ab0f-fd387e9f2072/kube-rbac-proxy/0.log" Apr 16 14:30:14.312372 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:14.312345 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n7cgt_dc720c25-0498-47d4-ab0f-fd387e9f2072/exporter/0.log" Apr 16 14:30:14.334628 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:14.334607 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n7cgt_dc720c25-0498-47d4-ab0f-fd387e9f2072/extractor/0.log" Apr 16 14:30:16.981089 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:16.981054 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5b8748f956-qntz4_80384fab-34a9-48a0-b240-612d519c2f86/manager/0.log" Apr 16 14:30:17.203242 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:17.203194 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-5ck4w" Apr 16 14:30:17.694867 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:17.694834 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-mjzd8_74fe564b-8b0e-4c62-9c21-414f01c31a52/server/0.log" Apr 16 14:30:17.883433 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:17.883405 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-t2r65_564df1a3-ceb0-4eae-a10f-03ab54f88c68/manager/0.log" Apr 16 14:30:17.936346 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:17.936320 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-cr7n8_165d3b42-50b6-4867-9354-74220d038316/seaweedfs/0.log" Apr 16 14:30:22.663567 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:22.663534 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-g66gt_980778ce-1a4a-40fb-ac6b-c6ce367030e2/migrator/0.log" Apr 16 14:30:22.684682 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:22.684648 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-g66gt_980778ce-1a4a-40fb-ac6b-c6ce367030e2/graceful-termination/0.log" Apr 16 14:30:24.041286 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:24.041244 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-926vk_38c85021-52e8-4534-bae8-801408d6b6f1/kube-multus/0.log" Apr 16 14:30:24.443385 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:24.443356 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mxtjl_7e8196db-0ad9-4936-a33e-c935a2815b53/kube-multus-additional-cni-plugins/0.log" Apr 16 14:30:24.474397 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:24.474367 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mxtjl_7e8196db-0ad9-4936-a33e-c935a2815b53/egress-router-binary-copy/0.log" Apr 16 14:30:24.506681 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:24.506653 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mxtjl_7e8196db-0ad9-4936-a33e-c935a2815b53/cni-plugins/0.log" Apr 16 14:30:24.527705 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:24.527675 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mxtjl_7e8196db-0ad9-4936-a33e-c935a2815b53/bond-cni-plugin/0.log" Apr 16 14:30:24.552621 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:24.552595 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mxtjl_7e8196db-0ad9-4936-a33e-c935a2815b53/routeoverride-cni/0.log" Apr 16 14:30:24.577036 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:24.577009 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mxtjl_7e8196db-0ad9-4936-a33e-c935a2815b53/whereabouts-cni-bincopy/0.log" Apr 16 14:30:24.599775 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:24.599752 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mxtjl_7e8196db-0ad9-4936-a33e-c935a2815b53/whereabouts-cni/0.log" Apr 16 14:30:24.702644 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:24.702568 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5g5kq_2bdb9ab9-1a72-487e-8b6c-732d544d0454/network-metrics-daemon/0.log" Apr 16 14:30:24.724376 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:24.724345 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5g5kq_2bdb9ab9-1a72-487e-8b6c-732d544d0454/kube-rbac-proxy/0.log" Apr 16 14:30:26.009273 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:30:26.009240 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cc429_b897edab-8b5c-4c47-bede-ddfcf288c0ea/ovn-controller/0.log" Apr 16 14:30:26.037694 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:26.037665 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cc429_b897edab-8b5c-4c47-bede-ddfcf288c0ea/ovn-acl-logging/0.log" Apr 16 14:30:26.058408 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:26.058382 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cc429_b897edab-8b5c-4c47-bede-ddfcf288c0ea/kube-rbac-proxy-node/0.log" Apr 16 14:30:26.080795 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:26.080768 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cc429_b897edab-8b5c-4c47-bede-ddfcf288c0ea/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 14:30:26.103300 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:26.103273 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cc429_b897edab-8b5c-4c47-bede-ddfcf288c0ea/northd/0.log" Apr 16 14:30:26.138779 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:26.138743 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cc429_b897edab-8b5c-4c47-bede-ddfcf288c0ea/nbdb/0.log" Apr 16 14:30:26.162766 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:26.162732 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cc429_b897edab-8b5c-4c47-bede-ddfcf288c0ea/sbdb/0.log" Apr 16 14:30:26.272839 ip-10-0-140-244 kubenswrapper[2564]: I0416 14:30:26.272762 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cc429_b897edab-8b5c-4c47-bede-ddfcf288c0ea/ovnkube-controller/0.log" Apr 16 14:30:27.741132 ip-10-0-140-244 
kubenswrapper[2564]: I0416 14:30:27.741100 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mmfcz_1ec67a7e-5f50-42d0-b878-f6ddc3826470/network-check-target-container/0.log"