Apr 28 19:13:48.585664 ip-10-0-138-34 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 28 19:13:48.585675 ip-10-0-138-34 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 28 19:13:48.585684 ip-10-0-138-34 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 28 19:13:48.585992 ip-10-0-138-34 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 28 19:13:58.806819 ip-10-0-138-34 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 28 19:13:58.806834 ip-10-0-138-34 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot bc499a2bf33345e2a3da0b623fcf78df --
Apr 28 19:16:13.113906 ip-10-0-138-34 systemd[1]: Starting Kubernetes Kubelet...
Apr 28 19:16:13.636966 ip-10-0-138-34 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:13.636966 ip-10-0-138-34 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 28 19:16:13.636966 ip-10-0-138-34 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:13.636966 ip-10-0-138-34 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 28 19:16:13.636966 ip-10-0-138-34 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
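The first boot above fails before the kubelet binary ever runs: systemd cannot read an EnvironmentFile= referenced by kubelet.service, the 'start-pre' (ExecStartPre) task cannot be run for the same reason, and no restart can be scheduled because crio.service is not installed yet. A minimal sketch for confirming this on the node, assuming only the unit name and the boot ID from the '-- Boot ... --' marker above; the exact environment-file paths live in the unit itself:

$ systemctl cat kubelet.service | grep -E 'EnvironmentFile|ExecStartPre|After'   # which files and units kubelet.service expects
$ systemctl list-unit-files crio.service                                         # check whether the CRI-O unit exists at all
$ journalctl -u kubelet.service -b bc499a2bf33345e2a3da0b623fcf78df              # replay the second boot shown below

Note that systemd only tolerates a missing environment file when the path is prefixed with '-' (EnvironmentFile=-/path/to/file); the hard failure here implies the unit uses the unprefixed form.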
Apr 28 19:16:13.638097 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.637029 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 28 19:16:13.640855 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640829 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:13.640855 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640851 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:13.640855 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640856 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:13.640855 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640859 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:13.640855 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640862 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:13.640855 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640865 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640868 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640871 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640875 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640877 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640880 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640883 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640886 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640888 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640891 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640893 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640896 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640899 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640902 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640904 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640907 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640911 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640913 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640916 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640919 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:13.641091 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640921 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640924 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640926 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640929 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640933 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640939 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640943 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640946 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640948 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640951 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640954 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640957 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640960 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640962 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640966 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640968 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640971 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640974 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640976 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:13.641576 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640979 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640981 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640984 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640987 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640990 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640993 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640995 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.640998 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641000 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641002 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641005 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641007 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641010 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641012 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641025 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641027 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641030 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641033 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641035 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641038 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:13.642100 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641040 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641043 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641045 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641048 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641051 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641053 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641056 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641059 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641062 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641065 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641068 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641070 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641073 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641076 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641081 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641083 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641087 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641090 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641093 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:13.642595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641096 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641099 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641101 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
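Everything from here down to the flag dump is warning-level noise rather than an error: OpenShift renders its cluster-level feature gates (ManagedBootImages*, NewOLM*, GatewayAPI*, and the rest) into the kubelet configuration file, and the upstream kubelet warns on every name it does not recognize instead of refusing to start. The same set is logged again below because the gate map is applied more than once during startup. A small sketch for inspecting where the names come from, assuming the config path shown later in the flag dump (--config="/etc/kubernetes/kubelet.conf") and the upstream featureGates stanza name:

$ grep -A 100 'featureGates' /etc/kubernetes/kubelet.conf          # the stanza these names come from
$ journalctl -u kubelet -b | grep -c 'unrecognized feature gate'   # all of these are W (warning) records

Only the gates this kubelet binary actually knows survive into the resolved map logged at the end of the excerpt (feature_gate.go:384).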
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641499 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641504 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641506 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641509 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641512 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641515 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641517 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641520 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641522 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641525 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641528 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641530 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641533 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641535 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641538 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641548 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:13.643090 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641553 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641557 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641560 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641562 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641565 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641567 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641570 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641573 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641577 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641580 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641582 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641585 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641588 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641590 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641593 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641595 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641598 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641615 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641619 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641622 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:13.643541 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641626 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641629 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641631 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641634 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641637 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641639 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641641 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641644 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641647 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641650 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641652 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641655 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641657 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641661 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641664 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641667 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641670 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641673 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641676 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641679 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:13.644059 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641682 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641685 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641688 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641690 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641693 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641695 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641698 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641701 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641703 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641706 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641708 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641711 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641713 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641716 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641718 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641721 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641724 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641727 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641730 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641733 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:13.644571 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641735 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641738 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641740 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641743 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641745 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641749 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641751 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641754 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641756 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.641759 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642690 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642701 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642708 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642713 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642718 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642721 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642727 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642731 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642734 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642737 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642741 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 28 19:16:13.645085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642744 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642747 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642750 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642753 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642756 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642759 2570 flags.go:64] FLAG: --cloud-config=""
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642761 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642764 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642769 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642772 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642775 2570 flags.go:64] FLAG: --config-dir=""
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642778 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642781 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642785 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642789 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642792 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642796 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642799 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642802 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642805 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642808 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642811 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642816 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642820 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642823 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 28 19:16:13.645601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642825 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642829 2570 flags.go:64] FLAG: --enable-server="true"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642832 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642837 2570 flags.go:64] FLAG: --event-burst="100"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642840 2570 flags.go:64] FLAG: --event-qps="50"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642843 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642846 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642849 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642853 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642856 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642859 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642862 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642865 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642868 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642871 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642874 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642877 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642880 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642883 2570 flags.go:64] FLAG: --feature-gates=""
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642887 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642890 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642893 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642896 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642899 2570 flags.go:64] FLAG: --healthz-port="10248"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642902 2570 flags.go:64] FLAG: --help="false"
Apr 28 19:16:13.646235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642905 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-138-34.ec2.internal"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642908 2570 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642911 2570 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642914 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642917 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642921 2570 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642924 2570 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642926 2570 flags.go:64] FLAG: --image-service-endpoint=""
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642929 2570 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642944 2570 flags.go:64] FLAG: --kube-api-burst="100"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642948 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642951 2570 flags.go:64] FLAG: --kube-api-qps="50"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642954 2570 flags.go:64] FLAG: --kube-reserved=""
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642957 2570 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642960 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642963 2570 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642966 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642969 2570 flags.go:64] FLAG: --lock-file=""
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642972 2570 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642975 2570 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642978 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642987 2570 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642991 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642994 2570 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 28 19:16:13.646949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.642997 2570 flags.go:64] FLAG: --logging-format="text"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643000 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643003 2570 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643006 2570 flags.go:64] FLAG: --manifest-url=""
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643009 2570 flags.go:64] FLAG: --manifest-url-header=""
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643013 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643016 2570 flags.go:64] FLAG: --max-open-files="1000000"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643021 2570 flags.go:64] FLAG: --max-pods="110"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643023 2570 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643026 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643029 2570 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643032 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643035 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643038 2570 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643041 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643049 2570 flags.go:64] FLAG: --node-status-max-images="50"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643052 2570 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643055 2570 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643059 2570 flags.go:64] FLAG: --pod-cidr=""
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643062 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643068 2570 flags.go:64] FLAG: --pod-manifest-path=""
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643071 2570 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643076 2570 flags.go:64] FLAG: --pods-per-core="0"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643079 2570 flags.go:64] FLAG: --port="10250"
Apr 28 19:16:13.647521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643083 2570 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643085 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02596becaa0f04447"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643089 2570 flags.go:64] FLAG: --qos-reserved=""
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643091 2570 flags.go:64] FLAG: --read-only-port="10255"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643095 2570 flags.go:64] FLAG: --register-node="true"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643098 2570 flags.go:64] FLAG: --register-schedulable="true"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643101 2570 flags.go:64] FLAG: --register-with-taints=""
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643104 2570 flags.go:64] FLAG: --registry-burst="10"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643107 2570 flags.go:64] FLAG: --registry-qps="5"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643110 2570 flags.go:64] FLAG: --reserved-cpus=""
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643113 2570 flags.go:64] FLAG: --reserved-memory=""
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643117 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643120 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643123 2570 flags.go:64] FLAG: --rotate-certificates="false"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643126 2570 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643129 2570 flags.go:64] FLAG: --runonce="false"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643132 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643135 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643137 2570 flags.go:64] FLAG: --seccomp-default="false"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643140 2570 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643143 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643146 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643149 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643152 2570 flags.go:64] FLAG: --storage-driver-password="root"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643158 2570 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643161 2570 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 28 19:16:13.648128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643164 2570 flags.go:64] FLAG: --storage-driver-user="root"
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643167 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643170 2570 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643173 2570 flags.go:64] FLAG: --system-cgroups=""
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643177 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643182 2570 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643185 2570 flags.go:64] FLAG: --tls-cert-file=""
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643188 2570 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643192 2570 flags.go:64] FLAG: --tls-min-version=""
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643195 2570 flags.go:64] FLAG: --tls-private-key-file=""
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643198 2570 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643201 2570 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643204 2570 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643207 2570 flags.go:64] FLAG: --v="2"
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643211 2570 flags.go:64] FLAG: --version="false"
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643215 2570 flags.go:64] FLAG: --vmodule=""
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643219 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 28 19:16:13.648766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.643222 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
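The flag dump above is the kubelet's effective command line, and it includes exactly the flags the deprecation warnings at the top of the excerpt point at: --container-runtime-endpoint, --volume-plugin-dir, --system-reserved (and --minimum-container-ttl-duration, which is superseded by the eviction settings). Upstream wants these carried in the file named by --config, here /etc/kubernetes/kubelet.conf. A sketch of the equivalent stanza, with values copied from the flags above and field names from the upstream kubelet.config.k8s.io/v1beta1 KubeletConfiguration; this is not the node's current file contents, and on OpenShift that file is rendered by the machine-config operator, so changes belong in a KubeletConfig CR rather than hand edits:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: /var/run/crio/crio.sock              # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec   # replaces --volume-plugin-dir
systemReserved:                                                # replaces --system-reserved=cpu=500m,...
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi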
Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643350 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643353 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643356 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643359 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643362 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643366 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643369 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643372 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643375 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643378 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643380 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643384 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643387 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643390 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643392 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643395 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643398 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:16:13.649315 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643400 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643403 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643405 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643408 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643410 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:16:13.649871 
ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643413 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643415 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643418 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643420 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643423 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643425 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643428 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643431 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643433 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643436 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643438 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643441 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643443 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643446 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643448 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 28 19:16:13.649871 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643452 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643455 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643459 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643462 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643464 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643467 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643471 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643474 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643477 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643479 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643481 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643484 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643487 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643489 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643492 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643494 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643497 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643499 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643502 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643504 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:13.650370 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643507 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643510 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643512 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643515 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643517 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643520 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643523 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643525 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643528 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643530 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643533 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643535 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643539 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643542 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643545 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643548 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643550 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643553 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643560 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643563 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:13.650882 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643566 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:13.651383 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.643568 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:13.651383 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.644133 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:13.651536 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.651408 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 28 19:16:13.651564 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.651539 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 28 19:16:13.651600 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651591 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:13.651600 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651600 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:13.651600 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651618 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:13.651600 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651623 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:13.651600 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651626 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:13.651600 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651629 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:13.651600 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651633 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:13.651600 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651636 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:13.651600 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651638 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:13.651600 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651642 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:13.651600 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651645 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651648 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651651 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651654 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651657 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651660 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651663 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651665 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651668 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651672 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651674 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651677 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651680 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651682 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651685 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651688 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651690 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651693 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651696 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651698 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:13.651949 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651701 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651705 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651708 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651710 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651713 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651715 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651718 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651721 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651723 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651726 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651728 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651731 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651733 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651736 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651739 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651742 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651746 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651751 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651754 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:13.652428 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651757 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651759 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651762 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651764 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651767 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651770 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651773 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651775 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651777 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651780 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651783 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651785 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651788 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651790 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651793 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651796 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651799 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651801 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651804 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:13.652917 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651808 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651812 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651815 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651817 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651820 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651822 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651825 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651827 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651830 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651833 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651836 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651838 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651841 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651844 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651847 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651849 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651852 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:13.653371 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651854 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.651859 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651963 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651968 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651971 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651974 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651977 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651980 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651983 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651985 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651988 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651991 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651994 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651997 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.651999 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652002 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:13.653842 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652005 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652007 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652009 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652012 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652015 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652018 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652020 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652023 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652026 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652029 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652032 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652034 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652037 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652040 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652042 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652045 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652048 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652051 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652053 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:13.654245 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652056 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652058 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652060 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652063 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652065 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652068 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652070 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652073 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652075 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652078 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652081 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652084 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652086 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652089 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652091 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652093 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652096 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652098 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652101 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652103 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:13.654734 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652106 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652109 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652112 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652115 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652117 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652120 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652122 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652125 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652127 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652130 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652133 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652136 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652140 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652143 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652146 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652148 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652151 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652154 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652156 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:13.655219 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652159 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652162 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652164 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652167 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652170 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652172 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652174 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652177 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652179 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652183 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
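Editor's note: each of the three gate-parsing passes ends in the same I-level feature_gate.go:384 summary, so the warnings above are cosmetic; the resolved map is identical every time and is the line worth reading. If you need those effective settings programmatically, one option is to scrape that summary line. A sketch, not a supported interface, assuming the map[key:value ...] layout shown in the log:

```go
// Recover the resolved gate map from a feature_gate.go:384 summary line.
package main

import (
	"fmt"
	"strings"
)

func parseGates(line string) map[string]bool {
	gates := map[string]bool{}
	start := strings.Index(line, "map[")
	end := strings.LastIndex(line, "]")
	if start < 0 || end <= start {
		return gates
	}
	// Entries are space-separated "Name:true|false" pairs.
	for _, pair := range strings.Fields(line[start+len("map["):end]) {
		if name, val, ok := strings.Cut(pair, ":"); ok {
			gates[name] = val == "true"
		}
	}
	return gates
}

func main() {
	// Abridged from the log entry above.
	line := `feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false]}`
	gates := parseGates(line)
	fmt.Println("KMSv1:", gates["KMSv1"])       // true
	fmt.Println("NodeSwap:", gates["NodeSwap"]) // false
}
```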
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652186 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652189 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652192 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:13.652195 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.652201 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:13.655801 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.652337 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 28 19:16:13.656164 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.655917 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 28 19:16:13.656996 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.656982 2570 server.go:1019] "Starting client certificate rotation"
Apr 28 19:16:13.657095 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.657079 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:16:13.657131 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.657114 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:16:13.696298 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.696276 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:16:13.699186 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.699172 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:16:13.720476 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.720420 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 28 19:16:13.726665 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.726643 2570 log.go:25] "Validated CRI v1 image API"
Apr 28 19:16:13.727883 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.727868 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 28 19:16:13.728534 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.728517 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 28 19:16:13.730627 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.730592 2570 fs.go:135] Filesystem UUIDs: map[0ab034e2-793b-476b-9f09-14abb69c4dd1:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 83ae695b-3f61-4aaa-901f-51c1731b8825:/dev/nvme0n1p4]
Apr 28 19:16:13.730696 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.730626 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 28 19:16:13.736434 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.736322 2570 manager.go:217] Machine: {Timestamp:2026-04-28 19:16:13.734361957 +0000 UTC m=+0.485911463 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3039202 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22ca43626a8f217b09aca293f65687 SystemUUID:ec22ca43-626a-8f21-7b09-aca293f65687 BootID:bc499a2b-f333-45e2-a3da-0b623fcf78df Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0f:3c:6e:7b:93 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0f:3c:6e:7b:93 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:92:8b:c1:8f:cd:13 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 28 19:16:13.736434 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.736427 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
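Editor's note: for orientation, the manager.go:217 machine record above describes a node with 8 logical CPUs (4 physical cores, 2 threads each) and 33164488704 bytes of memory, with /var on a 128 GB NVMe partition. The values below are copied from that record; the only thing this sketch adds is the unit conversion:

```go
// Convert the logged byte counts into GiB for readability.
package main

import "fmt"

func main() {
	const (
		memoryCapacity = 33164488704  // MemoryCapacity, bytes
		varPartition   = 128243970048 // /dev/nvme0n1p4 (/var), bytes
		gib            = 1 << 30
	)
	fmt.Printf("memory: %.1f GiB\n", float64(memoryCapacity)/gib) // ~30.9 GiB
	fmt.Printf("/var:   %.1f GiB\n", float64(varPartition)/gib)   // ~119.4 GiB
}
```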
Apr 28 19:16:13.736545 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.736518 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 28 19:16:13.741547 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.741509 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 28 19:16:13.741711 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.741551 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-34.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 28 19:16:13.741766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.741720 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 28 19:16:13.741766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.741728 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 28 19:16:13.741766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.741746 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 28 19:16:13.742479 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.742468 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 28 19:16:13.743900 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.743888 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 28 19:16:13.744013 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.744005 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 28 19:16:13.746948 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.746938 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 28 19:16:13.746996 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.746959 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 28 19:16:13.746996 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.746971 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 28 19:16:13.747058 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.747001 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 28 19:16:13.747058 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.747021 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 28 19:16:13.749131 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.749114 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 28 19:16:13.749182 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.749142 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 28 19:16:13.752756 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.752738 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 28 19:16:13.754401 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.754384 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 28 19:16:13.756078 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.756064 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 28 19:16:13.756078 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.756081 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 28 19:16:13.756175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.756087 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 28 19:16:13.756175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.756095 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 28 19:16:13.756175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.756101 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 28 19:16:13.756175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.756108 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 28 19:16:13.756175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.756116 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 28 19:16:13.756175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.756124 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 28 19:16:13.756175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.756132 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 28 19:16:13.756175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.756138 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 28 19:16:13.756175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.756155 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 28 19:16:13.756175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.756165 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 28 19:16:13.757067 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.757057 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 28 19:16:13.757104 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.757069 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 28 19:16:13.760585 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.760534 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 28 19:16:13.760585 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.760570 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-34.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 28 19:16:13.760740 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.760658 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-34.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 28 19:16:13.761170 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.761152 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 28 19:16:13.761259 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.761208 2570 server.go:1295] "Started kubelet"
Apr 28 19:16:13.762026 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.761989 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 28 19:16:13.762116 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.762062 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 28 19:16:13.762169 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.762132 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 28 19:16:13.762245 ip-10-0-138-34 systemd[1]: Started Kubernetes Kubelet.
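Editor's note: the nodeConfig above reserves resources for the host (SystemReserved memory=1Gi, cpu=500m) and sets a memory.available hard-eviction threshold of 100Mi, with KubeReserved null. Node allocatable memory should therefore land near capacity minus those two reservations. A back-of-the-envelope check of that usual kubelet formula, a sketch rather than the kubelet's exact accounting:

```go
// allocatable ~= capacity - SystemReserved(memory) - eviction-hard(memory.available)
package main

import "fmt"

func main() {
	const (
		capacity       = 33164488704 // bytes, from the machine record above
		systemReserved = 1 << 30     // 1Gi, from SystemReserved in nodeConfig
		evictionHard   = 100 << 20   // 100Mi, memory.available threshold
	)
	var allocatable int64 = capacity - systemReserved - evictionHard
	fmt.Printf("allocatable ≈ %d bytes (%.1f GiB)\n",
		allocatable, float64(allocatable)/(1<<30))
}
```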
Apr 28 19:16:13.763423 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.763399 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 28 19:16:13.765915 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.765895 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 28 19:16:13.770377 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.770360 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 28 19:16:13.770474 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.770378 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 28 19:16:13.770741 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.769766 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-34.ec2.internal.18aa9b50ad792606 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-34.ec2.internal,UID:ip-10-0-138-34.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-34.ec2.internal,},FirstTimestamp:2026-04-28 19:16:13.761168902 +0000 UTC m=+0.512718419,LastTimestamp:2026-04-28 19:16:13.761168902 +0000 UTC m=+0.512718419,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-34.ec2.internal,}"
Apr 28 19:16:13.771116 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.771095 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 28 19:16:13.771352 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.771335 2570 factory.go:55] Registering systemd factory
Apr 28 19:16:13.771433 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.771359 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 28 19:16:13.771490 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.771438 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-34.ec2.internal\" not found"
Apr 28 19:16:13.771619 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.771593 2570 factory.go:153] Registering CRI-O factory
Apr 28 19:16:13.771679 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.771625 2570 factory.go:223] Registration of the crio container factory successfully
Apr 28 19:16:13.771679 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.771645 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 28 19:16:13.771679 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.771660 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 28 19:16:13.771679 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.771673 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 28 19:16:13.771866 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.771699 2570 factory.go:103] Registering Raw factory
Apr 28 19:16:13.771866 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.771714 2570 manager.go:1196] Started watching for new ooms in manager
Apr 28 19:16:13.771866 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.771761 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 28 19:16:13.771866 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.771772 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 28 19:16:13.772155 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.772141 2570 manager.go:319] Starting recovery of all containers
Apr 28 19:16:13.774386 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.774362 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 28 19:16:13.778678 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.778658 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ltlml"
Apr 28 19:16:13.780366 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.780336 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 28 19:16:13.780453 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.780400 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-34.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 28 19:16:13.782758 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.782735 2570 manager.go:324] Recovery completed
Apr 28 19:16:13.786869 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.786850 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ltlml"
Apr 28 19:16:13.787575 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.787563 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:16:13.790061 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.790043 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:16:13.790141 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.790073 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:16:13.790141 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.790104 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:16:13.790703 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.790689 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 28 19:16:13.790703 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.790703 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 28 19:16:13.790833 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.790721 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 28 19:16:13.793266 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.793254 2570 policy_none.go:49] "None policy: Start"
Apr 28 19:16:13.793317 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.793269 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 28 19:16:13.793317 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.793279 2570 state_mem.go:35] "Initializing new in-memory state store"
Apr 28 19:16:13.846039 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.846023 2570 manager.go:341] "Starting Device Plugin manager"
Apr 28 19:16:13.846164 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.846095 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 28 19:16:13.846164 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.846105 2570 server.go:85] "Starting device plugin registration server"
Apr 28 19:16:13.846344 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.846333 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 28 19:16:13.846381 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.846344 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 28 19:16:13.846466 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.846444 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 28 19:16:13.846564 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.846544 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 28 19:16:13.846564 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.846551 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 28 19:16:13.847360 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.847342 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 28 19:16:13.847430 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.847384 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-34.ec2.internal\" not found"
Apr 28 19:16:13.881326 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.881297 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 28 19:16:13.882511 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.882495 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 28 19:16:13.882570 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.882529 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 28 19:16:13.882570 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.882548 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
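(The records above all share the standard klog header: a severity letter, an mmdd date, a wall-clock time, the thread id, and a source file:line, followed by a structured message. A minimal sketch of pulling those fields out of a journal dump like this one; the regex and field names here are my own assumptions, not anything the kubelet ships:)

```python
import re

# Parse a klog-style record such as:
#   I0428 19:16:13.846344 2570 container_log_manager.go:189] "Initializing ..." workers=1
# Field names are illustrative; the journald prefix before the klog header is ignored.
KLOG = re.compile(
    r'(?P<sev>[IWEF])(?P<mmdd>\d{4}) (?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+'
    r'(?P<tid>\d+)\s+(?P<src>[\w./-]+:\d+)\]\s+(?P<msg>.*)'
)

def parse(line: str):
    m = KLOG.search(line)
    return m.groupdict() if m else None

rec = parse('I0428 19:16:13.846344 2570 container_log_manager.go:189] '
            '"Initializing container log rotate workers" workers=1')
print(rec["sev"], rec["src"], rec["msg"])
# I container_log_manager.go:189 "Initializing container log rotate workers" workers=1
```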
Apr 28 19:16:13.882570 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.882556 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 28 19:16:13.882734 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.882590 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 28 19:16:13.885330 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.885314 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:13.946723 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.946664 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:16:13.947973 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.947956 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:16:13.948060 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.947984 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:16:13.948060 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.947995 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:16:13.948060 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.948018 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-34.ec2.internal"
Apr 28 19:16:13.957503 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.957485 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-34.ec2.internal"
Apr 28 19:16:13.957601 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.957510 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-34.ec2.internal\": node \"ip-10-0-138-34.ec2.internal\" not found"
Apr 28 19:16:13.978636 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:13.978598 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-34.ec2.internal\" not found"
Apr 28 19:16:13.983444 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.983417 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-34.ec2.internal"]
Apr 28 19:16:13.983503 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.983487 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:16:13.984275 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.984261 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:16:13.984341 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.984288 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:16:13.984341 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.984299 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:16:13.986406 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.986393 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:16:13.986551 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.986538 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:13.986586 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.986567 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:16:13.987096 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.987071 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:16:13.987175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.987104 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:16:13.987175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.987075 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:16:13.987175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.987139 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:16:13.987175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.987153 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:16:13.987175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.987115 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:16:13.989127 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.989113 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:13.989174 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.989137 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:16:13.989802 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.989783 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:16:13.989899 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.989806 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:16:13.989899 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:13.989819 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:16:14.010596 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:14.010562 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-34.ec2.internal\" not found" node="ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.015074 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:14.015052 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-34.ec2.internal\" not found" node="ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.074055 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.074027 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fe6a7892c477bb0d3d9ed823ee3c3a21-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal\" (UID: \"fe6a7892c477bb0d3d9ed823ee3c3a21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.074055 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.074058 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe6a7892c477bb0d3d9ed823ee3c3a21-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal\" (UID: \"fe6a7892c477bb0d3d9ed823ee3c3a21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.074206 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.074076 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b985e5bf011f3a2cbd34f60280660243-config\") pod \"kube-apiserver-proxy-ip-10-0-138-34.ec2.internal\" (UID: \"b985e5bf011f3a2cbd34f60280660243\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.079130 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:14.079102 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-34.ec2.internal\" not found"
Apr 28 19:16:14.174451 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.174422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe6a7892c477bb0d3d9ed823ee3c3a21-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal\" (UID: \"fe6a7892c477bb0d3d9ed823ee3c3a21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.174451 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.174369 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe6a7892c477bb0d3d9ed823ee3c3a21-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal\" (UID: \"fe6a7892c477bb0d3d9ed823ee3c3a21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.174657 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.174479 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b985e5bf011f3a2cbd34f60280660243-config\") pod \"kube-apiserver-proxy-ip-10-0-138-34.ec2.internal\" (UID: \"b985e5bf011f3a2cbd34f60280660243\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.174657 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.174496 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fe6a7892c477bb0d3d9ed823ee3c3a21-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal\" (UID: \"fe6a7892c477bb0d3d9ed823ee3c3a21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.174657 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.174521 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fe6a7892c477bb0d3d9ed823ee3c3a21-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal\" (UID: \"fe6a7892c477bb0d3d9ed823ee3c3a21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.174657 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.174565 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b985e5bf011f3a2cbd34f60280660243-config\") pod \"kube-apiserver-proxy-ip-10-0-138-34.ec2.internal\" (UID: \"b985e5bf011f3a2cbd34f60280660243\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.179460 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:14.179442 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-34.ec2.internal\" not found"
Apr 28 19:16:14.280365 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:14.280338 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-34.ec2.internal\" not found"
Apr 28 19:16:14.312548 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.312515 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.317356 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.317340 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.381327 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:14.381293 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-34.ec2.internal\" not found"
Apr 28 19:16:14.481794 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:14.481762 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-34.ec2.internal\" not found"
Apr 28 19:16:14.582397 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:14.582323 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-34.ec2.internal\" not found"
Apr 28 19:16:14.656636 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.656589 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 28 19:16:14.657255 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.656766 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 28 19:16:14.683144 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:14.683112 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-34.ec2.internal\" not found"
Apr 28 19:16:14.771025 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.770998 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 28 19:16:14.776357 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.776341 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:14.787760 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.787739 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 28 19:16:14.788843 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.788821 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-27 19:11:13 +0000 UTC" deadline="2027-12-05 09:28:00.571611838 +0000 UTC"
Apr 28 19:16:14.788899 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.788843 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14054h11m45.782771812s"
Apr 28 19:16:14.800908 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.800889 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:14.820579 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.820560 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4j98p"
Apr 28 19:16:14.829783 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.829767 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4j98p"
Apr 28 19:16:14.871284 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.871255 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.892188 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.892169 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 28 19:16:14.893361 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.893347 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-34.ec2.internal"
Apr 28 19:16:14.918349 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:14.918321 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 28 19:16:15.029430 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:15.029252 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb985e5bf011f3a2cbd34f60280660243.slice/crio-abbc3bc2da22503ea560376d404a1c4a890640a23ce2f38ed07dd39e25734e37 WatchSource:0}: Error finding container abbc3bc2da22503ea560376d404a1c4a890640a23ce2f38ed07dd39e25734e37: Status 404 returned error can't find the container with id abbc3bc2da22503ea560376d404a1c4a890640a23ce2f38ed07dd39e25734e37
Apr 28 19:16:15.029674 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:15.029650 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe6a7892c477bb0d3d9ed823ee3c3a21.slice/crio-ec17f1054828814b5cd7fc5b1a6ed1ba3509364aada76a52f484d395659d5ca3 WatchSource:0}: Error finding container ec17f1054828814b5cd7fc5b1a6ed1ba3509364aada76a52f484d395659d5ca3: Status 404 returned error can't find the container with id ec17f1054828814b5cd7fc5b1a6ed1ba3509364aada76a52f484d395659d5ca3
Apr 28 19:16:15.033364 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.033349 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:16:15.070216 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.070187 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:15.747866 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.747675 2570 apiserver.go:52] "Watching apiserver"
Apr 28 19:16:15.755250 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.755224 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 28 19:16:15.755772 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.755747 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-34.ec2.internal","openshift-image-registry/node-ca-nt8t9","openshift-multus/multus-additional-cni-plugins-g9c9n","openshift-network-diagnostics/network-check-target-nc646","openshift-ovn-kubernetes/ovnkube-node-7xms2","kube-system/konnectivity-agent-prm99","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw","openshift-cluster-node-tuning-operator/tuned-hkxr5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal","openshift-multus/multus-pppcc","openshift-multus/network-metrics-daemon-qgcjb","openshift-network-operator/iptables-alerter-c8f8g"]
Apr 28 19:16:15.758910 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.758888 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g9c9n"
Apr 28 19:16:15.761552 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.761524 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb"
Apr 28 19:16:15.761667 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:15.761632 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd"
Apr 28 19:16:15.763929 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.763907 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nt8t9"
Apr 28 19:16:15.765158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.765139 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 28 19:16:15.765158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.765162 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 28 19:16:15.765364 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.765281 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 28 19:16:15.765443 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.765426 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-lqs9h\""
Apr 28 19:16:15.765498 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.765457 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 28 19:16:15.765641 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.765601 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 28 19:16:15.766021 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.766003 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646"
Apr 28 19:16:15.766118 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:15.766067 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512"
Apr 28 19:16:15.767214 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.767190 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 28 19:16:15.768362 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.767391 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 28 19:16:15.768362 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.767456 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 28 19:16:15.768362 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.767487 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-plvh4\""
Apr 28 19:16:15.768626 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.768597 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.772672 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.770854 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-prm99"
Apr 28 19:16:15.772672 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.772340 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4xdc4\""
Apr 28 19:16:15.772672 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.772477 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 28 19:16:15.772672 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.772568 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 28 19:16:15.772903 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.772723 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 28 19:16:15.772903 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.772772 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 28 19:16:15.773600 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.773562 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 28 19:16:15.773760 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.773738 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw"
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.776166 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.776144 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wgwbw\"" Apr 28 19:16:15.776509 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.776490 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 28 19:16:15.776843 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.776826 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 28 19:16:15.777072 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.777051 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 28 19:16:15.779724 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.779454 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 28 19:16:15.779724 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.779485 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 28 19:16:15.779724 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.779513 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 28 19:16:15.779931 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.779744 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-m9bsx\"" Apr 28 19:16:15.782066 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782050 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.782270 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782247 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-device-dir\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.782351 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782288 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqjjh\" (UniqueName: \"kubernetes.io/projected/15eb3509-1e81-4a17-b03f-275a88d26015-kube-api-access-fqjjh\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.782351 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782315 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6knkp\" (UniqueName: \"kubernetes.io/projected/0b961ce3-ed85-40f4-840c-df0e74d830dd-kube-api-access-6knkp\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:15.782351 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782339 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/04deb9df-0bdd-4917-a722-41b25477b851-konnectivity-ca\") pod \"konnectivity-agent-prm99\" (UID: \"04deb9df-0bdd-4917-a722-41b25477b851\") " pod="kube-system/konnectivity-agent-prm99" Apr 28 19:16:15.782514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782363 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dtz2\" (UniqueName: \"kubernetes.io/projected/becc1ca0-0b4e-43d1-95d2-8979c6dd35fd-kube-api-access-2dtz2\") pod \"node-ca-nt8t9\" (UID: \"becc1ca0-0b4e-43d1-95d2-8979c6dd35fd\") " pod="openshift-image-registry/node-ca-nt8t9" Apr 28 19:16:15.782514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782387 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d508319-bdc2-4029-88e0-8b8406c4ac0b-cnibin\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.782514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782134 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.782514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782429 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-run-systemd\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.782514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782495 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-run-ovn\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.782839 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782526 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-node-log\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.782839 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782552 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-log-socket\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.782839 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782592 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d508319-bdc2-4029-88e0-8b8406c4ac0b-system-cni-dir\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.782839 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782639 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1d508319-bdc2-4029-88e0-8b8406c4ac0b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.782839 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782665 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26gwd\" (UniqueName: \"kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd\") pod \"network-check-target-nc646\" (UID: \"633bd943-3978-4baf-be3b-c82a70d85512\") " pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:15.782839 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782689 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/becc1ca0-0b4e-43d1-95d2-8979c6dd35fd-serviceca\") pod \"node-ca-nt8t9\" (UID: \"becc1ca0-0b4e-43d1-95d2-8979c6dd35fd\") " pod="openshift-image-registry/node-ca-nt8t9" Apr 28 19:16:15.782839 ip-10-0-138-34 kubenswrapper[2570]: I0428 
Apr 28 19:16:15.782839 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782741 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-slash\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.782839 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782772 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.782839 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782814 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-registration-dir\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782844 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-etc-selinux\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782869 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7bz2\" (UniqueName: \"kubernetes.io/projected/1d508319-bdc2-4029-88e0-8b8406c4ac0b-kube-api-access-k7bz2\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782904 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782940 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-var-lib-openvswitch\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.782986 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-etc-openvswitch\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783021 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-run-openvswitch\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783042 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21172191-de03-4932-85fe-40437ea0c56a-ovnkube-config\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783058 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21172191-de03-4932-85fe-40437ea0c56a-ovn-node-metrics-cert\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783107 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783152 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-run-netns\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783189 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d508319-bdc2-4029-88e0-8b8406c4ac0b-os-release\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783225 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d508319-bdc2-4029-88e0-8b8406c4ac0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783258 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/04deb9df-0bdd-4917-a722-41b25477b851-agent-certs\") pod \"konnectivity-agent-prm99\" (UID: \"04deb9df-0bdd-4917-a722-41b25477b851\") " pod="kube-system/konnectivity-agent-prm99"
Apr 28 19:16:15.783303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783293 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/becc1ca0-0b4e-43d1-95d2-8979c6dd35fd-host\") pod \"node-ca-nt8t9\" (UID: \"becc1ca0-0b4e-43d1-95d2-8979c6dd35fd\") " pod="openshift-image-registry/node-ca-nt8t9"
Apr 28 19:16:15.783851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783315 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21172191-de03-4932-85fe-40437ea0c56a-ovnkube-script-lib\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.783851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783338 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25rbc\" (UniqueName: \"kubernetes.io/projected/21172191-de03-4932-85fe-40437ea0c56a-kube-api-access-25rbc\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.783851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783360 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-socket-dir\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw"
Apr 28 19:16:15.783851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783390 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-sys-fs\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw"
Apr 28 19:16:15.783851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783414 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d508319-bdc2-4029-88e0-8b8406c4ac0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n"
Apr 28 19:16:15.783851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783441 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-cni-netd\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.783851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783465 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.783851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783489 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21172191-de03-4932-85fe-40437ea0c56a-env-overrides\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.783851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783513 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d508319-bdc2-4029-88e0-8b8406c4ac0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n"
Apr 28 19:16:15.783851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783545 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-systemd-units\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.783851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.783576 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-cni-bin\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:16:15.784426 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.784398 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c8f8g"
Need to start a new one" pod="openshift-network-operator/iptables-alerter-c8f8g" Apr 28 19:16:15.785032 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.784692 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pjj6k\"" Apr 28 19:16:15.785032 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.784908 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jc7mx\"" Apr 28 19:16:15.785491 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.785254 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:15.785491 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.785330 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 28 19:16:15.785491 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.785477 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:15.787741 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.787721 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:15.788750 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.788731 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-5nmfb\"" Apr 28 19:16:15.789049 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.789036 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 28 19:16:15.789245 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.789232 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:15.832277 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.832241 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:14 +0000 UTC" deadline="2028-01-24 03:58:34.6129136 +0000 UTC" Apr 28 19:16:15.832277 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.832275 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15248h42m18.780642209s" Apr 28 19:16:15.872013 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.871984 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 28 19:16:15.884481 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884430 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:15.884674 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884492 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21172191-de03-4932-85fe-40437ea0c56a-ovn-node-metrics-cert\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.884674 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884513 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d508319-bdc2-4029-88e0-8b8406c4ac0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.884674 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884542 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-run\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.884674 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884571 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-var-lib-cni-multus\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.884674 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884627 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-sysctl-conf\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.884674 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884652 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-var-lib-kubelet\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.884674 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884673 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-multus-conf-dir\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.885038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884696 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-run-multus-certs\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.885038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884718 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-etc-kubernetes\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.885038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884748 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/becc1ca0-0b4e-43d1-95d2-8979c6dd35fd-host\") pod \"node-ca-nt8t9\" (UID: \"becc1ca0-0b4e-43d1-95d2-8979c6dd35fd\") " pod="openshift-image-registry/node-ca-nt8t9" Apr 28 19:16:15.885038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884765 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25rbc\" (UniqueName: \"kubernetes.io/projected/21172191-de03-4932-85fe-40437ea0c56a-kube-api-access-25rbc\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.885038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884781 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-sys-fs\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.885038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884804 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-var-lib-kubelet\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.885038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884819 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-run-k8s-cni-cncf-io\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.885038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884832 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-var-lib-cni-bin\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.885038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-cni-netd\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.885038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884883 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.885038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.884933 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21172191-de03-4932-85fe-40437ea0c56a-env-overrides\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.885474 
ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.885411 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e61cde97-43d8-438a-abf3-9be15c9fb8d0-iptables-alerter-script\") pod \"iptables-alerter-c8f8g\" (UID: \"e61cde97-43d8-438a-abf3-9be15c9fb8d0\") " pod="openshift-network-operator/iptables-alerter-c8f8g" Apr 28 19:16:15.885528 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.885475 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-sysctl-d\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.885528 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.885510 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g74q\" (UniqueName: \"kubernetes.io/projected/106d173d-f66c-4772-a2c9-07f5c1dc8219-kube-api-access-5g74q\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.885656 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.885557 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e61cde97-43d8-438a-abf3-9be15c9fb8d0-host-slash\") pod \"iptables-alerter-c8f8g\" (UID: \"e61cde97-43d8-438a-abf3-9be15c9fb8d0\") " pod="openshift-network-operator/iptables-alerter-c8f8g" Apr 28 19:16:15.885656 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.885619 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/04deb9df-0bdd-4917-a722-41b25477b851-konnectivity-ca\") pod \"konnectivity-agent-prm99\" (UID: \"04deb9df-0bdd-4917-a722-41b25477b851\") " pod="kube-system/konnectivity-agent-prm99" Apr 28 19:16:15.885751 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.885664 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dtz2\" (UniqueName: \"kubernetes.io/projected/becc1ca0-0b4e-43d1-95d2-8979c6dd35fd-kube-api-access-2dtz2\") pod \"node-ca-nt8t9\" (UID: \"becc1ca0-0b4e-43d1-95d2-8979c6dd35fd\") " pod="openshift-image-registry/node-ca-nt8t9" Apr 28 19:16:15.885751 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.885700 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-host\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.885751 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.885744 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-sysconfig\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.885893 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.885777 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-cnibin\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.885893 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.885851 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-os-release\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.885979 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.885892 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-multus-socket-dir-parent\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.886036 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.885985 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/becc1ca0-0b4e-43d1-95d2-8979c6dd35fd-host\") pod \"node-ca-nt8t9\" (UID: \"becc1ca0-0b4e-43d1-95d2-8979c6dd35fd\") " pod="openshift-image-registry/node-ca-nt8t9" Apr 28 19:16:15.886036 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.886005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-cni-netd\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.886259 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.886228 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-sys-fs\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.887289 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.886446 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/04deb9df-0bdd-4917-a722-41b25477b851-konnectivity-ca\") pod \"konnectivity-agent-prm99\" (UID: \"04deb9df-0bdd-4917-a722-41b25477b851\") " pod="kube-system/konnectivity-agent-prm99" Apr 28 19:16:15.887289 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.886497 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.887289 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.886829 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcnt7\" (UniqueName: \"kubernetes.io/projected/e61cde97-43d8-438a-abf3-9be15c9fb8d0-kube-api-access-jcnt7\") pod \"iptables-alerter-c8f8g\" (UID: \"e61cde97-43d8-438a-abf3-9be15c9fb8d0\") " pod="openshift-network-operator/iptables-alerter-c8f8g" Apr 28 19:16:15.887289 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887106 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21172191-de03-4932-85fe-40437ea0c56a-env-overrides\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.887289 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887136 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 28 19:16:15.887577 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887309 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-modprobe-d\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.887577 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887353 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26gwd\" (UniqueName: \"kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd\") pod \"network-check-target-nc646\" (UID: \"633bd943-3978-4baf-be3b-c82a70d85512\") " pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:15.887577 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887385 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/becc1ca0-0b4e-43d1-95d2-8979c6dd35fd-serviceca\") pod \"node-ca-nt8t9\" (UID: \"becc1ca0-0b4e-43d1-95d2-8979c6dd35fd\") " pod="openshift-image-registry/node-ca-nt8t9" Apr 28 19:16:15.887577 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887411 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-kubelet\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.887577 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887444 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-slash\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.887577 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887479 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.887577 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887513 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7bz2\" (UniqueName: \"kubernetes.io/projected/1d508319-bdc2-4029-88e0-8b8406c4ac0b-kube-api-access-k7bz2\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.887577 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:15.887516 2570 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:15.887577 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887545 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-var-lib-openvswitch\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.887988 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887626 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-etc-openvswitch\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.887988 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887674 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-run-openvswitch\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.887988 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887702 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d508319-bdc2-4029-88e0-8b8406c4ac0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.887988 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887708 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21172191-de03-4932-85fe-40437ea0c56a-ovnkube-config\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.887988 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887768 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.887988 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d508319-bdc2-4029-88e0-8b8406c4ac0b-os-release\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.887988 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887919 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d508319-bdc2-4029-88e0-8b8406c4ac0b-os-release\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.887988 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887970 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-hostroot\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.887996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-run-netns\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888014 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-systemd\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888034 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7dgz\" (UniqueName: \"kubernetes.io/projected/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-kube-api-access-r7dgz\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888057 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/04deb9df-0bdd-4917-a722-41b25477b851-agent-certs\") pod \"konnectivity-agent-prm99\" (UID: \"04deb9df-0bdd-4917-a722-41b25477b851\") " pod="kube-system/konnectivity-agent-prm99" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888083 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21172191-de03-4932-85fe-40437ea0c56a-ovnkube-script-lib\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888112 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-socket-dir\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d508319-bdc2-4029-88e0-8b8406c4ac0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " 
pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888202 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-kubernetes\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888211 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21172191-de03-4932-85fe-40437ea0c56a-ovnkube-config\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888219 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-socket-dir\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888233 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d508319-bdc2-4029-88e0-8b8406c4ac0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888262 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-system-cni-dir\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888281 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888291 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/106d173d-f66c-4772-a2c9-07f5c1dc8219-cni-binary-copy\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.888325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.888331 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-run-netns\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.888998 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:15.888515 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs podName:0b961ce3-ed85-40f4-840c-df0e74d830dd nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.388472681 +0000 UTC m=+3.140022173 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs") pod "network-metrics-daemon-qgcjb" (UID: "0b961ce3-ed85-40f4-840c-df0e74d830dd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:15.889162 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889132 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d508319-bdc2-4029-88e0-8b8406c4ac0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.889162 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889152 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/becc1ca0-0b4e-43d1-95d2-8979c6dd35fd-serviceca\") pod \"node-ca-nt8t9\" (UID: \"becc1ca0-0b4e-43d1-95d2-8979c6dd35fd\") " pod="openshift-image-registry/node-ca-nt8t9" Apr 28 19:16:15.889303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889205 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d508319-bdc2-4029-88e0-8b8406c4ac0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.889303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889293 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-kubelet\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.889363 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889345 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-slash\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.889363 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889302 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-34.ec2.internal" event={"ID":"b985e5bf011f3a2cbd34f60280660243","Type":"ContainerStarted","Data":"abbc3bc2da22503ea560376d404a1c4a890640a23ce2f38ed07dd39e25734e37"} Apr 28 19:16:15.889421 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889403 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-var-lib-openvswitch\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.889421 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/106d173d-f66c-4772-a2c9-07f5c1dc8219-multus-daemon-config\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.889481 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889447 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-etc-openvswitch\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.889481 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889449 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-systemd-units\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.889577 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889505 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-systemd-units\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.889577 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889501 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-run-openvswitch\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.889695 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889577 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-cni-bin\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.889695 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889654 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-device-dir\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.889790 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889751 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqjjh\" (UniqueName: \"kubernetes.io/projected/15eb3509-1e81-4a17-b03f-275a88d26015-kube-api-access-fqjjh\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.889839 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889810 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-host-cni-bin\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.889890 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889855 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-device-dir\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.889944 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889892 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-tmp\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.889944 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889935 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6knkp\" (UniqueName: \"kubernetes.io/projected/0b961ce3-ed85-40f4-840c-df0e74d830dd-kube-api-access-6knkp\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:15.890038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.889989 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d508319-bdc2-4029-88e0-8b8406c4ac0b-cnibin\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.890038 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890023 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-sys\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.890128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890073 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-lib-modules\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.890128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890102 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-run-netns\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.890287 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890260 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21172191-de03-4932-85fe-40437ea0c56a-ovnkube-script-lib\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.890338 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890308 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-run-systemd\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 
28 19:16:15.890388 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890339 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-run-ovn\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.890388 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890353 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d508319-bdc2-4029-88e0-8b8406c4ac0b-cnibin\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.890388 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890366 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-node-log\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.890504 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890418 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-node-log\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.890504 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890439 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-run-systemd\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.890504 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890475 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-log-socket\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.890504 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890474 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-run-ovn\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.890656 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890525 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d508319-bdc2-4029-88e0-8b8406c4ac0b-system-cni-dir\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.890656 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890541 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21172191-de03-4932-85fe-40437ea0c56a-log-socket\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 
19:16:15.890656 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890553 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1d508319-bdc2-4029-88e0-8b8406c4ac0b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.890656 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890592 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d508319-bdc2-4029-88e0-8b8406c4ac0b-system-cni-dir\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.890656 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890622 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-registration-dir\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.890890 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890682 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-registration-dir\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.890890 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890745 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-etc-selinux\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.890890 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890777 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-tuned\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.890890 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890817 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-multus-cni-dir\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.891153 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.890946 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/15eb3509-1e81-4a17-b03f-275a88d26015-etc-selinux\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.891153 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.891078 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1d508319-bdc2-4029-88e0-8b8406c4ac0b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.892784 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.892724 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:15.893490 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.893425 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21172191-de03-4932-85fe-40437ea0c56a-ovn-node-metrics-cert\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.894173 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.894124 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal" event={"ID":"fe6a7892c477bb0d3d9ed823ee3c3a21","Type":"ContainerStarted","Data":"ec17f1054828814b5cd7fc5b1a6ed1ba3509364aada76a52f484d395659d5ca3"} Apr 28 19:16:15.895392 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.895370 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/04deb9df-0bdd-4917-a722-41b25477b851-agent-certs\") pod \"konnectivity-agent-prm99\" (UID: \"04deb9df-0bdd-4917-a722-41b25477b851\") " pod="kube-system/konnectivity-agent-prm99" Apr 28 19:16:15.903553 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.903311 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25rbc\" (UniqueName: \"kubernetes.io/projected/21172191-de03-4932-85fe-40437ea0c56a-kube-api-access-25rbc\") pod \"ovnkube-node-7xms2\" (UID: \"21172191-de03-4932-85fe-40437ea0c56a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:15.907711 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.907690 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dtz2\" (UniqueName: \"kubernetes.io/projected/becc1ca0-0b4e-43d1-95d2-8979c6dd35fd-kube-api-access-2dtz2\") pod \"node-ca-nt8t9\" (UID: \"becc1ca0-0b4e-43d1-95d2-8979c6dd35fd\") " pod="openshift-image-registry/node-ca-nt8t9" Apr 28 19:16:15.908236 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.908213 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6knkp\" (UniqueName: \"kubernetes.io/projected/0b961ce3-ed85-40f4-840c-df0e74d830dd-kube-api-access-6knkp\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:15.909436 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:15.909415 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:15.909517 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:15.909441 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:15.909517 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:15.909454 2570 projected.go:194] Error preparing data for projected volume 
kube-api-access-26gwd for pod openshift-network-diagnostics/network-check-target-nc646: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:15.909665 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:15.909516 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd podName:633bd943-3978-4baf-be3b-c82a70d85512 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.409498432 +0000 UTC m=+3.161047947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-26gwd" (UniqueName: "kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd") pod "network-check-target-nc646" (UID: "633bd943-3978-4baf-be3b-c82a70d85512") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:15.911729 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.911704 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqjjh\" (UniqueName: \"kubernetes.io/projected/15eb3509-1e81-4a17-b03f-275a88d26015-kube-api-access-fqjjh\") pod \"aws-ebs-csi-driver-node-xkkkw\" (UID: \"15eb3509-1e81-4a17-b03f-275a88d26015\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:15.911872 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.911852 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7bz2\" (UniqueName: \"kubernetes.io/projected/1d508319-bdc2-4029-88e0-8b8406c4ac0b-kube-api-access-k7bz2\") pod \"multus-additional-cni-plugins-g9c9n\" (UID: \"1d508319-bdc2-4029-88e0-8b8406c4ac0b\") " pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:15.991131 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991094 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/106d173d-f66c-4772-a2c9-07f5c1dc8219-cni-binary-copy\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.991302 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991138 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/106d173d-f66c-4772-a2c9-07f5c1dc8219-multus-daemon-config\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.991302 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-tmp\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.991729 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991703 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/106d173d-f66c-4772-a2c9-07f5c1dc8219-cni-binary-copy\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.991837 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991749 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-sys\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.991837 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-lib-modules\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.991837 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991758 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/106d173d-f66c-4772-a2c9-07f5c1dc8219-multus-daemon-config\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.991837 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991817 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-sys\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.992031 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991857 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-run-netns\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992031 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991891 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-tuned\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.992031 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991917 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-multus-cni-dir\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992031 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991932 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-run-netns\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992031 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991935 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-lib-modules\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.992031 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991958 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-run\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.992031 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.991983 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-var-lib-cni-multus\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992031 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992006 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-multus-cni-dir\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992031 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992008 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-sysctl-conf\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992057 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-var-lib-kubelet\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992066 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-run\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992084 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-multus-conf-dir\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992108 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-run-multus-certs\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-var-lib-kubelet\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992134 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-etc-kubernetes\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992139 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-sysctl-conf\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992160 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-var-lib-cni-multus\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992161 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-var-lib-kubelet\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992188 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-run-multus-certs\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992203 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-run-k8s-cni-cncf-io\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992226 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-multus-conf-dir\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992229 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-var-lib-cni-bin\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992257 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-var-lib-kubelet\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992272 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-var-lib-cni-bin\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992273 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e61cde97-43d8-438a-abf3-9be15c9fb8d0-iptables-alerter-script\") pod \"iptables-alerter-c8f8g\" (UID: \"e61cde97-43d8-438a-abf3-9be15c9fb8d0\") " pod="openshift-network-operator/iptables-alerter-c8f8g" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992308 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-host-run-k8s-cni-cncf-io\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.992459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992311 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-sysctl-d\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992349 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g74q\" (UniqueName: \"kubernetes.io/projected/106d173d-f66c-4772-a2c9-07f5c1dc8219-kube-api-access-5g74q\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992373 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e61cde97-43d8-438a-abf3-9be15c9fb8d0-host-slash\") pod \"iptables-alerter-c8f8g\" (UID: \"e61cde97-43d8-438a-abf3-9be15c9fb8d0\") " pod="openshift-network-operator/iptables-alerter-c8f8g" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992396 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-host\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992419 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-sysconfig\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992438 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-sysctl-d\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992442 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-cnibin\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992467 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-os-release\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992491 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-multus-socket-dir-parent\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992484 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e61cde97-43d8-438a-abf3-9be15c9fb8d0-host-slash\") pod \"iptables-alerter-c8f8g\" (UID: \"e61cde97-43d8-438a-abf3-9be15c9fb8d0\") " pod="openshift-network-operator/iptables-alerter-c8f8g" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992525 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcnt7\" (UniqueName: \"kubernetes.io/projected/e61cde97-43d8-438a-abf3-9be15c9fb8d0-kube-api-access-jcnt7\") pod \"iptables-alerter-c8f8g\" (UID: \"e61cde97-43d8-438a-abf3-9be15c9fb8d0\") " pod="openshift-network-operator/iptables-alerter-c8f8g" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992203 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-etc-kubernetes\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992553 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-modprobe-d\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992556 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-multus-socket-dir-parent\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992630 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-host\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992632 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-modprobe-d\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992650 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-hostroot\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992681 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-sysconfig\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993208 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992685 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-systemd\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993730 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992714 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7dgz\" (UniqueName: \"kubernetes.io/projected/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-kube-api-access-r7dgz\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993730 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992745 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-os-release\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.993730 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992746 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-kubernetes\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993730 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992760 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e61cde97-43d8-438a-abf3-9be15c9fb8d0-iptables-alerter-script\") pod \"iptables-alerter-c8f8g\" (UID: \"e61cde97-43d8-438a-abf3-9be15c9fb8d0\") " pod="openshift-network-operator/iptables-alerter-c8f8g" Apr 28 19:16:15.993730 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992786 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-system-cni-dir\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.993730 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-kubernetes\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993730 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992827 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-systemd\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.993730 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992864 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-system-cni-dir\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.993730 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992867 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-hostroot\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.993730 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.992909 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/106d173d-f66c-4772-a2c9-07f5c1dc8219-cnibin\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:15.994160 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.994137 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-etc-tuned\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:15.994270 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:15.994184 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-tmp\") pod \"tuned-hkxr5\" (UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:16.007474 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.007447 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcnt7\" (UniqueName: \"kubernetes.io/projected/e61cde97-43d8-438a-abf3-9be15c9fb8d0-kube-api-access-jcnt7\") pod \"iptables-alerter-c8f8g\" (UID: \"e61cde97-43d8-438a-abf3-9be15c9fb8d0\") " pod="openshift-network-operator/iptables-alerter-c8f8g" Apr 28 19:16:16.007676 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.007520 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g74q\" (UniqueName: \"kubernetes.io/projected/106d173d-f66c-4772-a2c9-07f5c1dc8219-kube-api-access-5g74q\") pod \"multus-pppcc\" (UID: \"106d173d-f66c-4772-a2c9-07f5c1dc8219\") " pod="openshift-multus/multus-pppcc" Apr 28 19:16:16.007888 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.007866 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7dgz\" (UniqueName: \"kubernetes.io/projected/9f33e20d-a27b-40b9-8c35-f0b6fe5302f4-kube-api-access-r7dgz\") pod \"tuned-hkxr5\" 
(UID: \"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4\") " pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:16.072780 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.072738 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g9c9n" Apr 28 19:16:16.082573 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.082550 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nt8t9" Apr 28 19:16:16.092343 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.092308 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:16.098124 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.098101 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-prm99" Apr 28 19:16:16.105726 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.105706 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" Apr 28 19:16:16.113325 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.113297 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pppcc" Apr 28 19:16:16.121013 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.120990 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" Apr 28 19:16:16.127618 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.127589 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c8f8g" Apr 28 19:16:16.395745 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.395672 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:16.395884 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:16.395807 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:16.395940 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:16.395909 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs podName:0b961ce3-ed85-40f4-840c-df0e74d830dd nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.395886554 +0000 UTC m=+4.147436048 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs") pod "network-metrics-daemon-qgcjb" (UID: "0b961ce3-ed85-40f4-840c-df0e74d830dd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:16.496650 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.496600 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26gwd\" (UniqueName: \"kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd\") pod \"network-check-target-nc646\" (UID: \"633bd943-3978-4baf-be3b-c82a70d85512\") " pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:16.496819 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:16.496771 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:16.496819 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:16.496797 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:16.496819 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:16.496810 2570 projected.go:194] Error preparing data for projected volume kube-api-access-26gwd for pod openshift-network-diagnostics/network-check-target-nc646: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:16.496982 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:16.496875 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd podName:633bd943-3978-4baf-be3b-c82a70d85512 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.496855489 +0000 UTC m=+4.248404999 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-26gwd" (UniqueName: "kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd") pod "network-check-target-nc646" (UID: "633bd943-3978-4baf-be3b-c82a70d85512") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:16.726117 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:16.726088 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod106d173d_f66c_4772_a2c9_07f5c1dc8219.slice/crio-711844d6542498f2eb06ad4bb1cb1ad871ca9b6c43acba2ff5f791132e61fe96 WatchSource:0}: Error finding container 711844d6542498f2eb06ad4bb1cb1ad871ca9b6c43acba2ff5f791132e61fe96: Status 404 returned error can't find the container with id 711844d6542498f2eb06ad4bb1cb1ad871ca9b6c43acba2ff5f791132e61fe96 Apr 28 19:16:16.727636 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:16.727593 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21172191_de03_4932_85fe_40437ea0c56a.slice/crio-465ceb83792ca9734699bbd5ec55d6c2551f5b2e78c2e61ca23225b582622ae6 WatchSource:0}: Error finding container 465ceb83792ca9734699bbd5ec55d6c2551f5b2e78c2e61ca23225b582622ae6: Status 404 returned error can't find the container with id 465ceb83792ca9734699bbd5ec55d6c2551f5b2e78c2e61ca23225b582622ae6 Apr 28 19:16:16.728536 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:16.728507 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15eb3509_1e81_4a17_b03f_275a88d26015.slice/crio-209c8486e70b4d6e24dd01e96cec945df832d2babbdec5c852b1d7f3b55ff479 WatchSource:0}: Error finding container 209c8486e70b4d6e24dd01e96cec945df832d2babbdec5c852b1d7f3b55ff479: Status 404 returned error can't find the container with id 209c8486e70b4d6e24dd01e96cec945df832d2babbdec5c852b1d7f3b55ff479 Apr 28 19:16:16.731004 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:16.730966 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f33e20d_a27b_40b9_8c35_f0b6fe5302f4.slice/crio-ffd0fc86ecfbeb12eeb88bec9277ac96dacf7a8f3df2b8cbf6daf63724e3e0b8 WatchSource:0}: Error finding container ffd0fc86ecfbeb12eeb88bec9277ac96dacf7a8f3df2b8cbf6daf63724e3e0b8: Status 404 returned error can't find the container with id ffd0fc86ecfbeb12eeb88bec9277ac96dacf7a8f3df2b8cbf6daf63724e3e0b8 Apr 28 19:16:16.731859 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:16.731836 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbecc1ca0_0b4e_43d1_95d2_8979c6dd35fd.slice/crio-1c21d74a7345c431e399cb184595396ce8380def574b81748509228d2e089bb6 WatchSource:0}: Error finding container 1c21d74a7345c431e399cb184595396ce8380def574b81748509228d2e089bb6: Status 404 returned error can't find the container with id 1c21d74a7345c431e399cb184595396ce8380def574b81748509228d2e089bb6 Apr 28 19:16:16.732693 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:16.732660 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode61cde97_43d8_438a_abf3_9be15c9fb8d0.slice/crio-a32c566e023960c8a490a46f4ffc12da4d3feca54f77857761cda9cad0e97497 WatchSource:0}: Error finding 
container a32c566e023960c8a490a46f4ffc12da4d3feca54f77857761cda9cad0e97497: Status 404 returned error can't find the container with id a32c566e023960c8a490a46f4ffc12da4d3feca54f77857761cda9cad0e97497 Apr 28 19:16:16.734595 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:16.733866 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04deb9df_0bdd_4917_a722_41b25477b851.slice/crio-fd154327a606cb241c87b3320f206cc3f53ad66dfd851cf79d3c62be28bb0cdf WatchSource:0}: Error finding container fd154327a606cb241c87b3320f206cc3f53ad66dfd851cf79d3c62be28bb0cdf: Status 404 returned error can't find the container with id fd154327a606cb241c87b3320f206cc3f53ad66dfd851cf79d3c62be28bb0cdf Apr 28 19:16:16.833137 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.833106 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:14 +0000 UTC" deadline="2027-11-28 06:31:59.341888566 +0000 UTC" Apr 28 19:16:16.833137 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.833133 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13883h15m42.508758003s" Apr 28 19:16:16.900036 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.899978 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-34.ec2.internal" event={"ID":"b985e5bf011f3a2cbd34f60280660243","Type":"ContainerStarted","Data":"3bb2852d39453787fa85fe075c8833430ba285c46f8c2b0ed3e4b779731fb7c9"} Apr 28 19:16:16.901023 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.900997 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g9c9n" event={"ID":"1d508319-bdc2-4029-88e0-8b8406c4ac0b","Type":"ContainerStarted","Data":"1835ab3bfa6a3a73651c4f429557eef57f46ef5bd83d896e76ef3c0bb9eb7255"} Apr 28 19:16:16.901977 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.901956 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" event={"ID":"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4","Type":"ContainerStarted","Data":"ffd0fc86ecfbeb12eeb88bec9277ac96dacf7a8f3df2b8cbf6daf63724e3e0b8"} Apr 28 19:16:16.902863 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.902833 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" event={"ID":"15eb3509-1e81-4a17-b03f-275a88d26015","Type":"ContainerStarted","Data":"209c8486e70b4d6e24dd01e96cec945df832d2babbdec5c852b1d7f3b55ff479"} Apr 28 19:16:16.903801 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.903779 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" event={"ID":"21172191-de03-4932-85fe-40437ea0c56a","Type":"ContainerStarted","Data":"465ceb83792ca9734699bbd5ec55d6c2551f5b2e78c2e61ca23225b582622ae6"} Apr 28 19:16:16.904676 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.904650 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nt8t9" event={"ID":"becc1ca0-0b4e-43d1-95d2-8979c6dd35fd","Type":"ContainerStarted","Data":"1c21d74a7345c431e399cb184595396ce8380def574b81748509228d2e089bb6"} Apr 28 19:16:16.905560 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.905502 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-prm99" 
event={"ID":"04deb9df-0bdd-4917-a722-41b25477b851","Type":"ContainerStarted","Data":"fd154327a606cb241c87b3320f206cc3f53ad66dfd851cf79d3c62be28bb0cdf"} Apr 28 19:16:16.906399 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.906376 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c8f8g" event={"ID":"e61cde97-43d8-438a-abf3-9be15c9fb8d0","Type":"ContainerStarted","Data":"a32c566e023960c8a490a46f4ffc12da4d3feca54f77857761cda9cad0e97497"} Apr 28 19:16:16.907214 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:16.907196 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pppcc" event={"ID":"106d173d-f66c-4772-a2c9-07f5c1dc8219","Type":"ContainerStarted","Data":"711844d6542498f2eb06ad4bb1cb1ad871ca9b6c43acba2ff5f791132e61fe96"} Apr 28 19:16:17.404841 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:17.404133 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:17.404841 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:17.404310 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:17.404841 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:17.404382 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs podName:0b961ce3-ed85-40f4-840c-df0e74d830dd nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.404360659 +0000 UTC m=+6.155910155 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs") pod "network-metrics-daemon-qgcjb" (UID: "0b961ce3-ed85-40f4-840c-df0e74d830dd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:17.504833 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:17.504800 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26gwd\" (UniqueName: \"kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd\") pod \"network-check-target-nc646\" (UID: \"633bd943-3978-4baf-be3b-c82a70d85512\") " pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:17.504995 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:17.504976 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:17.504995 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:17.504995 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:17.505100 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:17.505008 2570 projected.go:194] Error preparing data for projected volume kube-api-access-26gwd for pod openshift-network-diagnostics/network-check-target-nc646: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:17.505100 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:17.505064 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd podName:633bd943-3978-4baf-be3b-c82a70d85512 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.505047154 +0000 UTC m=+6.256596650 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-26gwd" (UniqueName: "kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd") pod "network-check-target-nc646" (UID: "633bd943-3978-4baf-be3b-c82a70d85512") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:17.886588 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:17.883648 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:17.886588 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:17.883775 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:17.886588 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:17.884158 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:17.886588 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:17.884273 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:17.926590 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:17.926555 2570 generic.go:358] "Generic (PLEG): container finished" podID="fe6a7892c477bb0d3d9ed823ee3c3a21" containerID="18a2e50705fc39c879069f0d988a1daccd3281de2ded9893fea03d02d6bf7934" exitCode=0 Apr 28 19:16:17.927441 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:17.927417 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal" event={"ID":"fe6a7892c477bb0d3d9ed823ee3c3a21","Type":"ContainerDied","Data":"18a2e50705fc39c879069f0d988a1daccd3281de2ded9893fea03d02d6bf7934"} Apr 28 19:16:17.943858 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:17.943070 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-34.ec2.internal" podStartSLOduration=3.943052608 podStartE2EDuration="3.943052608s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:16.915938291 +0000 UTC m=+3.667487831" watchObservedRunningTime="2026-04-28 19:16:17.943052608 +0000 UTC m=+4.694602126" Apr 28 19:16:18.932429 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:18.931751 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal" event={"ID":"fe6a7892c477bb0d3d9ed823ee3c3a21","Type":"ContainerStarted","Data":"0d5e8d24e65e438edea808f0f3c12746acfdb2d55cad4460a56fc6f2b7158191"} Apr 28 19:16:19.420798 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:19.420230 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:19.420798 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:19.420379 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:19.420798 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:19.420450 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs podName:0b961ce3-ed85-40f4-840c-df0e74d830dd nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.420430637 +0000 UTC m=+10.171980143 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs") pod "network-metrics-daemon-qgcjb" (UID: "0b961ce3-ed85-40f4-840c-df0e74d830dd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:19.521902 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:19.521249 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26gwd\" (UniqueName: \"kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd\") pod \"network-check-target-nc646\" (UID: \"633bd943-3978-4baf-be3b-c82a70d85512\") " pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:19.521902 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:19.521447 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:19.521902 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:19.521468 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:19.521902 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:19.521482 2570 projected.go:194] Error preparing data for projected volume kube-api-access-26gwd for pod openshift-network-diagnostics/network-check-target-nc646: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:19.521902 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:19.521540 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd podName:633bd943-3978-4baf-be3b-c82a70d85512 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.521522709 +0000 UTC m=+10.273072222 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-26gwd" (UniqueName: "kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd") pod "network-check-target-nc646" (UID: "633bd943-3978-4baf-be3b-c82a70d85512") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:19.883257 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:19.882951 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:19.883257 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:19.883093 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:19.883469 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:19.883440 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:19.883568 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:19.883545 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:21.484179 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.484008 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-34.ec2.internal" podStartSLOduration=7.483988728 podStartE2EDuration="7.483988728s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:18.948448982 +0000 UTC m=+5.699998498" watchObservedRunningTime="2026-04-28 19:16:21.483988728 +0000 UTC m=+8.235538244" Apr 28 19:16:21.484689 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.484562 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-z27h7"] Apr 28 19:16:21.491165 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.491139 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:21.491303 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:21.491234 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:21.536037 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.535818 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d156b1b1-0d6c-49bf-b188-773ff892fbd2-kubelet-config\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:21.536037 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.535864 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d156b1b1-0d6c-49bf-b188-773ff892fbd2-dbus\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:21.536037 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.535951 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:21.636331 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.636292 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d156b1b1-0d6c-49bf-b188-773ff892fbd2-kubelet-config\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:21.636498 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.636342 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d156b1b1-0d6c-49bf-b188-773ff892fbd2-dbus\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:21.636498 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.636429 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:21.636596 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:21.636563 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:21.636672 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.636591 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d156b1b1-0d6c-49bf-b188-773ff892fbd2-kubelet-config\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:21.636672 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:21.636656 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret podName:d156b1b1-0d6c-49bf-b188-773ff892fbd2 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:22.136637623 +0000 UTC m=+8.888187135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret") pod "global-pull-secret-syncer-z27h7" (UID: "d156b1b1-0d6c-49bf-b188-773ff892fbd2") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:21.636772 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.636727 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d156b1b1-0d6c-49bf-b188-773ff892fbd2-dbus\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:21.886803 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.886329 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:21.886803 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:21.886353 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:21.886803 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:21.886448 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:21.886803 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:21.886649 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:22.141089 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:22.140995 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:22.141262 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:22.141147 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:22.141262 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:22.141223 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret podName:d156b1b1-0d6c-49bf-b188-773ff892fbd2 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.141202877 +0000 UTC m=+9.892752379 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret") pod "global-pull-secret-syncer-z27h7" (UID: "d156b1b1-0d6c-49bf-b188-773ff892fbd2") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:23.150442 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:23.149847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:23.150442 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:23.149975 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:23.150442 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:23.150112 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret podName:d156b1b1-0d6c-49bf-b188-773ff892fbd2 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:25.150065079 +0000 UTC m=+11.901614578 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret") pod "global-pull-secret-syncer-z27h7" (UID: "d156b1b1-0d6c-49bf-b188-773ff892fbd2") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:23.452290 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:23.452188 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:23.452430 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:23.452322 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:23.452430 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:23.452408 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs podName:0b961ce3-ed85-40f4-840c-df0e74d830dd nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.452387047 +0000 UTC m=+18.203936554 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs") pod "network-metrics-daemon-qgcjb" (UID: "0b961ce3-ed85-40f4-840c-df0e74d830dd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:23.552994 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:23.552953 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26gwd\" (UniqueName: \"kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd\") pod \"network-check-target-nc646\" (UID: \"633bd943-3978-4baf-be3b-c82a70d85512\") " pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:23.553224 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:23.553203 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:23.553295 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:23.553231 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:23.553295 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:23.553244 2570 projected.go:194] Error preparing data for projected volume kube-api-access-26gwd for pod openshift-network-diagnostics/network-check-target-nc646: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:23.553380 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:23.553310 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd podName:633bd943-3978-4baf-be3b-c82a70d85512 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.553291581 +0000 UTC m=+18.304841080 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-26gwd" (UniqueName: "kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd") pod "network-check-target-nc646" (UID: "633bd943-3978-4baf-be3b-c82a70d85512") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:23.884068 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:23.883493 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:23.884068 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:23.883641 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:23.884068 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:23.883811 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:23.884068 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:23.883906 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:23.884068 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:23.883994 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:23.884997 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:23.884926 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:25.166113 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:25.166063 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:25.166583 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:25.166212 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:25.166583 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:25.166288 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret podName:d156b1b1-0d6c-49bf-b188-773ff892fbd2 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:29.166267791 +0000 UTC m=+15.917817290 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret") pod "global-pull-secret-syncer-z27h7" (UID: "d156b1b1-0d6c-49bf-b188-773ff892fbd2") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:25.886008 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:25.885976 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:25.886202 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:25.885982 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:25.886202 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:25.886091 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:25.886202 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:25.885990 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:25.886202 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:25.886177 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:25.886371 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:25.886228 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:27.885454 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:27.885422 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:27.885454 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:27.885436 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:27.885951 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:27.885545 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:27.885951 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:27.885634 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:27.885951 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:27.885654 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:27.885951 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:27.885732 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:29.195233 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:29.195148 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:29.195639 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:29.195295 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:29.195639 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:29.195365 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret podName:d156b1b1-0d6c-49bf-b188-773ff892fbd2 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:37.195351329 +0000 UTC m=+23.946900825 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret") pod "global-pull-secret-syncer-z27h7" (UID: "d156b1b1-0d6c-49bf-b188-773ff892fbd2") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:29.883130 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:29.883093 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:29.883130 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:29.883122 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:29.883365 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:29.883226 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:29.883365 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:29.883314 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:29.883472 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:29.883368 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:29.883472 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:29.883437 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:31.513527 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:31.513486 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:31.513935 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:31.513621 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:31.513935 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:31.513681 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs podName:0b961ce3-ed85-40f4-840c-df0e74d830dd nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.513664571 +0000 UTC m=+34.265214079 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs") pod "network-metrics-daemon-qgcjb" (UID: "0b961ce3-ed85-40f4-840c-df0e74d830dd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:31.614493 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:31.614446 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26gwd\" (UniqueName: \"kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd\") pod \"network-check-target-nc646\" (UID: \"633bd943-3978-4baf-be3b-c82a70d85512\") " pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:31.614718 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:31.614664 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:31.614718 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:31.614692 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:31.614718 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:31.614706 2570 projected.go:194] Error preparing data for projected volume kube-api-access-26gwd for pod openshift-network-diagnostics/network-check-target-nc646: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:31.614859 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:31.614770 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd podName:633bd943-3978-4baf-be3b-c82a70d85512 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.6147525 +0000 UTC m=+34.366302005 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-26gwd" (UniqueName: "kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd") pod "network-check-target-nc646" (UID: "633bd943-3978-4baf-be3b-c82a70d85512") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:31.883670 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:31.883577 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:31.883819 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:31.883577 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:31.883819 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:31.883721 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:31.883946 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:31.883577 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:31.883946 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:31.883826 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:31.884046 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:31.883958 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:33.883591 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:33.883426 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:33.884182 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:33.883489 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:33.884182 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:33.883506 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:33.884182 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:33.883723 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
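Alongside the CNI loop, the MountVolume.SetUp failures above show the operation executor's per-volume exponential backoff: original-pull-secret was deferred 8s (until 19:16:37), and on this pass metrics-certs and kube-api-access-26gwd are deferred 16s (until 19:16:47). The "object ... not registered" errors mean the kubelet has not yet populated its per-pod watches for those secrets and configmaps; the mounts succeed later, once the "Caches populated" reflector entries appear. A minimal sketch of the doubling pattern in the durationBeforeRetry values; the initial delay and cap are illustrative assumptions, not the kubelet's actual constants:

```go
package main

import (
	"fmt"
	"time"
)

// backoff models the doubling visible in the durationBeforeRetry values
// above (8s, then 16s): each failed attempt doubles the delay up to a cap.
type backoff struct {
	delay, cap time.Duration
}

func (b *backoff) next() time.Duration {
	d := b.delay
	if b.delay *= 2; b.delay > b.cap {
		b.delay = b.cap
	}
	return d
}

func main() {
	b := &backoff{delay: 500 * time.Millisecond, cap: 2 * time.Minute}
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, b.next())
	}
}
```

With these assumed constants the delay reaches 8s on the fifth attempt and 16s on the sixth, consistent with the spacing of the retries logged here.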
pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:33.884182 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:33.883808 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:33.884182 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:33.883896 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:33.964390 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:33.964182 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" event={"ID":"21172191-de03-4932-85fe-40437ea0c56a","Type":"ContainerStarted","Data":"d35f0c6ed820f1b60344c9a422878f16fce3a7caed0d03fa349b4ec63b2b608f"} Apr 28 19:16:33.964490 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:33.964409 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" event={"ID":"21172191-de03-4932-85fe-40437ea0c56a","Type":"ContainerStarted","Data":"2f94e16203fc5bb09b2e09e24d5b42177b48bc09c4b95bfe4effca5856ae2d2e"} Apr 28 19:16:33.964490 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:33.964430 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" event={"ID":"21172191-de03-4932-85fe-40437ea0c56a","Type":"ContainerStarted","Data":"31b4e2a23476334aa5281507f84f45fc4a21f4e19815ef156a3edaa7f4d24fdb"} Apr 28 19:16:33.965350 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:33.965327 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nt8t9" event={"ID":"becc1ca0-0b4e-43d1-95d2-8979c6dd35fd","Type":"ContainerStarted","Data":"187fd8f14f04ea6083fa49e0ca0f218f35dd1f518013c03ade36d2d48a38931e"} Apr 28 19:16:33.966516 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:33.966468 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-prm99" event={"ID":"04deb9df-0bdd-4917-a722-41b25477b851","Type":"ContainerStarted","Data":"1b4aa1a7f4c715f082991c00dd01610043d889bc443fa4bca0470056729ba62a"} Apr 28 19:16:33.967856 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:33.967833 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pppcc" event={"ID":"106d173d-f66c-4772-a2c9-07f5c1dc8219","Type":"ContainerStarted","Data":"2ea004b1455abbecd13770230215434b89fefde96ee00a026dd956f2ba32e136"} Apr 28 19:16:33.969172 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:33.969148 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g9c9n" event={"ID":"1d508319-bdc2-4029-88e0-8b8406c4ac0b","Type":"ContainerStarted","Data":"17a5eee37710bdc6eaf1b5eac6507d3613d5720884dfe2312fb6f6171a63d72d"} Apr 28 19:16:33.970212 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:33.970192 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" event={"ID":"9f33e20d-a27b-40b9-8c35-f0b6fe5302f4","Type":"ContainerStarted","Data":"f68461da08ad8acd79a55da35e147e6c9ed02bc18fc130a21db40b0b78e629f3"} Apr 28 19:16:33.971311 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:33.971293 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" event={"ID":"15eb3509-1e81-4a17-b03f-275a88d26015","Type":"ContainerStarted","Data":"f324ccf64711d9b1905e8973800ba98b86662a0c247ec0878cbd775f62492eed"} Apr 28 19:16:34.000874 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:34.000828 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hkxr5" podStartSLOduration=3.187165576 podStartE2EDuration="20.000812907s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.73285785 +0000 UTC m=+3.484407348" lastFinishedPulling="2026-04-28 19:16:33.546505172 +0000 UTC m=+20.298054679" observedRunningTime="2026-04-28 19:16:34.000797825 +0000 UTC m=+20.752347360" watchObservedRunningTime="2026-04-28 19:16:34.000812907 +0000 UTC m=+20.752362422" Apr 28 19:16:34.001095 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:34.001075 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nt8t9" podStartSLOduration=3.188815167 podStartE2EDuration="20.001069802s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.734344114 +0000 UTC m=+3.485893618" lastFinishedPulling="2026-04-28 19:16:33.546598754 +0000 UTC m=+20.298148253" observedRunningTime="2026-04-28 19:16:33.985107926 +0000 UTC m=+20.736657442" watchObservedRunningTime="2026-04-28 19:16:34.001069802 +0000 UTC m=+20.752619318" Apr 28 19:16:34.017734 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:34.017691 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pppcc" podStartSLOduration=2.885658494 podStartE2EDuration="20.017674465s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.727909725 +0000 UTC m=+3.479459218" lastFinishedPulling="2026-04-28 19:16:33.859925693 +0000 UTC m=+20.611475189" observedRunningTime="2026-04-28 19:16:34.017490808 +0000 UTC m=+20.769040323" watchObservedRunningTime="2026-04-28 19:16:34.017674465 +0000 UTC m=+20.769223979" Apr 28 19:16:34.033013 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:34.032967 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-prm99" podStartSLOduration=3.247858077 podStartE2EDuration="20.03295353s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.737113236 +0000 UTC m=+3.488662743" lastFinishedPulling="2026-04-28 19:16:33.522208703 +0000 UTC m=+20.273758196" observedRunningTime="2026-04-28 19:16:34.032531335 +0000 UTC m=+20.784080849" watchObservedRunningTime="2026-04-28 19:16:34.03295353 +0000 UTC m=+20.784503059" Apr 28 19:16:34.976301 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:34.976276 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:16:34.976803 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:34.976578 2570 generic.go:358] "Generic (PLEG): container finished" podID="21172191-de03-4932-85fe-40437ea0c56a" 
containerID="2f94e16203fc5bb09b2e09e24d5b42177b48bc09c4b95bfe4effca5856ae2d2e" exitCode=1 Apr 28 19:16:34.976803 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:34.976630 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" event={"ID":"21172191-de03-4932-85fe-40437ea0c56a","Type":"ContainerDied","Data":"2f94e16203fc5bb09b2e09e24d5b42177b48bc09c4b95bfe4effca5856ae2d2e"} Apr 28 19:16:34.976803 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:34.976658 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" event={"ID":"21172191-de03-4932-85fe-40437ea0c56a","Type":"ContainerStarted","Data":"98463d3bbc23b0fd1ba11403cef62bcf48b8f76847909f901cbc77dad12ba533"} Apr 28 19:16:34.976803 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:34.976667 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" event={"ID":"21172191-de03-4932-85fe-40437ea0c56a","Type":"ContainerStarted","Data":"92dbc6fee5c1fbdb2c911f6bfd14fa10f1d583fd7a82d4f1eb221291cbebaebf"} Apr 28 19:16:34.976803 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:34.976675 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" event={"ID":"21172191-de03-4932-85fe-40437ea0c56a","Type":"ContainerStarted","Data":"33b7c9f901e2d3b5dbc755e24bb891fd8e609959b207e82788a9038c4ad8da38"} Apr 28 19:16:34.977945 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:34.977922 2570 generic.go:358] "Generic (PLEG): container finished" podID="1d508319-bdc2-4029-88e0-8b8406c4ac0b" containerID="17a5eee37710bdc6eaf1b5eac6507d3613d5720884dfe2312fb6f6171a63d72d" exitCode=0 Apr 28 19:16:34.978097 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:34.978064 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g9c9n" event={"ID":"1d508319-bdc2-4029-88e0-8b8406c4ac0b","Type":"ContainerDied","Data":"17a5eee37710bdc6eaf1b5eac6507d3613d5720884dfe2312fb6f6171a63d72d"} Apr 28 19:16:35.146236 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:35.146064 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 28 19:16:35.861586 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:35.861475 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-28T19:16:35.146230368Z","UUID":"163ab196-c206-4e05-ab40-39178fed752b","Handler":null,"Name":"","Endpoint":""} Apr 28 19:16:35.864889 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:35.864866 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 28 19:16:35.864889 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:35.864894 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 28 19:16:35.883564 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:35.883539 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:35.883767 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:35.883743 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:35.883827 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:35.883546 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:35.883827 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:35.883539 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:35.883901 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:35.883880 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:35.883976 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:35.883959 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:35.973182 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:35.973146 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-prm99" Apr 28 19:16:35.973874 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:35.973843 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-prm99" Apr 28 19:16:35.981556 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:35.981515 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" event={"ID":"15eb3509-1e81-4a17-b03f-275a88d26015","Type":"ContainerStarted","Data":"f4b0b2b852bbf2640b0c0125401faea4c94948f2a5a191d2ebbf96aeb2374a19"} Apr 28 19:16:35.983189 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:35.983155 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c8f8g" event={"ID":"e61cde97-43d8-438a-abf3-9be15c9fb8d0","Type":"ContainerStarted","Data":"c2df53b68661e38c5b36694401afac1baa4011519c606594cfba4c586b97bcd4"} Apr 28 19:16:36.006659 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:36.006582 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-c8f8g" podStartSLOduration=5.197190033 podStartE2EDuration="22.006567672s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.737149249 +0000 UTC m=+3.488698744" lastFinishedPulling="2026-04-28 19:16:33.54652689 +0000 UTC m=+20.298076383" observedRunningTime="2026-04-28 19:16:36.00617744 +0000 UTC m=+22.757726956" watchObservedRunningTime="2026-04-28 19:16:36.006567672 +0000 UTC m=+22.758117224" Apr 28 19:16:36.987741 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:36.987652 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" event={"ID":"15eb3509-1e81-4a17-b03f-275a88d26015","Type":"ContainerStarted","Data":"21323d52163c3746b297d908eb6a6e661bc0d705069ef7ea4b529a4991aa6729"} Apr 28 19:16:36.990743 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:36.990722 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:16:36.991164 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:36.991141 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" event={"ID":"21172191-de03-4932-85fe-40437ea0c56a","Type":"ContainerStarted","Data":"0a15d6b0ca274e5db2903a9315cd5078ab61c38b0922a54cb0d588ffb45b49eb"} Apr 28 19:16:36.991311 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:36.991295 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 28 19:16:37.014018 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.013982 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pmpq5"] Apr 28 19:16:37.032151 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.032090 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xkkkw" podStartSLOduration=3.305783796 podStartE2EDuration="23.032073353s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.730522702 +0000 UTC m=+3.482072195" 
lastFinishedPulling="2026-04-28 19:16:36.456812256 +0000 UTC m=+23.208361752" observedRunningTime="2026-04-28 19:16:37.029369254 +0000 UTC m=+23.780918768" watchObservedRunningTime="2026-04-28 19:16:37.032073353 +0000 UTC m=+23.783622893" Apr 28 19:16:37.034378 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.034346 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pmpq5" Apr 28 19:16:37.038279 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.038242 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 28 19:16:37.038554 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.038538 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pvc6r\"" Apr 28 19:16:37.038907 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.038883 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 28 19:16:37.153267 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.153225 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/68ecdbce-5bdd-408b-bef6-91e797899886-tmp-dir\") pod \"node-resolver-pmpq5\" (UID: \"68ecdbce-5bdd-408b-bef6-91e797899886\") " pod="openshift-dns/node-resolver-pmpq5" Apr 28 19:16:37.153440 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.153414 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/68ecdbce-5bdd-408b-bef6-91e797899886-hosts-file\") pod \"node-resolver-pmpq5\" (UID: \"68ecdbce-5bdd-408b-bef6-91e797899886\") " pod="openshift-dns/node-resolver-pmpq5" Apr 28 19:16:37.153496 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.153459 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqv76\" (UniqueName: \"kubernetes.io/projected/68ecdbce-5bdd-408b-bef6-91e797899886-kube-api-access-lqv76\") pod \"node-resolver-pmpq5\" (UID: \"68ecdbce-5bdd-408b-bef6-91e797899886\") " pod="openshift-dns/node-resolver-pmpq5" Apr 28 19:16:37.254310 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.254228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/68ecdbce-5bdd-408b-bef6-91e797899886-hosts-file\") pod \"node-resolver-pmpq5\" (UID: \"68ecdbce-5bdd-408b-bef6-91e797899886\") " pod="openshift-dns/node-resolver-pmpq5" Apr 28 19:16:37.254310 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.254267 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqv76\" (UniqueName: \"kubernetes.io/projected/68ecdbce-5bdd-408b-bef6-91e797899886-kube-api-access-lqv76\") pod \"node-resolver-pmpq5\" (UID: \"68ecdbce-5bdd-408b-bef6-91e797899886\") " pod="openshift-dns/node-resolver-pmpq5" Apr 28 19:16:37.254310 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.254292 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/68ecdbce-5bdd-408b-bef6-91e797899886-tmp-dir\") pod \"node-resolver-pmpq5\" (UID: \"68ecdbce-5bdd-408b-bef6-91e797899886\") " pod="openshift-dns/node-resolver-pmpq5" Apr 28 19:16:37.254530 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.254334 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:37.254530 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.254386 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/68ecdbce-5bdd-408b-bef6-91e797899886-hosts-file\") pod \"node-resolver-pmpq5\" (UID: \"68ecdbce-5bdd-408b-bef6-91e797899886\") " pod="openshift-dns/node-resolver-pmpq5" Apr 28 19:16:37.254530 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:37.254479 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:37.254677 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:37.254542 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret podName:d156b1b1-0d6c-49bf-b188-773ff892fbd2 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:53.254523804 +0000 UTC m=+40.006073303 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret") pod "global-pull-secret-syncer-z27h7" (UID: "d156b1b1-0d6c-49bf-b188-773ff892fbd2") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:37.254859 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.254833 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/68ecdbce-5bdd-408b-bef6-91e797899886-tmp-dir\") pod \"node-resolver-pmpq5\" (UID: \"68ecdbce-5bdd-408b-bef6-91e797899886\") " pod="openshift-dns/node-resolver-pmpq5" Apr 28 19:16:37.266853 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.266824 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqv76\" (UniqueName: \"kubernetes.io/projected/68ecdbce-5bdd-408b-bef6-91e797899886-kube-api-access-lqv76\") pod \"node-resolver-pmpq5\" (UID: \"68ecdbce-5bdd-408b-bef6-91e797899886\") " pod="openshift-dns/node-resolver-pmpq5" Apr 28 19:16:37.344725 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.344687 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pmpq5" Apr 28 19:16:37.355837 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:37.355806 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68ecdbce_5bdd_408b_bef6_91e797899886.slice/crio-44939b6802a16d157aa2d0dab59719ad756c0628612da586eb914a876e9114cc WatchSource:0}: Error finding container 44939b6802a16d157aa2d0dab59719ad756c0628612da586eb914a876e9114cc: Status 404 returned error can't find the container with id 44939b6802a16d157aa2d0dab59719ad756c0628612da586eb914a876e9114cc Apr 28 19:16:37.883815 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.883766 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:37.884011 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.883766 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:37.884011 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:37.883893 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:37.884011 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:37.883969 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:37.884011 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.883780 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:37.884180 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:37.884081 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:37.994323 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.994277 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pmpq5" event={"ID":"68ecdbce-5bdd-408b-bef6-91e797899886","Type":"ContainerStarted","Data":"bbe0b724bea6cd48e74bbbb1665820ccfbf2ada03c35eede98b8308355e6ff72"} Apr 28 19:16:37.994323 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:37.994320 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pmpq5" event={"ID":"68ecdbce-5bdd-408b-bef6-91e797899886","Type":"ContainerStarted","Data":"44939b6802a16d157aa2d0dab59719ad756c0628612da586eb914a876e9114cc"} Apr 28 19:16:38.011128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:38.011065 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pmpq5" podStartSLOduration=2.011049511 podStartE2EDuration="2.011049511s" podCreationTimestamp="2026-04-28 19:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:38.010904326 +0000 UTC m=+24.762453842" watchObservedRunningTime="2026-04-28 19:16:38.011049511 +0000 UTC m=+24.762599026" Apr 28 19:16:39.883460 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:39.883424 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:39.883460 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:39.883443 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:39.883995 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:39.883534 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:39.883995 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:39.883626 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:39.883995 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:39.883642 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:39.883995 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:39.883733 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:39.998522 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:39.998493 2570 generic.go:358] "Generic (PLEG): container finished" podID="1d508319-bdc2-4029-88e0-8b8406c4ac0b" containerID="700be7beca9ec7d3d20064192e6a8b4a4aa8dea787f9ff53859831bfebe429bb" exitCode=0 Apr 28 19:16:39.998690 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:39.998584 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g9c9n" event={"ID":"1d508319-bdc2-4029-88e0-8b8406c4ac0b","Type":"ContainerDied","Data":"700be7beca9ec7d3d20064192e6a8b4a4aa8dea787f9ff53859831bfebe429bb"} Apr 28 19:16:40.001681 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:40.001661 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:16:40.002037 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:40.002016 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" event={"ID":"21172191-de03-4932-85fe-40437ea0c56a","Type":"ContainerStarted","Data":"2bc5b588163686355ce92dfe9a0125dbe674ecc74f5a2d173c569b740b0b2685"} Apr 28 19:16:40.002370 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:40.002347 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:40.002454 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:40.002379 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:40.002524 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:40.002507 2570 scope.go:117] "RemoveContainer" 
containerID="2f94e16203fc5bb09b2e09e24d5b42177b48bc09c4b95bfe4effca5856ae2d2e" Apr 28 19:16:40.018439 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:40.018414 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:41.001276 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:41.001017 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z27h7"] Apr 28 19:16:41.001796 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:41.001328 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:41.001796 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:41.001416 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:41.004197 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:41.004152 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qgcjb"] Apr 28 19:16:41.004347 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:41.004290 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:41.004414 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:41.004388 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:41.006186 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:41.006159 2570 generic.go:358] "Generic (PLEG): container finished" podID="1d508319-bdc2-4029-88e0-8b8406c4ac0b" containerID="9e6bde88294edcd1bb04077c2905ad4c13688efdd4627d7ef19fd1866d870cb5" exitCode=0 Apr 28 19:16:41.006328 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:41.006244 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g9c9n" event={"ID":"1d508319-bdc2-4029-88e0-8b8406c4ac0b","Type":"ContainerDied","Data":"9e6bde88294edcd1bb04077c2905ad4c13688efdd4627d7ef19fd1866d870cb5"} Apr 28 19:16:41.010512 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:41.010490 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:16:41.010861 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:41.010838 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" event={"ID":"21172191-de03-4932-85fe-40437ea0c56a","Type":"ContainerStarted","Data":"41b0fff819fe0307d3294bb1dbc2c9e3596dc9ff516db70d96c4c0a9ac9ee0a7"} Apr 28 19:16:41.011066 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:41.011052 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:41.014135 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:41.014099 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nc646"] Apr 28 19:16:41.014299 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:41.014230 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:41.014369 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:41.014310 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:41.027361 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:41.027334 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" Apr 28 19:16:42.015108 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:42.015077 2570 generic.go:358] "Generic (PLEG): container finished" podID="1d508319-bdc2-4029-88e0-8b8406c4ac0b" containerID="e67be09385c201891caed52ee11935e077fdc953be2ad25f4dcba2b8ee9b2bb9" exitCode=0 Apr 28 19:16:42.015490 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:42.015161 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g9c9n" event={"ID":"1d508319-bdc2-4029-88e0-8b8406c4ac0b","Type":"ContainerDied","Data":"e67be09385c201891caed52ee11935e077fdc953be2ad25f4dcba2b8ee9b2bb9"} Apr 28 19:16:42.042817 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:42.042726 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2" podStartSLOduration=11.151991045 podStartE2EDuration="28.042708983s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.730140666 +0000 UTC m=+3.481690162" lastFinishedPulling="2026-04-28 19:16:33.620858608 +0000 UTC m=+20.372408100" observedRunningTime="2026-04-28 19:16:41.070161495 +0000 UTC m=+27.821711009" watchObservedRunningTime="2026-04-28 19:16:42.042708983 +0000 UTC m=+28.794258493" Apr 28 19:16:42.107933 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:42.107896 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-prm99" Apr 28 19:16:42.108110 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:42.108049 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 28 19:16:42.108576 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:42.108556 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-prm99" Apr 28 19:16:42.884281 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:42.883820 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:42.884281 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:42.883949 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:42.884281 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:42.883985 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:42.884281 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:42.884104 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:42.884281 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:42.884158 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:42.884281 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:42.884239 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:44.883704 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:44.883666 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:16:44.884324 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:44.883715 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:16:44.884324 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:44.883685 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7" Apr 28 19:16:44.884324 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:44.883795 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:16:44.884324 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:44.883883 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z27h7" podUID="d156b1b1-0d6c-49bf-b188-773ff892fbd2" Apr 28 19:16:44.884324 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:44.883942 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nc646" podUID="633bd943-3978-4baf-be3b-c82a70d85512" Apr 28 19:16:46.560453 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.560418 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-34.ec2.internal" event="NodeReady" Apr 28 19:16:46.560921 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.560563 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 28 19:16:46.601864 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.601788 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-x865s"] Apr 28 19:16:46.605839 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.605816 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-745bf7dd44-hft5k"] Apr 28 19:16:46.605989 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.605963 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" Apr 28 19:16:46.608590 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.608564 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 28 19:16:46.608716 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.608621 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-lfxd4\"" Apr 28 19:16:46.608876 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.608852 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-745bf7dd44-hft5k" Apr 28 19:16:46.609322 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.609284 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 28 19:16:46.612379 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.612362 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 28 19:16:46.613330 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.613172 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 28 19:16:46.614063 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.613880 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 28 19:16:46.614063 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.613989 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-x865s"] Apr 28 19:16:46.614563 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.614209 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6x2jf\"" Apr 28 19:16:46.618951 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.618934 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-98b4d"] Apr 28 19:16:46.620887 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.620090 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 28 19:16:46.622030 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.622009 2570 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-745bf7dd44-hft5k"]
Apr 28 19:16:46.622148 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.622133 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:46.624966 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.624441 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 28 19:16:46.624966 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.624665 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 28 19:16:46.624966 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.624867 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6qdcw\""
Apr 28 19:16:46.630538 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.630516 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-98b4d"]
Apr 28 19:16:46.632867 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.632847 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-l8dqg"]
Apr 28 19:16:46.636112 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.636090 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l8dqg"
Apr 28 19:16:46.638567 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.638547 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 28 19:16:46.639154 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.639137 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 28 19:16:46.639245 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.639185 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pzmft\""
Apr 28 19:16:46.640011 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.639992 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 28 19:16:46.645778 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.645744 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l8dqg"]
Apr 28 19:16:46.725617 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.725571 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:46.725798 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.725644 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9622feba-01a7-434b-84bb-f677851aaa37-image-registry-private-configuration\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.725798 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.725678 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s"
Apr 28 19:16:46.725798 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.725702 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg"
Apr 28 19:16:46.725798 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.725726 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9622feba-01a7-434b-84bb-f677851aaa37-trusted-ca\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.725798 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.725747 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlbz6\" (UniqueName: \"kubernetes.io/projected/0d073d08-7217-4136-8485-03d574acfc52-kube-api-access-dlbz6\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg"
Apr 28 19:16:46.725798 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.725768 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/801789ef-975e-451d-9e18-0cb9acd739d6-tmp-dir\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:46.725798 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.725791 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.726117 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.725815 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9622feba-01a7-434b-84bb-f677851aaa37-ca-trust-extracted\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.726117 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.725911 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9622feba-01a7-434b-84bb-f677851aaa37-installation-pull-secrets\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.726117 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.725939 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smz6v\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-kube-api-access-smz6v\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.726117 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.725968 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2mqm\" (UniqueName: \"kubernetes.io/projected/801789ef-975e-451d-9e18-0cb9acd739d6-kube-api-access-b2mqm\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:46.726117 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.726041 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-bound-sa-token\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.726117 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.726091 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9622feba-01a7-434b-84bb-f677851aaa37-registry-certificates\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.726117 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.726115 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s"
Apr 28 19:16:46.726373 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.726145 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/801789ef-975e-451d-9e18-0cb9acd739d6-config-volume\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:46.826969 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.826924 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9622feba-01a7-434b-84bb-f677851aaa37-image-registry-private-configuration\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.827158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.826982 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s"
Apr 28 19:16:46.827158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827021 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg"
Apr 28 19:16:46.827158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827047 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9622feba-01a7-434b-84bb-f677851aaa37-trusted-ca\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.827158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827070 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlbz6\" (UniqueName: \"kubernetes.io/projected/0d073d08-7217-4136-8485-03d574acfc52-kube-api-access-dlbz6\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg"
Apr 28 19:16:46.827158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827094 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/801789ef-975e-451d-9e18-0cb9acd739d6-tmp-dir\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:46.827158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.827158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827145 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9622feba-01a7-434b-84bb-f677851aaa37-ca-trust-extracted\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.827514 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:46.827160 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:46.827514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827206 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9622feba-01a7-434b-84bb-f677851aaa37-installation-pull-secrets\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.827514 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:46.827240 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert podName:0d073d08-7217-4136-8485-03d574acfc52 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.32721898 +0000 UTC m=+34.078768487 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert") pod "ingress-canary-l8dqg" (UID: "0d073d08-7217-4136-8485-03d574acfc52") : secret "canary-serving-cert" not found
Apr 28 19:16:46.827514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827268 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smz6v\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-kube-api-access-smz6v\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.827514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827310 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2mqm\" (UniqueName: \"kubernetes.io/projected/801789ef-975e-451d-9e18-0cb9acd739d6-kube-api-access-b2mqm\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:46.827514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827349 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-bound-sa-token\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.827514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827388 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9622feba-01a7-434b-84bb-f677851aaa37-registry-certificates\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.827514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827420 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s"
Apr 28 19:16:46.827514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827448 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/801789ef-975e-451d-9e18-0cb9acd739d6-config-volume\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:46.827514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827463 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/801789ef-975e-451d-9e18-0cb9acd739d6-tmp-dir\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:46.827514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.827506 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:46.828082 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:46.827715 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:46.828082 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:46.827736 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-745bf7dd44-hft5k: secret "image-registry-tls" not found
Apr 28 19:16:46.828082 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:46.827791 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls podName:9622feba-01a7-434b-84bb-f677851aaa37 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.327771348 +0000 UTC m=+34.079320859 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls") pod "image-registry-745bf7dd44-hft5k" (UID: "9622feba-01a7-434b-84bb-f677851aaa37") : secret "image-registry-tls" not found
Apr 28 19:16:46.828232 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:46.828169 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:16:46.828232 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.828173 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9622feba-01a7-434b-84bb-f677851aaa37-trusted-ca\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.828232 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.828188 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s"
Apr 28 19:16:46.828232 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:46.828214 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert podName:7e4b5b64-ebf0-4ee2-a43b-35098459ff73 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.328199792 +0000 UTC m=+34.079749287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-x865s" (UID: "7e4b5b64-ebf0-4ee2-a43b-35098459ff73") : secret "networking-console-plugin-cert" not found
Apr 28 19:16:46.828429 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:46.828286 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:46.828429 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.828309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9622feba-01a7-434b-84bb-f677851aaa37-ca-trust-extracted\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.828429 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:46.828331 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls podName:801789ef-975e-451d-9e18-0cb9acd739d6 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.328317062 +0000 UTC m=+34.079866559 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls") pod "dns-default-98b4d" (UID: "801789ef-975e-451d-9e18-0cb9acd739d6") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:46.828801 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.828776 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/801789ef-975e-451d-9e18-0cb9acd739d6-config-volume\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:46.829287 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.829264 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9622feba-01a7-434b-84bb-f677851aaa37-registry-certificates\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.832567 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.832542 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9622feba-01a7-434b-84bb-f677851aaa37-image-registry-private-configuration\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.832671 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.832546 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9622feba-01a7-434b-84bb-f677851aaa37-installation-pull-secrets\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.839234 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.839209 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2mqm\" (UniqueName: \"kubernetes.io/projected/801789ef-975e-451d-9e18-0cb9acd739d6-kube-api-access-b2mqm\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:46.839409 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.839351 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-bound-sa-token\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.839787 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.839765 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smz6v\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-kube-api-access-smz6v\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:46.840057 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.840040 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlbz6\" (UniqueName: \"kubernetes.io/projected/0d073d08-7217-4136-8485-03d574acfc52-kube-api-access-dlbz6\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg"
Apr 28 19:16:46.883275 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.883195 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646"
Apr 28 19:16:46.883410 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.883300 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb"
Apr 28 19:16:46.883410 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.883399 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7"
Apr 28 19:16:46.886824 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.886804 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wz9tz\""
Apr 28 19:16:46.886954 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.886883 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tslnt\""
Apr 28 19:16:46.886954 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.886908 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 28 19:16:46.886954 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.886941 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 28 19:16:46.887108 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.886906 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 28 19:16:46.887108 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:46.887002 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 28 19:16:47.332468 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:47.332432 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s"
Apr 28 19:16:47.332729 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:47.332494 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:47.332729 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:47.332524 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg"
Apr 28 19:16:47.332729 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:47.332551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:47.332729 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:47.332625 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:16:47.332729 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:47.332648 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:47.332729 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:47.332667 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:47.332729 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:47.332710 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert podName:7e4b5b64-ebf0-4ee2-a43b-35098459ff73 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:48.332688159 +0000 UTC m=+35.084237668 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-x865s" (UID: "7e4b5b64-ebf0-4ee2-a43b-35098459ff73") : secret "networking-console-plugin-cert" not found
Apr 28 19:16:47.332729 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:47.332715 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:47.332729 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:47.332731 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls podName:801789ef-975e-451d-9e18-0cb9acd739d6 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:48.332720381 +0000 UTC m=+35.084269874 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls") pod "dns-default-98b4d" (UID: "801789ef-975e-451d-9e18-0cb9acd739d6") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:47.332729 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:47.332733 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-745bf7dd44-hft5k: secret "image-registry-tls" not found
Apr 28 19:16:47.333125 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:47.332749 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert podName:0d073d08-7217-4136-8485-03d574acfc52 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:48.33273843 +0000 UTC m=+35.084287935 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert") pod "ingress-canary-l8dqg" (UID: "0d073d08-7217-4136-8485-03d574acfc52") : secret "canary-serving-cert" not found
Apr 28 19:16:47.333125 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:47.332790 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls podName:9622feba-01a7-434b-84bb-f677851aaa37 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:48.332774086 +0000 UTC m=+35.084323596 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls") pod "image-registry-745bf7dd44-hft5k" (UID: "9622feba-01a7-434b-84bb-f677851aaa37") : secret "image-registry-tls" not found
Apr 28 19:16:47.534517 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:47.534475 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb"
Apr 28 19:16:47.534793 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:47.534641 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 28 19:16:47.534793 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:47.534712 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs podName:0b961ce3-ed85-40f4-840c-df0e74d830dd nodeName:}" failed. No retries permitted until 2026-04-28 19:17:19.534696373 +0000 UTC m=+66.286245866 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs") pod "network-metrics-daemon-qgcjb" (UID: "0b961ce3-ed85-40f4-840c-df0e74d830dd") : secret "metrics-daemon-secret" not found
Apr 28 19:16:47.635382 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:47.635352 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26gwd\" (UniqueName: \"kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd\") pod \"network-check-target-nc646\" (UID: \"633bd943-3978-4baf-be3b-c82a70d85512\") " pod="openshift-network-diagnostics/network-check-target-nc646"
Apr 28 19:16:47.639107 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:47.639084 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26gwd\" (UniqueName: \"kubernetes.io/projected/633bd943-3978-4baf-be3b-c82a70d85512-kube-api-access-26gwd\") pod \"network-check-target-nc646\" (UID: \"633bd943-3978-4baf-be3b-c82a70d85512\") " pod="openshift-network-diagnostics/network-check-target-nc646"
Apr 28 19:16:47.794476 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:47.794450 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nc646"
Apr 28 19:16:47.971912 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:47.971700 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nc646"]
Apr 28 19:16:47.974092 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:47.974067 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod633bd943_3978_4baf_be3b_c82a70d85512.slice/crio-3d8bf6c3d5a2a5a6363d07b1df1cfa75ee2679d111adfc1cea5bb9e7f06bd83a WatchSource:0}: Error finding container 3d8bf6c3d5a2a5a6363d07b1df1cfa75ee2679d111adfc1cea5bb9e7f06bd83a: Status 404 returned error can't find the container with id 3d8bf6c3d5a2a5a6363d07b1df1cfa75ee2679d111adfc1cea5bb9e7f06bd83a
Apr 28 19:16:48.029676 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:48.029645 2570 generic.go:358] "Generic (PLEG): container finished" podID="1d508319-bdc2-4029-88e0-8b8406c4ac0b" containerID="47244212aec242d27621eb98620e3ea620698fb2a721f0ece4b3dc1fb0cfca8f" exitCode=0
Apr 28 19:16:48.029831 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:48.029732 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g9c9n" event={"ID":"1d508319-bdc2-4029-88e0-8b8406c4ac0b","Type":"ContainerDied","Data":"47244212aec242d27621eb98620e3ea620698fb2a721f0ece4b3dc1fb0cfca8f"}
Apr 28 19:16:48.030777 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:48.030749 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nc646" event={"ID":"633bd943-3978-4baf-be3b-c82a70d85512","Type":"ContainerStarted","Data":"3d8bf6c3d5a2a5a6363d07b1df1cfa75ee2679d111adfc1cea5bb9e7f06bd83a"}
Apr 28 19:16:48.340966 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:48.340926 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s"
Apr 28 19:16:48.341128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:48.340994 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:48.341128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:48.341041 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg"
Apr 28 19:16:48.341128 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:48.341065 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:16:48.341128 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:48.341128 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert podName:7e4b5b64-ebf0-4ee2-a43b-35098459ff73 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:50.341114008 +0000 UTC m=+37.092663501 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-x865s" (UID: "7e4b5b64-ebf0-4ee2-a43b-35098459ff73") : secret "networking-console-plugin-cert" not found
Apr 28 19:16:48.341329 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:48.341127 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:48.341329 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:48.341140 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:48.341329 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:48.341154 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-745bf7dd44-hft5k: secret "image-registry-tls" not found
Apr 28 19:16:48.341329 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:48.341176 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls podName:801789ef-975e-451d-9e18-0cb9acd739d6 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:50.341164393 +0000 UTC m=+37.092713886 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls") pod "dns-default-98b4d" (UID: "801789ef-975e-451d-9e18-0cb9acd739d6") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:48.341329 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:48.341132 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:48.341329 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:48.341195 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls podName:9622feba-01a7-434b-84bb-f677851aaa37 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:50.341181844 +0000 UTC m=+37.092731343 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls") pod "image-registry-745bf7dd44-hft5k" (UID: "9622feba-01a7-434b-84bb-f677851aaa37") : secret "image-registry-tls" not found
Apr 28 19:16:48.341329 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:48.341073 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:48.341329 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:48.341218 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert podName:0d073d08-7217-4136-8485-03d574acfc52 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:50.341201362 +0000 UTC m=+37.092750855 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert") pod "ingress-canary-l8dqg" (UID: "0d073d08-7217-4136-8485-03d574acfc52") : secret "canary-serving-cert" not found
Apr 28 19:16:49.035957 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:49.035922 2570 generic.go:358] "Generic (PLEG): container finished" podID="1d508319-bdc2-4029-88e0-8b8406c4ac0b" containerID="75cea194709fd1eea7d9ea3a76d0cf1b4e2224d57d38a432c4b16cb1036b7a19" exitCode=0
Apr 28 19:16:49.036338 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:49.035957 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g9c9n" event={"ID":"1d508319-bdc2-4029-88e0-8b8406c4ac0b","Type":"ContainerDied","Data":"75cea194709fd1eea7d9ea3a76d0cf1b4e2224d57d38a432c4b16cb1036b7a19"}
Apr 28 19:16:50.042065 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:50.042026 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g9c9n" event={"ID":"1d508319-bdc2-4029-88e0-8b8406c4ac0b","Type":"ContainerStarted","Data":"d56f023a558b3e3d00f44ec1556aac00718c508ee8a878b472b7a8bf279a42ba"}
Apr 28 19:16:50.068901 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:50.068843 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g9c9n" podStartSLOduration=6.20649699 podStartE2EDuration="37.0688273s" podCreationTimestamp="2026-04-28 19:16:13 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.738099657 +0000 UTC m=+3.489649165" lastFinishedPulling="2026-04-28 19:16:47.600429978 +0000 UTC m=+34.351979475" observedRunningTime="2026-04-28 19:16:50.067517205 +0000 UTC m=+36.819066719" watchObservedRunningTime="2026-04-28 19:16:50.0688273 +0000 UTC m=+36.820376815"
Apr 28 19:16:50.358500 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:50.358394 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s"
Apr 28 19:16:50.358500 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:50.358468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:50.358736 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:50.358506 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg"
Apr 28 19:16:50.358736 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:50.358537 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:50.358736 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:50.358680 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:50.358736 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:50.358696 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-745bf7dd44-hft5k: secret "image-registry-tls" not found
Apr 28 19:16:50.358927 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:50.358757 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls podName:9622feba-01a7-434b-84bb-f677851aaa37 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:54.358736685 +0000 UTC m=+41.110286177 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls") pod "image-registry-745bf7dd44-hft5k" (UID: "9622feba-01a7-434b-84bb-f677851aaa37") : secret "image-registry-tls" not found
Apr 28 19:16:50.359097 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:50.359036 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:50.359216 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:50.359111 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls podName:801789ef-975e-451d-9e18-0cb9acd739d6 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:54.359093493 +0000 UTC m=+41.110643006 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls") pod "dns-default-98b4d" (UID: "801789ef-975e-451d-9e18-0cb9acd739d6") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:50.359216 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:50.359039 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:50.359216 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:50.359159 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert podName:0d073d08-7217-4136-8485-03d574acfc52 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:54.359148934 +0000 UTC m=+41.110698442 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert") pod "ingress-canary-l8dqg" (UID: "0d073d08-7217-4136-8485-03d574acfc52") : secret "canary-serving-cert" not found
Apr 28 19:16:50.359216 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:50.359049 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:16:50.359216 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:50.359203 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert podName:7e4b5b64-ebf0-4ee2-a43b-35098459ff73 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:54.359194561 +0000 UTC m=+41.110744056 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-x865s" (UID: "7e4b5b64-ebf0-4ee2-a43b-35098459ff73") : secret "networking-console-plugin-cert" not found
Apr 28 19:16:52.047133 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:52.047096 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nc646" event={"ID":"633bd943-3978-4baf-be3b-c82a70d85512","Type":"ContainerStarted","Data":"13acfb8d27e1f50464ddefbb50f8a55e0f3269f4130138c4d86b1418e83c828c"}
Apr 28 19:16:52.047636 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:52.047242 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-nc646"
Apr 28 19:16:52.065270 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:52.065224 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-nc646" podStartSLOduration=34.438015807 podStartE2EDuration="38.065209812s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:47.976420115 +0000 UTC m=+34.727969622" lastFinishedPulling="2026-04-28 19:16:51.603614121 +0000 UTC m=+38.355163627" observedRunningTime="2026-04-28 19:16:52.064106702 +0000 UTC m=+38.815656216" watchObservedRunningTime="2026-04-28 19:16:52.065209812 +0000 UTC m=+38.816759326"
Apr 28 19:16:53.283027 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:53.282990 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7"
Apr 28 19:16:53.286130 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:53.286103 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d156b1b1-0d6c-49bf-b188-773ff892fbd2-original-pull-secret\") pod \"global-pull-secret-syncer-z27h7\" (UID: \"d156b1b1-0d6c-49bf-b188-773ff892fbd2\") " pod="kube-system/global-pull-secret-syncer-z27h7"
Apr 28 19:16:53.502350 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:53.502317 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z27h7"
Apr 28 19:16:53.633528 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:53.633498 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z27h7"]
Apr 28 19:16:53.635685 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:16:53.635650 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd156b1b1_0d6c_49bf_b188_773ff892fbd2.slice/crio-9e2dff4afa80b9b5348ffc8e3a4e02c28af0def80764e13faa1c00b6ac57780f WatchSource:0}: Error finding container 9e2dff4afa80b9b5348ffc8e3a4e02c28af0def80764e13faa1c00b6ac57780f: Status 404 returned error can't find the container with id 9e2dff4afa80b9b5348ffc8e3a4e02c28af0def80764e13faa1c00b6ac57780f
Apr 28 19:16:54.051351 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:54.051315 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z27h7" event={"ID":"d156b1b1-0d6c-49bf-b188-773ff892fbd2","Type":"ContainerStarted","Data":"9e2dff4afa80b9b5348ffc8e3a4e02c28af0def80764e13faa1c00b6ac57780f"}
Apr 28 19:16:54.390025 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:54.389936 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s"
Apr 28 19:16:54.390025 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:54.389997 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:16:54.390481 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:54.390033 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg"
Apr 28 19:16:54.390481 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:54.390061 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:16:54.390481 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:54.390135 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:16:54.390481 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:54.390179 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:54.390481 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:54.390202 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-745bf7dd44-hft5k: secret "image-registry-tls" not found
Apr 28 19:16:54.390481 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:54.390221 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert podName:7e4b5b64-ebf0-4ee2-a43b-35098459ff73 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:02.390197567 +0000 UTC m=+49.141747080 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-x865s" (UID: "7e4b5b64-ebf0-4ee2-a43b-35098459ff73") : secret "networking-console-plugin-cert" not found
Apr 28 19:16:54.390481 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:54.390250 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls podName:9622feba-01a7-434b-84bb-f677851aaa37 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:02.390234891 +0000 UTC m=+49.141784385 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls") pod "image-registry-745bf7dd44-hft5k" (UID: "9622feba-01a7-434b-84bb-f677851aaa37") : secret "image-registry-tls" not found
Apr 28 19:16:54.390481 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:54.390179 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:54.390481 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:54.390283 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert podName:0d073d08-7217-4136-8485-03d574acfc52 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:02.390274505 +0000 UTC m=+49.141824019 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert") pod "ingress-canary-l8dqg" (UID: "0d073d08-7217-4136-8485-03d574acfc52") : secret "canary-serving-cert" not found
Apr 28 19:16:54.390481 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:54.390179 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:54.390481 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:16:54.390318 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls podName:801789ef-975e-451d-9e18-0cb9acd739d6 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:02.390309821 +0000 UTC m=+49.141859313 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls") pod "dns-default-98b4d" (UID: "801789ef-975e-451d-9e18-0cb9acd739d6") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:58.059725 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:58.059685 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z27h7" event={"ID":"d156b1b1-0d6c-49bf-b188-773ff892fbd2","Type":"ContainerStarted","Data":"d91de6dfccbd55cf0a11612c7d5af691730705b25470e0be87605cebfefe4fb3"}
Apr 28 19:16:58.084586 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:16:58.084536 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-z27h7" podStartSLOduration=33.594114784 podStartE2EDuration="37.084520215s" podCreationTimestamp="2026-04-28 19:16:21 +0000 UTC" firstStartedPulling="2026-04-28 19:16:53.637460533 +0000 UTC m=+40.389010026" lastFinishedPulling="2026-04-28 19:16:57.12786596 +0000 UTC m=+43.879415457" observedRunningTime="2026-04-28 19:16:58.083638746 +0000 UTC m=+44.835188274" watchObservedRunningTime="2026-04-28 19:16:58.084520215 +0000 UTC m=+44.836069729"
Apr 28 19:17:02.453214 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:02.453174 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg"
Apr 28 19:17:02.453688 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:02.453225 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k"
Apr 28 19:17:02.453688 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:02.453283 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s"
Apr 28 19:17:02.453688 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:02.453315 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:17:02.453688 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:02.453329 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d"
Apr 28 19:17:02.453688 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:02.453371 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:17:02.453688 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:02.453388 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert podName:0d073d08-7217-4136-8485-03d574acfc52 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:18.453365881 +0000 UTC m=+65.204915379 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert") pod "ingress-canary-l8dqg" (UID: "0d073d08-7217-4136-8485-03d574acfc52") : secret "canary-serving-cert" not found
Apr 28 19:17:02.453688 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:02.453388 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-745bf7dd44-hft5k: secret "image-registry-tls" not found
Apr 28 19:17:02.453688 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:02.453423 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls podName:9622feba-01a7-434b-84bb-f677851aaa37 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:18.45341568 +0000 UTC m=+65.204965173 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls") pod "image-registry-745bf7dd44-hft5k" (UID: "9622feba-01a7-434b-84bb-f677851aaa37") : secret "image-registry-tls" not found
Apr 28 19:17:02.453688 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:02.453441 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:17:02.453688 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:02.453480 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:17:02.453688 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:02.453493 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls podName:801789ef-975e-451d-9e18-0cb9acd739d6 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:18.453477067 +0000 UTC m=+65.205026562 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls") pod "dns-default-98b4d" (UID: "801789ef-975e-451d-9e18-0cb9acd739d6") : secret "dns-default-metrics-tls" not found
Apr 28 19:17:02.453688 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:02.453539 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert podName:7e4b5b64-ebf0-4ee2-a43b-35098459ff73 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:18.453525711 +0000 UTC m=+65.205075203 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-x865s" (UID: "7e4b5b64-ebf0-4ee2-a43b-35098459ff73") : secret "networking-console-plugin-cert" not found
Apr 28 19:17:13.028625 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:13.028584 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7xms2"
Apr 28 19:17:14.316008 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.315971 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt"]
Apr 28 19:17:14.320544 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.320526 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt"
Apr 28 19:17:14.324024 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.324002 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 28 19:17:14.324143 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.324069 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 28 19:17:14.325205 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.325188 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-tdf2r\""
Apr 28 19:17:14.325252 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.325217 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 28 19:17:14.325307 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.325192 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 28 19:17:14.337699 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.337675 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn4q4\" (UniqueName: \"kubernetes.io/projected/8118492f-8ba9-4337-84c4-d4eb2f4bdd73-kube-api-access-dn4q4\") pod \"managed-serviceaccount-addon-agent-77bbcb797c-tmpdt\" (UID: \"8118492f-8ba9-4337-84c4-d4eb2f4bdd73\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt"
Apr 28 19:17:14.337841 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.337780 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8118492f-8ba9-4337-84c4-d4eb2f4bdd73-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77bbcb797c-tmpdt\" (UID: \"8118492f-8ba9-4337-84c4-d4eb2f4bdd73\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt"
Apr 28 19:17:14.338522 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.338504 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt"]
Apr 28 19:17:14.438995 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.438961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8118492f-8ba9-4337-84c4-d4eb2f4bdd73-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77bbcb797c-tmpdt\" (UID: \"8118492f-8ba9-4337-84c4-d4eb2f4bdd73\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt"
Apr 28 19:17:14.439192 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.439025 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn4q4\" (UniqueName: \"kubernetes.io/projected/8118492f-8ba9-4337-84c4-d4eb2f4bdd73-kube-api-access-dn4q4\") pod \"managed-serviceaccount-addon-agent-77bbcb797c-tmpdt\" (UID: \"8118492f-8ba9-4337-84c4-d4eb2f4bdd73\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt"
Apr 28 19:17:14.441394 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.441375 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8118492f-8ba9-4337-84c4-d4eb2f4bdd73-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77bbcb797c-tmpdt\" (UID: \"8118492f-8ba9-4337-84c4-d4eb2f4bdd73\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt"
Apr 28 19:17:14.450455 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.450430 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn4q4\" (UniqueName: \"kubernetes.io/projected/8118492f-8ba9-4337-84c4-d4eb2f4bdd73-kube-api-access-dn4q4\") pod \"managed-serviceaccount-addon-agent-77bbcb797c-tmpdt\" (UID: \"8118492f-8ba9-4337-84c4-d4eb2f4bdd73\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt"
Apr 28 19:17:14.640706 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.640591 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt" Apr 28 19:17:14.751454 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:14.751419 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt"] Apr 28 19:17:14.754752 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:17:14.754725 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8118492f_8ba9_4337_84c4_d4eb2f4bdd73.slice/crio-4ad3ce4014bfcf2ca9a0a267cfab18623d3ed3f7080b2690b7fd41280bf6ab69 WatchSource:0}: Error finding container 4ad3ce4014bfcf2ca9a0a267cfab18623d3ed3f7080b2690b7fd41280bf6ab69: Status 404 returned error can't find the container with id 4ad3ce4014bfcf2ca9a0a267cfab18623d3ed3f7080b2690b7fd41280bf6ab69 Apr 28 19:17:15.095818 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:15.095784 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt" event={"ID":"8118492f-8ba9-4337-84c4-d4eb2f4bdd73","Type":"ContainerStarted","Data":"4ad3ce4014bfcf2ca9a0a267cfab18623d3ed3f7080b2690b7fd41280bf6ab69"} Apr 28 19:17:18.474005 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:18.473967 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d" Apr 28 19:17:18.474005 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:18.474009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg" Apr 28 19:17:18.474508 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:18.474034 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k" Apr 28 19:17:18.474508 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:18.474073 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" Apr 28 19:17:18.474508 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:18.474114 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:18.474508 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:18.474140 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:18.474508 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:18.474166 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret 
"networking-console-plugin-cert" not found Apr 28 19:17:18.474508 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:18.474180 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls podName:801789ef-975e-451d-9e18-0cb9acd739d6 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:50.474164117 +0000 UTC m=+97.225713609 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls") pod "dns-default-98b4d" (UID: "801789ef-975e-451d-9e18-0cb9acd739d6") : secret "dns-default-metrics-tls" not found Apr 28 19:17:18.474508 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:18.474183 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:17:18.474508 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:18.474199 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-745bf7dd44-hft5k: secret "image-registry-tls" not found Apr 28 19:17:18.474508 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:18.474203 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert podName:7e4b5b64-ebf0-4ee2-a43b-35098459ff73 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:50.474192508 +0000 UTC m=+97.225742001 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-x865s" (UID: "7e4b5b64-ebf0-4ee2-a43b-35098459ff73") : secret "networking-console-plugin-cert" not found Apr 28 19:17:18.474508 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:18.474217 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert podName:0d073d08-7217-4136-8485-03d574acfc52 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:50.47420975 +0000 UTC m=+97.225759243 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert") pod "ingress-canary-l8dqg" (UID: "0d073d08-7217-4136-8485-03d574acfc52") : secret "canary-serving-cert" not found Apr 28 19:17:18.474508 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:18.474242 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls podName:9622feba-01a7-434b-84bb-f677851aaa37 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:50.474229617 +0000 UTC m=+97.225779114 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls") pod "image-registry-745bf7dd44-hft5k" (UID: "9622feba-01a7-434b-84bb-f677851aaa37") : secret "image-registry-tls" not found Apr 28 19:17:19.104838 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:19.104804 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt" event={"ID":"8118492f-8ba9-4337-84c4-d4eb2f4bdd73","Type":"ContainerStarted","Data":"cbdbcf6f005fd182689dbbb020b7a85e8c9dc5f59bc1d8c0360520a80ef518a1"} Apr 28 19:17:19.580919 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:19.580886 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:17:19.581304 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:19.581057 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 28 19:17:19.581304 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:19.581137 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs podName:0b961ce3-ed85-40f4-840c-df0e74d830dd nodeName:}" failed. No retries permitted until 2026-04-28 19:18:23.581120916 +0000 UTC m=+130.332670409 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs") pod "network-metrics-daemon-qgcjb" (UID: "0b961ce3-ed85-40f4-840c-df0e74d830dd") : secret "metrics-daemon-secret" not found Apr 28 19:17:23.051957 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:23.051926 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-nc646" Apr 28 19:17:23.078056 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:23.078001 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt" podStartSLOduration=5.570783678 podStartE2EDuration="9.07798433s" podCreationTimestamp="2026-04-28 19:17:14 +0000 UTC" firstStartedPulling="2026-04-28 19:17:14.757161525 +0000 UTC m=+61.508711018" lastFinishedPulling="2026-04-28 19:17:18.264362175 +0000 UTC m=+65.015911670" observedRunningTime="2026-04-28 19:17:19.133974995 +0000 UTC m=+65.885524509" watchObservedRunningTime="2026-04-28 19:17:23.07798433 +0000 UTC m=+69.829533827" Apr 28 19:17:50.515200 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:50.515052 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k" Apr 28 19:17:50.515764 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:50.515211 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:17:50.515764 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:50.515237 2570 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-745bf7dd44-hft5k: secret "image-registry-tls" not found Apr 28 19:17:50.515764 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:50.515220 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" Apr 28 19:17:50.515764 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:50.515285 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls podName:9622feba-01a7-434b-84bb-f677851aaa37 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:54.515270662 +0000 UTC m=+161.266820156 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls") pod "image-registry-745bf7dd44-hft5k" (UID: "9622feba-01a7-434b-84bb-f677851aaa37") : secret "image-registry-tls" not found Apr 28 19:17:50.515764 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:50.515307 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:17:50.515764 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:50.515354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d" Apr 28 19:17:50.515764 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:50.515378 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert podName:7e4b5b64-ebf0-4ee2-a43b-35098459ff73 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:54.515361818 +0000 UTC m=+161.266911313 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-x865s" (UID: "7e4b5b64-ebf0-4ee2-a43b-35098459ff73") : secret "networking-console-plugin-cert" not found Apr 28 19:17:50.515764 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:50.515417 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:50.515764 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:17:50.515422 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg" Apr 28 19:17:50.515764 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:50.515479 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls podName:801789ef-975e-451d-9e18-0cb9acd739d6 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:18:54.515463473 +0000 UTC m=+161.267012970 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls") pod "dns-default-98b4d" (UID: "801789ef-975e-451d-9e18-0cb9acd739d6") : secret "dns-default-metrics-tls" not found Apr 28 19:17:50.515764 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:50.515495 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:50.515764 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:17:50.515535 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert podName:0d073d08-7217-4136-8485-03d574acfc52 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:54.515526645 +0000 UTC m=+161.267076143 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert") pod "ingress-canary-l8dqg" (UID: "0d073d08-7217-4136-8485-03d574acfc52") : secret "canary-serving-cert" not found Apr 28 19:18:23.658393 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:23.658349 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:18:23.658924 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:23.658481 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 28 19:18:23.658924 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:23.658545 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs podName:0b961ce3-ed85-40f4-840c-df0e74d830dd nodeName:}" failed. No retries permitted until 2026-04-28 19:20:25.658530537 +0000 UTC m=+252.410080030 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs") pod "network-metrics-daemon-qgcjb" (UID: "0b961ce3-ed85-40f4-840c-df0e74d830dd") : secret "metrics-daemon-secret" not found Apr 28 19:18:49.586128 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.586091 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cqswc"] Apr 28 19:18:49.587931 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.587915 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cqswc" Apr 28 19:18:49.590171 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.590146 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz"] Apr 28 19:18:49.590483 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.590461 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 28 19:18:49.590569 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.590482 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:18:49.590569 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.590520 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-vmz2l\"" Apr 28 19:18:49.592102 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.592086 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:18:49.594915 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.594886 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 28 19:18:49.595016 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.594913 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-6s9tg\"" Apr 28 19:18:49.595016 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.594937 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:18:49.595016 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.594981 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 28 19:18:49.596067 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.596044 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75"] Apr 28 19:18:49.597584 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.597564 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" Apr 28 19:18:49.602459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.602259 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:18:49.602459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.602273 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 28 19:18:49.602459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.602386 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 28 19:18:49.602677 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.602575 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-wbgjw\"" Apr 28 19:18:49.603027 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.603006 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-d599bf786-52gkb"] Apr 28 19:18:49.603220 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.603200 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 28 19:18:49.604820 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.604802 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cqswc"] Apr 28 19:18:49.604897 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.604886 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.609582 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.609562 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 28 19:18:49.609689 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.609573 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 28 19:18:49.610522 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.610505 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 28 19:18:49.610836 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.610817 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 28 19:18:49.610909 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.610819 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 28 19:18:49.611134 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.611119 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-42ljl\"" Apr 28 19:18:49.611207 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.611188 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 28 19:18:49.611831 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.611813 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz"] Apr 28 19:18:49.621236 
ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:49.621210 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" podUID="7e4b5b64-ebf0-4ee2-a43b-35098459ff73" Apr 28 19:18:49.621316 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.621304 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75"] Apr 28 19:18:49.628678 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:49.628655 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-745bf7dd44-hft5k" podUID="9622feba-01a7-434b-84bb-f677851aaa37" Apr 28 19:18:49.634068 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.634049 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d599bf786-52gkb"] Apr 28 19:18:49.636371 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:49.636345 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-98b4d" podUID="801789ef-975e-451d-9e18-0cb9acd739d6" Apr 28 19:18:49.645331 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:49.645306 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-l8dqg" podUID="0d073d08-7217-4136-8485-03d574acfc52" Apr 28 19:18:49.645427 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.645356 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c847697-21da-48aa-9af0-aa0cf114d47c-config\") pod \"service-ca-operator-d6fc45fc5-6bn75\" (UID: \"1c847697-21da-48aa-9af0-aa0cf114d47c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" Apr 28 19:18:49.645427 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.645391 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-default-certificate\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.645503 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.645468 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfchn\" (UniqueName: \"kubernetes.io/projected/6934230f-b20b-48da-8237-e7583390094c-kube-api-access-kfchn\") pod \"cluster-samples-operator-6dc5bdb6b4-pp2lz\" (UID: \"6934230f-b20b-48da-8237-e7583390094c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:18:49.645542 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.645516 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c847697-21da-48aa-9af0-aa0cf114d47c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-6bn75\" (UID: 
\"1c847697-21da-48aa-9af0-aa0cf114d47c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" Apr 28 19:18:49.645632 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.645542 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvhsg\" (UniqueName: \"kubernetes.io/projected/1c847697-21da-48aa-9af0-aa0cf114d47c-kube-api-access-tvhsg\") pod \"service-ca-operator-d6fc45fc5-6bn75\" (UID: \"1c847697-21da-48aa-9af0-aa0cf114d47c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" Apr 28 19:18:49.645632 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.645572 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pp2lz\" (UID: \"6934230f-b20b-48da-8237-e7583390094c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:18:49.645632 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.645589 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t99c\" (UniqueName: \"kubernetes.io/projected/c3b8f63d-3fda-4251-a306-6730ed6ac6d6-kube-api-access-7t99c\") pod \"volume-data-source-validator-7c6cbb6c87-cqswc\" (UID: \"c3b8f63d-3fda-4251-a306-6730ed6ac6d6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cqswc" Apr 28 19:18:49.645787 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.645633 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.645787 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.645650 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.645787 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.645668 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-stats-auth\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.645787 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.645709 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6rrm\" (UniqueName: \"kubernetes.io/projected/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-kube-api-access-f6rrm\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.746178 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.746142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pp2lz\" (UID: \"6934230f-b20b-48da-8237-e7583390094c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:18:49.746178 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.746180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7t99c\" (UniqueName: \"kubernetes.io/projected/c3b8f63d-3fda-4251-a306-6730ed6ac6d6-kube-api-access-7t99c\") pod \"volume-data-source-validator-7c6cbb6c87-cqswc\" (UID: \"c3b8f63d-3fda-4251-a306-6730ed6ac6d6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cqswc" Apr 28 19:18:49.746383 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.746201 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.746383 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.746231 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.746383 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.746257 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-stats-auth\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.746383 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.746281 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6rrm\" (UniqueName: \"kubernetes.io/projected/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-kube-api-access-f6rrm\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.746383 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:49.746339 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle podName:78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a nodeName:}" failed. No retries permitted until 2026-04-28 19:18:50.246317502 +0000 UTC m=+156.997867016 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle") pod "router-default-d599bf786-52gkb" (UID: "78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a") : configmap references non-existent config key: service-ca.crt Apr 28 19:18:49.746383 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:49.746357 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:18:49.746383 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:49.746378 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:18:49.746685 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:49.746427 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls podName:6934230f-b20b-48da-8237-e7583390094c nodeName:}" failed. No retries permitted until 2026-04-28 19:18:50.246409636 +0000 UTC m=+156.997959141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pp2lz" (UID: "6934230f-b20b-48da-8237-e7583390094c") : secret "samples-operator-tls" not found Apr 28 19:18:49.746685 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:49.746446 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs podName:78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a nodeName:}" failed. No retries permitted until 2026-04-28 19:18:50.246435829 +0000 UTC m=+156.997985325 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs") pod "router-default-d599bf786-52gkb" (UID: "78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a") : secret "router-metrics-certs-default" not found Apr 28 19:18:49.746685 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.746366 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c847697-21da-48aa-9af0-aa0cf114d47c-config\") pod \"service-ca-operator-d6fc45fc5-6bn75\" (UID: \"1c847697-21da-48aa-9af0-aa0cf114d47c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" Apr 28 19:18:49.746685 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.746484 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-default-certificate\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.746685 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.746555 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfchn\" (UniqueName: \"kubernetes.io/projected/6934230f-b20b-48da-8237-e7583390094c-kube-api-access-kfchn\") pod \"cluster-samples-operator-6dc5bdb6b4-pp2lz\" (UID: \"6934230f-b20b-48da-8237-e7583390094c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:18:49.746685 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.746653 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c847697-21da-48aa-9af0-aa0cf114d47c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-6bn75\" (UID: \"1c847697-21da-48aa-9af0-aa0cf114d47c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" Apr 28 19:18:49.746913 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.746689 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvhsg\" (UniqueName: \"kubernetes.io/projected/1c847697-21da-48aa-9af0-aa0cf114d47c-kube-api-access-tvhsg\") pod \"service-ca-operator-d6fc45fc5-6bn75\" (UID: \"1c847697-21da-48aa-9af0-aa0cf114d47c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" Apr 28 19:18:49.747872 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.747844 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c847697-21da-48aa-9af0-aa0cf114d47c-config\") pod \"service-ca-operator-d6fc45fc5-6bn75\" (UID: \"1c847697-21da-48aa-9af0-aa0cf114d47c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" Apr 28 19:18:49.749346 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.749316 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-stats-auth\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.749651 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.749588 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1c847697-21da-48aa-9af0-aa0cf114d47c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-6bn75\" (UID: \"1c847697-21da-48aa-9af0-aa0cf114d47c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" Apr 28 19:18:49.751867 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.751843 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-default-certificate\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.764584 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.764554 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t99c\" (UniqueName: \"kubernetes.io/projected/c3b8f63d-3fda-4251-a306-6730ed6ac6d6-kube-api-access-7t99c\") pod \"volume-data-source-validator-7c6cbb6c87-cqswc\" (UID: \"c3b8f63d-3fda-4251-a306-6730ed6ac6d6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cqswc" Apr 28 19:18:49.768000 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.767973 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6rrm\" (UniqueName: \"kubernetes.io/projected/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-kube-api-access-f6rrm\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:49.768879 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.768857 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvhsg\" (UniqueName: \"kubernetes.io/projected/1c847697-21da-48aa-9af0-aa0cf114d47c-kube-api-access-tvhsg\") pod \"service-ca-operator-d6fc45fc5-6bn75\" (UID: \"1c847697-21da-48aa-9af0-aa0cf114d47c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" Apr 28 19:18:49.769230 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.769211 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfchn\" (UniqueName: \"kubernetes.io/projected/6934230f-b20b-48da-8237-e7583390094c-kube-api-access-kfchn\") pod \"cluster-samples-operator-6dc5bdb6b4-pp2lz\" (UID: \"6934230f-b20b-48da-8237-e7583390094c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:18:49.897929 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.897849 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cqswc" Apr 28 19:18:49.908789 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:49.908757 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-qgcjb" podUID="0b961ce3-ed85-40f4-840c-df0e74d830dd" Apr 28 19:18:49.910835 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:49.910816 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" Apr 28 19:18:50.024491 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:50.024449 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cqswc"] Apr 28 19:18:50.027212 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:18:50.027182 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3b8f63d_3fda_4251_a306_6730ed6ac6d6.slice/crio-8110ac91e80977063f727621f696e2a583ca2f5782d184bef05c9ecb3c1df2f5 WatchSource:0}: Error finding container 8110ac91e80977063f727621f696e2a583ca2f5782d184bef05c9ecb3c1df2f5: Status 404 returned error can't find the container with id 8110ac91e80977063f727621f696e2a583ca2f5782d184bef05c9ecb3c1df2f5 Apr 28 19:18:50.039873 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:50.039850 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75"] Apr 28 19:18:50.042513 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:18:50.042486 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c847697_21da_48aa_9af0_aa0cf114d47c.slice/crio-725c46d9e3ac3d6dc12a7cfb07087f1ae96c949845fbabad1ba4f23f7c9af369 WatchSource:0}: Error finding container 725c46d9e3ac3d6dc12a7cfb07087f1ae96c949845fbabad1ba4f23f7c9af369: Status 404 returned error can't find the container with id 725c46d9e3ac3d6dc12a7cfb07087f1ae96c949845fbabad1ba4f23f7c9af369 Apr 28 19:18:50.251398 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:50.251304 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pp2lz\" (UID: \"6934230f-b20b-48da-8237-e7583390094c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:18:50.251398 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:50.251343 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:50.251398 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:50.251369 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:50.251632 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:50.251451 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:18:50.251632 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:50.251462 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:18:50.251632 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:50.251500 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs podName:78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a nodeName:}" failed. No retries permitted until 2026-04-28 19:18:51.251486962 +0000 UTC m=+158.003036455 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs") pod "router-default-d599bf786-52gkb" (UID: "78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a") : secret "router-metrics-certs-default" not found Apr 28 19:18:50.251632 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:50.251514 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle podName:78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a nodeName:}" failed. No retries permitted until 2026-04-28 19:18:51.2515078 +0000 UTC m=+158.003057293 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle") pod "router-default-d599bf786-52gkb" (UID: "78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a") : configmap references non-existent config key: service-ca.crt Apr 28 19:18:50.251632 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:50.251524 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls podName:6934230f-b20b-48da-8237-e7583390094c nodeName:}" failed. No retries permitted until 2026-04-28 19:18:51.251518954 +0000 UTC m=+158.003068446 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pp2lz" (UID: "6934230f-b20b-48da-8237-e7583390094c") : secret "samples-operator-tls" not found Apr 28 19:18:50.278525 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:50.278496 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" event={"ID":"1c847697-21da-48aa-9af0-aa0cf114d47c","Type":"ContainerStarted","Data":"725c46d9e3ac3d6dc12a7cfb07087f1ae96c949845fbabad1ba4f23f7c9af369"} Apr 28 19:18:50.279506 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:50.279487 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l8dqg" Apr 28 19:18:50.279506 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:50.279496 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cqswc" event={"ID":"c3b8f63d-3fda-4251-a306-6730ed6ac6d6","Type":"ContainerStarted","Data":"8110ac91e80977063f727621f696e2a583ca2f5782d184bef05c9ecb3c1df2f5"} Apr 28 19:18:50.279625 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:50.279512 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" Apr 28 19:18:50.279625 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:50.279518 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-745bf7dd44-hft5k" Apr 28 19:18:50.279625 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:50.279599 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-98b4d" Apr 28 19:18:51.259767 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:51.259721 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pp2lz\" (UID: \"6934230f-b20b-48da-8237-e7583390094c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:18:51.260272 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:51.259783 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:51.260272 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:51.259817 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:51.260272 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:51.259885 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:18:51.260272 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:51.259973 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:18:51.260272 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:51.259988 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls podName:6934230f-b20b-48da-8237-e7583390094c nodeName:}" failed. No retries permitted until 2026-04-28 19:18:53.259967381 +0000 UTC m=+160.011516877 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pp2lz" (UID: "6934230f-b20b-48da-8237-e7583390094c") : secret "samples-operator-tls" not found Apr 28 19:18:51.260272 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:51.260007 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle podName:78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a nodeName:}" failed. No retries permitted until 2026-04-28 19:18:53.260000173 +0000 UTC m=+160.011549665 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle") pod "router-default-d599bf786-52gkb" (UID: "78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a") : configmap references non-existent config key: service-ca.crt Apr 28 19:18:51.260272 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:51.260019 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs podName:78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a nodeName:}" failed. 
No retries permitted until 2026-04-28 19:18:53.260012857 +0000 UTC m=+160.011562349 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs") pod "router-default-d599bf786-52gkb" (UID: "78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a") : secret "router-metrics-certs-default" not found Apr 28 19:18:52.284449 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:52.284412 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cqswc" event={"ID":"c3b8f63d-3fda-4251-a306-6730ed6ac6d6","Type":"ContainerStarted","Data":"c38683b0cf294f8784841b2283e49ed0967b7fde3cf47da64342b7743d0bb93d"} Apr 28 19:18:52.308618 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:52.308546 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cqswc" podStartSLOduration=2.002381746 podStartE2EDuration="3.308533576s" podCreationTimestamp="2026-04-28 19:18:49 +0000 UTC" firstStartedPulling="2026-04-28 19:18:50.029059846 +0000 UTC m=+156.780609340" lastFinishedPulling="2026-04-28 19:18:51.335211663 +0000 UTC m=+158.086761170" observedRunningTime="2026-04-28 19:18:52.307617349 +0000 UTC m=+159.059166872" watchObservedRunningTime="2026-04-28 19:18:52.308533576 +0000 UTC m=+159.060083145" Apr 28 19:18:53.276487 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:53.276447 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pp2lz\" (UID: \"6934230f-b20b-48da-8237-e7583390094c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:18:53.276650 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:53.276503 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:53.276650 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:53.276535 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:53.276741 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:53.276638 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:18:53.276741 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:53.276695 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle podName:78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a nodeName:}" failed. No retries permitted until 2026-04-28 19:18:57.276676664 +0000 UTC m=+164.028226157 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle") pod "router-default-d599bf786-52gkb" (UID: "78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a") : configmap references non-existent config key: service-ca.crt Apr 28 19:18:53.276741 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:53.276714 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls podName:6934230f-b20b-48da-8237-e7583390094c nodeName:}" failed. No retries permitted until 2026-04-28 19:18:57.276708219 +0000 UTC m=+164.028257712 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pp2lz" (UID: "6934230f-b20b-48da-8237-e7583390094c") : secret "samples-operator-tls" not found Apr 28 19:18:53.276741 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:53.276732 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:18:53.276907 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:53.276788 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs podName:78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a nodeName:}" failed. No retries permitted until 2026-04-28 19:18:57.276772709 +0000 UTC m=+164.028322205 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs") pod "router-default-d599bf786-52gkb" (UID: "78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a") : secret "router-metrics-certs-default" not found Apr 28 19:18:53.288403 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:53.288364 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" event={"ID":"1c847697-21da-48aa-9af0-aa0cf114d47c","Type":"ContainerStarted","Data":"fc9e12b1db509a19c62237b47590685b7f9bb70a9653768a9bc88589fe8749ba"} Apr 28 19:18:53.316312 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:53.316259 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" podStartSLOduration=1.6393956269999999 podStartE2EDuration="4.316247035s" podCreationTimestamp="2026-04-28 19:18:49 +0000 UTC" firstStartedPulling="2026-04-28 19:18:50.044277509 +0000 UTC m=+156.795827002" lastFinishedPulling="2026-04-28 19:18:52.721128903 +0000 UTC m=+159.472678410" observedRunningTime="2026-04-28 19:18:53.314653439 +0000 UTC m=+160.066202948" watchObservedRunningTime="2026-04-28 19:18:53.316247035 +0000 UTC m=+160.067796546" Apr 28 19:18:53.804865 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:53.804828 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8fxnw"] Apr 28 19:18:53.806656 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:53.806639 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8fxnw" Apr 28 19:18:53.809273 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:53.809253 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-9gr7k\"" Apr 28 19:18:53.824536 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:53.824510 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8fxnw"] Apr 28 19:18:53.882034 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:53.882005 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdm2h\" (UniqueName: \"kubernetes.io/projected/25df0748-bd29-48db-925e-d566aa27fa14-kube-api-access-gdm2h\") pod \"network-check-source-8894fc9bd-8fxnw\" (UID: \"25df0748-bd29-48db-925e-d566aa27fa14\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8fxnw" Apr 28 19:18:53.983141 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:53.983098 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdm2h\" (UniqueName: \"kubernetes.io/projected/25df0748-bd29-48db-925e-d566aa27fa14-kube-api-access-gdm2h\") pod \"network-check-source-8894fc9bd-8fxnw\" (UID: \"25df0748-bd29-48db-925e-d566aa27fa14\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8fxnw" Apr 28 19:18:54.000750 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:54.000711 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdm2h\" (UniqueName: \"kubernetes.io/projected/25df0748-bd29-48db-925e-d566aa27fa14-kube-api-access-gdm2h\") pod \"network-check-source-8894fc9bd-8fxnw\" (UID: \"25df0748-bd29-48db-925e-d566aa27fa14\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8fxnw" Apr 28 19:18:54.119040 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:54.118955 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8fxnw" Apr 28 19:18:54.245391 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:54.245365 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8fxnw"] Apr 28 19:18:54.247737 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:18:54.247707 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25df0748_bd29_48db_925e_d566aa27fa14.slice/crio-e1eee86de733040488552cb72c84405c89878c687b00490d6447270922b2efe1 WatchSource:0}: Error finding container e1eee86de733040488552cb72c84405c89878c687b00490d6447270922b2efe1: Status 404 returned error can't find the container with id e1eee86de733040488552cb72c84405c89878c687b00490d6447270922b2efe1 Apr 28 19:18:54.291894 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:54.291867 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8fxnw" event={"ID":"25df0748-bd29-48db-925e-d566aa27fa14","Type":"ContainerStarted","Data":"e1eee86de733040488552cb72c84405c89878c687b00490d6447270922b2efe1"} Apr 28 19:18:54.588214 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:54.588169 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" Apr 28 19:18:54.588414 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:54.588236 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d" Apr 28 19:18:54.588414 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:54.588268 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg" Apr 28 19:18:54.588414 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:54.588296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls\") pod \"image-registry-745bf7dd44-hft5k\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " pod="openshift-image-registry/image-registry-745bf7dd44-hft5k" Apr 28 19:18:54.588414 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:54.588317 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:18:54.588414 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:54.588368 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:18:54.588414 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:54.588397 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert podName:7e4b5b64-ebf0-4ee2-a43b-35098459ff73 nodeName:}" failed. No retries permitted until 2026-04-28 19:20:56.588375538 +0000 UTC m=+283.339925031 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-x865s" (UID: "7e4b5b64-ebf0-4ee2-a43b-35098459ff73") : secret "networking-console-plugin-cert" not found Apr 28 19:18:54.588414 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:54.588403 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:18:54.588730 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:54.588436 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:18:54.588730 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:54.588450 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-745bf7dd44-hft5k: secret "image-registry-tls" not found Apr 28 19:18:54.588730 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:54.588414 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls podName:801789ef-975e-451d-9e18-0cb9acd739d6 nodeName:}" failed. No retries permitted until 2026-04-28 19:20:56.588407115 +0000 UTC m=+283.339956608 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls") pod "dns-default-98b4d" (UID: "801789ef-975e-451d-9e18-0cb9acd739d6") : secret "dns-default-metrics-tls" not found Apr 28 19:18:54.588730 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:54.588479 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert podName:0d073d08-7217-4136-8485-03d574acfc52 nodeName:}" failed. No retries permitted until 2026-04-28 19:20:56.5884616 +0000 UTC m=+283.340011099 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert") pod "ingress-canary-l8dqg" (UID: "0d073d08-7217-4136-8485-03d574acfc52") : secret "canary-serving-cert" not found Apr 28 19:18:54.588730 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:54.588496 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls podName:9622feba-01a7-434b-84bb-f677851aaa37 nodeName:}" failed. No retries permitted until 2026-04-28 19:20:56.588485434 +0000 UTC m=+283.340034936 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls") pod "image-registry-745bf7dd44-hft5k" (UID: "9622feba-01a7-434b-84bb-f677851aaa37") : secret "image-registry-tls" not found Apr 28 19:18:55.296158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:55.296123 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8fxnw" event={"ID":"25df0748-bd29-48db-925e-d566aa27fa14","Type":"ContainerStarted","Data":"cb4ea53a4475979d32fed33daeb2d2dba62381cd45d254446a327853cda9895a"} Apr 28 19:18:55.319932 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:55.319877 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8fxnw" podStartSLOduration=2.319862113 podStartE2EDuration="2.319862113s" podCreationTimestamp="2026-04-28 19:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:18:55.319563402 +0000 UTC m=+162.071112917" watchObservedRunningTime="2026-04-28 19:18:55.319862113 +0000 UTC m=+162.071411627" Apr 28 19:18:56.420621 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:56.420588 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pmpq5_68ecdbce-5bdd-408b-bef6-91e797899886/dns-node-resolver/0.log" Apr 28 19:18:57.315188 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:57.315154 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pp2lz\" (UID: \"6934230f-b20b-48da-8237-e7583390094c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:18:57.315188 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:57.315190 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:57.315399 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:57.315212 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:18:57.315399 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:57.315324 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle podName:78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a nodeName:}" failed. No retries permitted until 2026-04-28 19:19:05.315306444 +0000 UTC m=+172.066855936 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle") pod "router-default-d599bf786-52gkb" (UID: "78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a") : configmap references non-existent config key: service-ca.crt Apr 28 19:18:57.315399 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:57.315324 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:18:57.315501 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:57.315424 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls podName:6934230f-b20b-48da-8237-e7583390094c nodeName:}" failed. No retries permitted until 2026-04-28 19:19:05.315409233 +0000 UTC m=+172.066958733 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pp2lz" (UID: "6934230f-b20b-48da-8237-e7583390094c") : secret "samples-operator-tls" not found Apr 28 19:18:57.315501 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:57.315329 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:18:57.315501 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:18:57.315460 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs podName:78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a nodeName:}" failed. No retries permitted until 2026-04-28 19:19:05.315450921 +0000 UTC m=+172.067000418 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs") pod "router-default-d599bf786-52gkb" (UID: "78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a") : secret "router-metrics-certs-default" not found Apr 28 19:18:57.421589 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:18:57.421565 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nt8t9_becc1ca0-0b4e-43d1-95d2-8979c6dd35fd/node-ca/0.log" Apr 28 19:19:03.884047 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:03.883971 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:19:05.386303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:05.386237 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pp2lz\" (UID: \"6934230f-b20b-48da-8237-e7583390094c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:19:05.386303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:05.386308 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:19:05.386847 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:05.386339 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:19:05.386847 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:19:05.386469 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle podName:78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a nodeName:}" failed. No retries permitted until 2026-04-28 19:19:21.386450546 +0000 UTC m=+188.138000039 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle") pod "router-default-d599bf786-52gkb" (UID: "78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a") : configmap references non-existent config key: service-ca.crt Apr 28 19:19:05.388657 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:05.388632 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-metrics-certs\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:19:05.388762 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:05.388746 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6934230f-b20b-48da-8237-e7583390094c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pp2lz\" (UID: \"6934230f-b20b-48da-8237-e7583390094c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:19:05.504688 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:05.504651 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" Apr 28 19:19:05.619085 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:05.618986 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz"] Apr 28 19:19:06.325158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:06.325116 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" event={"ID":"6934230f-b20b-48da-8237-e7583390094c","Type":"ContainerStarted","Data":"3e2191db242f53179539ef7e45016fc0b266c6eea9d7eab0d4efcd95890d19e3"} Apr 28 19:19:08.331940 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:08.331904 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" event={"ID":"6934230f-b20b-48da-8237-e7583390094c","Type":"ContainerStarted","Data":"ab641cc2bf85bf7e4e96141b3d328b9c523f2d5dc4437ade3bd4a399a84abefb"} Apr 28 19:19:08.331940 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:08.331942 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" event={"ID":"6934230f-b20b-48da-8237-e7583390094c","Type":"ContainerStarted","Data":"0b8ff286e5c14a837d339d25cb84dd2a7de4d29de81ffe6ee7a155fdd24c531b"} Apr 28 19:19:08.350201 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:08.350152 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pp2lz" podStartSLOduration=17.709901839 podStartE2EDuration="19.350138381s" podCreationTimestamp="2026-04-28 19:18:49 +0000 UTC" firstStartedPulling="2026-04-28 19:19:05.664850243 +0000 UTC m=+172.416399736" lastFinishedPulling="2026-04-28 19:19:07.305086782 +0000 UTC m=+174.056636278" observedRunningTime="2026-04-28 19:19:08.350089108 +0000 UTC m=+175.101638660" watchObservedRunningTime="2026-04-28 19:19:08.350138381 +0000 UTC m=+175.101687895" Apr 28 19:19:17.562753 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.562715 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6t8k2"] Apr 28 19:19:17.564917 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.564900 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.576736 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.576717 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-v5snm\"" Apr 28 19:19:17.577682 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.577666 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 28 19:19:17.577682 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.577677 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 28 19:19:17.577776 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.577696 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 28 19:19:17.577776 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.577738 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 28 19:19:17.588531 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.588509 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6t8k2"] Apr 28 19:19:17.687745 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.687705 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6a46589-6d43-4418-acea-0a5d676870f3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.687904 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.687760 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e6a46589-6d43-4418-acea-0a5d676870f3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.687904 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.687815 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw8cz\" (UniqueName: \"kubernetes.io/projected/e6a46589-6d43-4418-acea-0a5d676870f3-kube-api-access-sw8cz\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.687904 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.687865 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e6a46589-6d43-4418-acea-0a5d676870f3-data-volume\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.687904 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.687900 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e6a46589-6d43-4418-acea-0a5d676870f3-crio-socket\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " 
pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.789165 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.789128 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6a46589-6d43-4418-acea-0a5d676870f3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.789344 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.789180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e6a46589-6d43-4418-acea-0a5d676870f3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.789344 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.789277 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sw8cz\" (UniqueName: \"kubernetes.io/projected/e6a46589-6d43-4418-acea-0a5d676870f3-kube-api-access-sw8cz\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.789344 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.789335 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e6a46589-6d43-4418-acea-0a5d676870f3-data-volume\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.789522 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.789360 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e6a46589-6d43-4418-acea-0a5d676870f3-crio-socket\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.789522 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.789460 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e6a46589-6d43-4418-acea-0a5d676870f3-crio-socket\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.789801 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.789780 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e6a46589-6d43-4418-acea-0a5d676870f3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.790233 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.790218 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e6a46589-6d43-4418-acea-0a5d676870f3-data-volume\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.791600 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.791577 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6a46589-6d43-4418-acea-0a5d676870f3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.801925 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.801898 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw8cz\" (UniqueName: \"kubernetes.io/projected/e6a46589-6d43-4418-acea-0a5d676870f3-kube-api-access-sw8cz\") pod \"insights-runtime-extractor-6t8k2\" (UID: \"e6a46589-6d43-4418-acea-0a5d676870f3\") " pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.873642 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.873512 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6t8k2" Apr 28 19:19:17.992677 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:17.992654 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6t8k2"] Apr 28 19:19:17.995187 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:19:17.995154 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6a46589_6d43_4418_acea_0a5d676870f3.slice/crio-509ae9c884bb0d4afa5654ac0530cf067be55d2b3f867f361fb7b313ec1a4c32 WatchSource:0}: Error finding container 509ae9c884bb0d4afa5654ac0530cf067be55d2b3f867f361fb7b313ec1a4c32: Status 404 returned error can't find the container with id 509ae9c884bb0d4afa5654ac0530cf067be55d2b3f867f361fb7b313ec1a4c32 Apr 28 19:19:18.356916 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:18.356883 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6t8k2" event={"ID":"e6a46589-6d43-4418-acea-0a5d676870f3","Type":"ContainerStarted","Data":"b7e1719690bd9af6cd29dd75012cff3916f2cc374fffc332a2bc8dd0db691d41"} Apr 28 19:19:18.356916 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:18.356919 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6t8k2" event={"ID":"e6a46589-6d43-4418-acea-0a5d676870f3","Type":"ContainerStarted","Data":"509ae9c884bb0d4afa5654ac0530cf067be55d2b3f867f361fb7b313ec1a4c32"} Apr 28 19:19:19.360834 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:19.360803 2570 generic.go:358] "Generic (PLEG): container finished" podID="8118492f-8ba9-4337-84c4-d4eb2f4bdd73" containerID="cbdbcf6f005fd182689dbbb020b7a85e8c9dc5f59bc1d8c0360520a80ef518a1" exitCode=255 Apr 28 19:19:19.361287 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:19.360883 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt" event={"ID":"8118492f-8ba9-4337-84c4-d4eb2f4bdd73","Type":"ContainerDied","Data":"cbdbcf6f005fd182689dbbb020b7a85e8c9dc5f59bc1d8c0360520a80ef518a1"} Apr 28 19:19:19.362617 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:19.362571 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6t8k2" event={"ID":"e6a46589-6d43-4418-acea-0a5d676870f3","Type":"ContainerStarted","Data":"0511cbf6924b819994a86ba39b2318fe52e0d643293ce9c214d40313e1a6dd8b"} Apr 28 19:19:19.368064 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:19.368035 2570 scope.go:117] 
"RemoveContainer" containerID="cbdbcf6f005fd182689dbbb020b7a85e8c9dc5f59bc1d8c0360520a80ef518a1" Apr 28 19:19:20.366948 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:20.366911 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6t8k2" event={"ID":"e6a46589-6d43-4418-acea-0a5d676870f3","Type":"ContainerStarted","Data":"4a9230049a906f7a1441307a7185303793e3e2bb8a99db8162d1f9186caaff1e"} Apr 28 19:19:20.368289 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:20.368266 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77bbcb797c-tmpdt" event={"ID":"8118492f-8ba9-4337-84c4-d4eb2f4bdd73","Type":"ContainerStarted","Data":"7b70b46d1e4187bd756b8de7e0ceb2fb7073d44bfabfdc1b59a9382cba831dfa"} Apr 28 19:19:20.393188 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:20.393141 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6t8k2" podStartSLOduration=1.598728586 podStartE2EDuration="3.393127724s" podCreationTimestamp="2026-04-28 19:19:17 +0000 UTC" firstStartedPulling="2026-04-28 19:19:18.056239927 +0000 UTC m=+184.807789421" lastFinishedPulling="2026-04-28 19:19:19.850639064 +0000 UTC m=+186.602188559" observedRunningTime="2026-04-28 19:19:20.392004303 +0000 UTC m=+187.143553815" watchObservedRunningTime="2026-04-28 19:19:20.393127724 +0000 UTC m=+187.144677293" Apr 28 19:19:21.419661 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:21.419589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:19:21.420781 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:21.420759 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a-service-ca-bundle\") pod \"router-default-d599bf786-52gkb\" (UID: \"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a\") " pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:19:21.719307 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:21.719219 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-42ljl\"" Apr 28 19:19:21.727618 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:21.727586 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:19:21.882704 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:19:21.882669 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78cae5ab_43f1_4ebe_bc59_bac6aeca7b9a.slice/crio-d1444720a87547bd1ac832a2f115af0bd03c9976ccc43381b13d865dbbe33c31 WatchSource:0}: Error finding container d1444720a87547bd1ac832a2f115af0bd03c9976ccc43381b13d865dbbe33c31: Status 404 returned error can't find the container with id d1444720a87547bd1ac832a2f115af0bd03c9976ccc43381b13d865dbbe33c31 Apr 28 19:19:21.886313 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:21.886287 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d599bf786-52gkb"] Apr 28 19:19:22.374446 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:22.374403 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d599bf786-52gkb" event={"ID":"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a","Type":"ContainerStarted","Data":"8697f22ac800401893ef822933a2afe562d7cf748819617ac0bacb144a09cdcf"} Apr 28 19:19:22.374446 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:22.374450 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d599bf786-52gkb" event={"ID":"78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a","Type":"ContainerStarted","Data":"d1444720a87547bd1ac832a2f115af0bd03c9976ccc43381b13d865dbbe33c31"} Apr 28 19:19:22.411375 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:22.411300 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-d599bf786-52gkb" podStartSLOduration=33.411286239 podStartE2EDuration="33.411286239s" podCreationTimestamp="2026-04-28 19:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:19:22.411170121 +0000 UTC m=+189.162719647" watchObservedRunningTime="2026-04-28 19:19:22.411286239 +0000 UTC m=+189.162835753" Apr 28 19:19:22.727926 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:22.727822 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:19:22.730391 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:22.730365 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:19:23.376990 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:23.376956 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:19:23.378257 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:23.378234 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-d599bf786-52gkb" Apr 28 19:19:29.051497 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.051468 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c6b857bd4-zkbhf"] Apr 28 19:19:29.054354 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.054336 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.057641 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.057595 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 28 19:19:29.058214 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.058197 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-c9kx6\"" Apr 28 19:19:29.058696 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.058672 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 28 19:19:29.058827 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.058712 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 28 19:19:29.058827 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.058759 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 28 19:19:29.058827 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.058776 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 28 19:19:29.059251 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.059235 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 28 19:19:29.059340 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.059271 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 28 19:19:29.064057 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.064036 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 28 19:19:29.068644 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.068622 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6b857bd4-zkbhf"] Apr 28 19:19:29.076739 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.076714 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-oauth-serving-cert\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.076853 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.076747 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-console-config\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.076853 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.076772 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-trusted-ca-bundle\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.076935 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.076909 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-service-ca\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.077022 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.077005 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef5b7c5-142f-4343-9493-0b064c362185-console-serving-cert\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.077057 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.077034 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlbdp\" (UniqueName: \"kubernetes.io/projected/5ef5b7c5-142f-4343-9493-0b064c362185-kube-api-access-mlbdp\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.077092 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.077085 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ef5b7c5-142f-4343-9493-0b064c362185-console-oauth-config\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.177849 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.177812 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-service-ca\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.178051 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.177889 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef5b7c5-142f-4343-9493-0b064c362185-console-serving-cert\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.178051 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.177913 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlbdp\" (UniqueName: \"kubernetes.io/projected/5ef5b7c5-142f-4343-9493-0b064c362185-kube-api-access-mlbdp\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.178051 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.177964 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ef5b7c5-142f-4343-9493-0b064c362185-console-oauth-config\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.178217 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.178148 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-oauth-serving-cert\") pod 
\"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.178217 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.178183 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-console-config\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.178217 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.178207 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-trusted-ca-bundle\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.178628 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.178583 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-service-ca\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.178882 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.178857 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-oauth-serving-cert\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.178929 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.178915 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-console-config\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.179187 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.179169 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-trusted-ca-bundle\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.180352 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.180329 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ef5b7c5-142f-4343-9493-0b064c362185-console-oauth-config\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.180456 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.180440 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef5b7c5-142f-4343-9493-0b064c362185-console-serving-cert\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.187230 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.187205 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlbdp\" 
(UniqueName: \"kubernetes.io/projected/5ef5b7c5-142f-4343-9493-0b064c362185-kube-api-access-mlbdp\") pod \"console-5c6b857bd4-zkbhf\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.363871 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.363772 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:19:29.481960 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:29.481935 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6b857bd4-zkbhf"] Apr 28 19:19:29.484265 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:19:29.484235 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ef5b7c5_142f_4343_9493_0b064c362185.slice/crio-dcf3145ba0ae32d3d54a8ad486c0d6a7545e6573db6b7ee556892715be4b21e2 WatchSource:0}: Error finding container dcf3145ba0ae32d3d54a8ad486c0d6a7545e6573db6b7ee556892715be4b21e2: Status 404 returned error can't find the container with id dcf3145ba0ae32d3d54a8ad486c0d6a7545e6573db6b7ee556892715be4b21e2 Apr 28 19:19:30.394434 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:30.394393 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6b857bd4-zkbhf" event={"ID":"5ef5b7c5-142f-4343-9493-0b064c362185","Type":"ContainerStarted","Data":"dcf3145ba0ae32d3d54a8ad486c0d6a7545e6573db6b7ee556892715be4b21e2"} Apr 28 19:19:32.093578 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.093543 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"] Apr 28 19:19:32.095715 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.095685 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2" Apr 28 19:19:32.096285 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.096253 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9zbws"] Apr 28 19:19:32.098121 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.098104 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.098973 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.098952 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 28 19:19:32.099176 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.099153 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 28 19:19:32.099290 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.099182 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 28 19:19:32.099290 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.099197 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-rbrnw\""
Apr 28 19:19:32.100215 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.100196 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 28 19:19:32.100323 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.100216 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 28 19:19:32.100323 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.100284 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 28 19:19:32.100800 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.100785 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 28 19:19:32.101169 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.101153 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4l8px\""
Apr 28 19:19:32.101242 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.101194 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 28 19:19:32.101676 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.101661 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 28 19:19:32.117781 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.117757 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"]
Apr 28 19:19:32.203903 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.203875 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-sys\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.204053 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.203912 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.204053 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.203946 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.204053 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.204039 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-root\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.204153 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.204077 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.204153 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.204125 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-textfile\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.204219 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.204153 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.204219 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.204188 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-wtmp\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.204276 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.204255 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.204311 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.204289 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9fpm\" (UniqueName: \"kubernetes.io/projected/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-kube-api-access-f9fpm\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.204343 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.204327 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-metrics-client-ca\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.204376 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.204343 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-accelerators-collector-config\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.204376 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.204361 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dlp9\" (UniqueName: \"kubernetes.io/projected/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-kube-api-access-6dlp9\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.204445 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.204385 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-tls\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.204445 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.204416 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.305168 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305133 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-textfile\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.305168 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305179 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.305388 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305217 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-wtmp\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.305388 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305354 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-wtmp\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.305488 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305398 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.305488 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305453 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-textfile\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.305488 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305455 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9fpm\" (UniqueName: \"kubernetes.io/projected/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-kube-api-access-f9fpm\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.305596 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305518 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-metrics-client-ca\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.305596 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305538 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-accelerators-collector-config\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.305596 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305555 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dlp9\" (UniqueName: \"kubernetes.io/projected/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-kube-api-access-6dlp9\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.305596 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305573 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-tls\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.305840 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305598 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.305840 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305672 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-sys\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.305840 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305706 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.305840 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305743 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.305840 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305775 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-root\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.305840 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.305799 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.306125 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:19:32.305965 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 28 19:19:32.306125 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:19:32.306018 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-tls podName:4a04ffd3-70ad-4ce7-ab98-6e69013d5119 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:32.805999426 +0000 UTC m=+199.557548938 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-tls") pod "node-exporter-9zbws" (UID: "4a04ffd3-70ad-4ce7-ab98-6e69013d5119") : secret "node-exporter-tls" not found
Apr 28 19:19:32.306237 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.306137 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-root\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.306237 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.306190 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-accelerators-collector-config\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.306237 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.306195 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-sys\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.306347 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.306250 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.306477 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.306455 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.306722 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.306699 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.306850 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.306827 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-metrics-client-ca\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.307977 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.307954 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.308073 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.308043 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.308507 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.308489 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.314728 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.314710 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dlp9\" (UniqueName: \"kubernetes.io/projected/58c6830d-7e4f-4ef7-98a1-4f0f5d45500d-kube-api-access-6dlp9\") pod \"kube-state-metrics-69db897b98-x7nz2\" (UID: \"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.314818 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.314802 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9fpm\" (UniqueName: \"kubernetes.io/projected/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-kube-api-access-f9fpm\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.400910 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.400829 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6b857bd4-zkbhf" event={"ID":"5ef5b7c5-142f-4343-9493-0b064c362185","Type":"ContainerStarted","Data":"960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3"}
Apr 28 19:19:32.407306 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.407282 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"
Apr 28 19:19:32.435871 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.435824 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c6b857bd4-zkbhf" podStartSLOduration=0.803741402 podStartE2EDuration="3.435810449s" podCreationTimestamp="2026-04-28 19:19:29 +0000 UTC" firstStartedPulling="2026-04-28 19:19:29.486087305 +0000 UTC m=+196.237636799" lastFinishedPulling="2026-04-28 19:19:32.118156353 +0000 UTC m=+198.869705846" observedRunningTime="2026-04-28 19:19:32.435308202 +0000 UTC m=+199.186857715" watchObservedRunningTime="2026-04-28 19:19:32.435810449 +0000 UTC m=+199.187359964"
Apr 28 19:19:32.537836 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.537796 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-x7nz2"]
Apr 28 19:19:32.541229 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:19:32.541200 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c6830d_7e4f_4ef7_98a1_4f0f5d45500d.slice/crio-511ceacf5ebf355130415c69048e01d30db1fb64ba11e464ebfa4286b3753ed0 WatchSource:0}: Error finding container 511ceacf5ebf355130415c69048e01d30db1fb64ba11e464ebfa4286b3753ed0: Status 404 returned error can't find the container with id 511ceacf5ebf355130415c69048e01d30db1fb64ba11e464ebfa4286b3753ed0
Apr 28 19:19:32.809635 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.809573 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-tls\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:32.811776 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:32.811754 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a04ffd3-70ad-4ce7-ab98-6e69013d5119-node-exporter-tls\") pod \"node-exporter-9zbws\" (UID: \"4a04ffd3-70ad-4ce7-ab98-6e69013d5119\") " pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:33.013625 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.013335 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9zbws"
Apr 28 19:19:33.022675 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:19:33.022637 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a04ffd3_70ad_4ce7_ab98_6e69013d5119.slice/crio-8b9e23db7fa55cb2cb94176a6913c0f3be828e7b867ebcf09fb269ed2cabe9cc WatchSource:0}: Error finding container 8b9e23db7fa55cb2cb94176a6913c0f3be828e7b867ebcf09fb269ed2cabe9cc: Status 404 returned error can't find the container with id 8b9e23db7fa55cb2cb94176a6913c0f3be828e7b867ebcf09fb269ed2cabe9cc
Apr 28 19:19:33.285888 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.285851 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 28 19:19:33.289623 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.289591 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.292741 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.292719 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 28 19:19:33.292846 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.292741 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 28 19:19:33.293946 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.293913 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 28 19:19:33.294142 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.294118 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 28 19:19:33.294237 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.294159 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 28 19:19:33.294237 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.294211 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 28 19:19:33.294353 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.294303 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-q592p\""
Apr 28 19:19:33.294630 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.294599 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 28 19:19:33.294713 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.294696 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 28 19:19:33.294978 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.294958 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 28 19:19:33.309418 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.309373 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 28 19:19:33.314096 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.314064 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.314215 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.314129 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d2bf842-725c-468d-abc9-ea477e7ed9e8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.315678 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.315652 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-web-config\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.315907 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.315874 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-config-volume\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.316075 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.316059 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.316203 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.316188 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.316326 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.316297 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.316460 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.316447 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.316646 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.316630 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0d2bf842-725c-468d-abc9-ea477e7ed9e8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.316784 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.316754 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxmfr\" (UniqueName: \"kubernetes.io/projected/0d2bf842-725c-468d-abc9-ea477e7ed9e8-kube-api-access-fxmfr\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.316913 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.316900 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d2bf842-725c-468d-abc9-ea477e7ed9e8-config-out\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.317018 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.317005 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d2bf842-725c-468d-abc9-ea477e7ed9e8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.317163 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.317131 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d2bf842-725c-468d-abc9-ea477e7ed9e8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.405620 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.405566 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9zbws" event={"ID":"4a04ffd3-70ad-4ce7-ab98-6e69013d5119","Type":"ContainerStarted","Data":"8b9e23db7fa55cb2cb94176a6913c0f3be828e7b867ebcf09fb269ed2cabe9cc"}
Apr 28 19:19:33.406852 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.406810 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2" event={"ID":"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d","Type":"ContainerStarted","Data":"511ceacf5ebf355130415c69048e01d30db1fb64ba11e464ebfa4286b3753ed0"}
Apr 28 19:19:33.418717 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.418688 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.418844 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.418737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d2bf842-725c-468d-abc9-ea477e7ed9e8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.418844 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.418769 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-web-config\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.418844 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.418801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-config-volume\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.418844 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.418827 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.419055 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.418864 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.419055 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.418892 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.419055 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.418937 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.419055 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.418994 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0d2bf842-725c-468d-abc9-ea477e7ed9e8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.419055 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.419018 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxmfr\" (UniqueName: \"kubernetes.io/projected/0d2bf842-725c-468d-abc9-ea477e7ed9e8-kube-api-access-fxmfr\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.419327 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.419055 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d2bf842-725c-468d-abc9-ea477e7ed9e8-config-out\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.419327 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.419083 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d2bf842-725c-468d-abc9-ea477e7ed9e8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.419327 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.419126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d2bf842-725c-468d-abc9-ea477e7ed9e8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.419327 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:19:33.419297 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0d2bf842-725c-468d-abc9-ea477e7ed9e8-alertmanager-trusted-ca-bundle podName:0d2bf842-725c-468d-abc9-ea477e7ed9e8 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:33.919278227 +0000 UTC m=+200.670827727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/0d2bf842-725c-468d-abc9-ea477e7ed9e8-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "0d2bf842-725c-468d-abc9-ea477e7ed9e8") : configmap references non-existent config key: ca-bundle.crt
Apr 28 19:19:33.421257 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.420894 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d2bf842-725c-468d-abc9-ea477e7ed9e8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.421257 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.421190 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0d2bf842-725c-468d-abc9-ea477e7ed9e8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.423165 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.423125 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.423751 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.423730 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.424215 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.424175 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.424529 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.424494 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-config-volume\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.425131 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.425103 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d2bf842-725c-468d-abc9-ea477e7ed9e8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.425221 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.425145 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.425650 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.425615 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-web-config\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.426488 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.426450 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d2bf842-725c-468d-abc9-ea477e7ed9e8-config-out\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.426870 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.426847 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0d2bf842-725c-468d-abc9-ea477e7ed9e8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.430503 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.430462 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxmfr\" (UniqueName: \"kubernetes.io/projected/0d2bf842-725c-468d-abc9-ea477e7ed9e8-kube-api-access-fxmfr\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.924471 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.924448 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d2bf842-725c-468d-abc9-ea477e7ed9e8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:33.925437 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:33.925393 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d2bf842-725c-468d-abc9-ea477e7ed9e8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0d2bf842-725c-468d-abc9-ea477e7ed9e8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:34.202547 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:34.202502 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:19:34.355469 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:34.355438 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 28 19:19:34.357016 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:19:34.356993 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d2bf842_725c_468d_abc9_ea477e7ed9e8.slice/crio-f3ab6531c114e86b96d98ca5c0190ba3e2fb90fd684af6509a270b88776b2c1b WatchSource:0}: Error finding container f3ab6531c114e86b96d98ca5c0190ba3e2fb90fd684af6509a270b88776b2c1b: Status 404 returned error can't find the container with id f3ab6531c114e86b96d98ca5c0190ba3e2fb90fd684af6509a270b88776b2c1b
Apr 28 19:19:34.411533 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:34.411443 2570 generic.go:358] "Generic (PLEG): container finished" podID="4a04ffd3-70ad-4ce7-ab98-6e69013d5119" containerID="0854a97ee4e15834631cc57eeb227b7b6e6883dd5eadea503f8ccf00a4262c89" exitCode=0
Apr 28 19:19:34.411533 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:34.411519 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9zbws" event={"ID":"4a04ffd3-70ad-4ce7-ab98-6e69013d5119","Type":"ContainerDied","Data":"0854a97ee4e15834631cc57eeb227b7b6e6883dd5eadea503f8ccf00a4262c89"}
Apr 28 19:19:34.413433 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:34.413396 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2" event={"ID":"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d","Type":"ContainerStarted","Data":"bc995c10eef73fb72bf1c2ceac9db42bac3bee25937c352ee8820706fcf8d194"}
Apr 28 19:19:34.413433 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:34.413432 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2" event={"ID":"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d","Type":"ContainerStarted","Data":"69faed28380cdfa2d5909d15806e008d8a90aa17a8c04d3cfb1a457737c5a2d1"}
Apr 28 19:19:34.413653 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:34.413445 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2" event={"ID":"58c6830d-7e4f-4ef7-98a1-4f0f5d45500d","Type":"ContainerStarted","Data":"1192b71186fb5cf44d37f7de02715f0992ce0da925ebb009efe82cf9e36e382e"}
Apr 28 19:19:34.414533 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:34.414499 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d2bf842-725c-468d-abc9-ea477e7ed9e8","Type":"ContainerStarted","Data":"f3ab6531c114e86b96d98ca5c0190ba3e2fb90fd684af6509a270b88776b2c1b"}
Apr 28 19:19:34.478518 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:34.478478 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-x7nz2" podStartSLOduration=1.211017725 podStartE2EDuration="2.478464469s" podCreationTimestamp="2026-04-28 19:19:32 +0000 UTC" firstStartedPulling="2026-04-28 19:19:32.543469007 +0000 UTC m=+199.295018500" lastFinishedPulling="2026-04-28 19:19:33.810915741 +0000 UTC m=+200.562465244" observedRunningTime="2026-04-28 19:19:34.478268088 +0000 UTC m=+201.229817604" watchObservedRunningTime="2026-04-28 19:19:34.478464469 +0000 UTC m=+201.230013983"
Apr 28 19:19:35.418490 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:35.418462 2570 generic.go:358] "Generic (PLEG): container finished" podID="0d2bf842-725c-468d-abc9-ea477e7ed9e8" containerID="e2ac9cbeec687491b85b7feefe7a0252296a38e341ad404727235d3d3c10f779" exitCode=0
Apr 28 19:19:35.418866 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:35.418546 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d2bf842-725c-468d-abc9-ea477e7ed9e8","Type":"ContainerDied","Data":"e2ac9cbeec687491b85b7feefe7a0252296a38e341ad404727235d3d3c10f779"}
Apr 28 19:19:35.420502 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:35.420478 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9zbws" event={"ID":"4a04ffd3-70ad-4ce7-ab98-6e69013d5119","Type":"ContainerStarted","Data":"5e214106a8ea6be4a9611d63d41b25b9dbe7a7342398ca8a27adf6131b7c8f04"}
Apr 28 19:19:35.420577 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:35.420511 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9zbws" event={"ID":"4a04ffd3-70ad-4ce7-ab98-6e69013d5119","Type":"ContainerStarted","Data":"2dbeea0bdf87fbf2e745d65f02c6e49f53730b6f0e179c46b7da51affc739224"}
Apr 28 19:19:35.482283 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:35.482233 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9zbws" podStartSLOduration=2.6950326589999998 podStartE2EDuration="3.482219958s" podCreationTimestamp="2026-04-28 19:19:32 +0000 UTC" firstStartedPulling="2026-04-28 19:19:33.024650774 +0000 UTC m=+199.776200271" lastFinishedPulling="2026-04-28 19:19:33.811838074 +0000 UTC m=+200.563387570" observedRunningTime="2026-04-28 19:19:35.481077917 +0000 UTC m=+202.232627473" watchObservedRunningTime="2026-04-28 19:19:35.482219958 +0000 UTC m=+202.233769473"
Apr 28 19:19:36.529415 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.529379 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-64d997c947-qw75t"]
Apr 28 19:19:36.531715 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.531690 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.535593 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.535546 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 28 19:19:36.535593 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.535589 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-h44gx\""
Apr 28 19:19:36.535809 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.535598 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 28 19:19:36.535809 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.535555 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 28 19:19:36.535809 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.535552 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-6ah4fbgqkgst6\""
Apr 28 19:19:36.535809 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.535623 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 28 19:19:36.544895 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.544874 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-64d997c947-qw75t"]
Apr 28 19:19:36.649836 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.649801 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0b855502-4514-43b3-83c4-664b35b6bddc-metrics-server-audit-profiles\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.649997 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.649849 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0b855502-4514-43b3-83c4-664b35b6bddc-secret-metrics-server-client-certs\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.649997 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.649881 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b855502-4514-43b3-83c4-664b35b6bddc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.650109 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.649986 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0b855502-4514-43b3-83c4-664b35b6bddc-secret-metrics-server-tls\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.650109 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.650058 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b855502-4514-43b3-83c4-664b35b6bddc-client-ca-bundle\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.650109 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.650098 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0b855502-4514-43b3-83c4-664b35b6bddc-audit-log\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.650221 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.650125 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2s7h\" (UniqueName: \"kubernetes.io/projected/0b855502-4514-43b3-83c4-664b35b6bddc-kube-api-access-c2s7h\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.750996 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.750915 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b855502-4514-43b3-83c4-664b35b6bddc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.750996 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.750979 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0b855502-4514-43b3-83c4-664b35b6bddc-secret-metrics-server-tls\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.751228 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.751025 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b855502-4514-43b3-83c4-664b35b6bddc-client-ca-bundle\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.751228 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.751062 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0b855502-4514-43b3-83c4-664b35b6bddc-audit-log\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.751228 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.751078 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2s7h\" (UniqueName: \"kubernetes.io/projected/0b855502-4514-43b3-83c4-664b35b6bddc-kube-api-access-c2s7h\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.751228 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.751186 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0b855502-4514-43b3-83c4-664b35b6bddc-metrics-server-audit-profiles\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.751228 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.751217 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0b855502-4514-43b3-83c4-664b35b6bddc-secret-metrics-server-client-certs\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.751695 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.751639 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b855502-4514-43b3-83c4-664b35b6bddc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.751867 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.751840 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0b855502-4514-43b3-83c4-664b35b6bddc-audit-log\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.752594 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.752560 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0b855502-4514-43b3-83c4-664b35b6bddc-metrics-server-audit-profiles\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.754122 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.754098 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b855502-4514-43b3-83c4-664b35b6bddc-client-ca-bundle\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.754252 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.754229 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0b855502-4514-43b3-83c4-664b35b6bddc-secret-metrics-server-client-certs\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.754827 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.754807 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0b855502-4514-43b3-83c4-664b35b6bddc-secret-metrics-server-tls\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.769894 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.769874 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2s7h\" (UniqueName: \"kubernetes.io/projected/0b855502-4514-43b3-83c4-664b35b6bddc-kube-api-access-c2s7h\") pod \"metrics-server-64d997c947-qw75t\" (UID: \"0b855502-4514-43b3-83c4-664b35b6bddc\") " pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:36.842322 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:36.842242 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-64d997c947-qw75t"
Apr 28 19:19:37.055069 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.055012 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6"]
Apr 28 19:19:37.057966 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.057940 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6"
Apr 28 19:19:37.061141 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.061114 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-p9bwg\""
Apr 28 19:19:37.061241 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.061148 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 28 19:19:37.077301 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.077272 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6"]
Apr 28 19:19:37.117345 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.117316 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-64d997c947-qw75t"]
Apr 28 19:19:37.118957 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:19:37.118869 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b855502_4514_43b3_83c4_664b35b6bddc.slice/crio-253457ca04f012fe2a765acc057a2489fed276484648fec7fc179b69246664c7 WatchSource:0}: Error finding container 253457ca04f012fe2a765acc057a2489fed276484648fec7fc179b69246664c7: Status 404 returned error can't find the container with id 253457ca04f012fe2a765acc057a2489fed276484648fec7fc179b69246664c7
Apr 28 19:19:37.154810 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.154758 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc240119-b9e8-4033-8e7d-35cf4ddb7f18-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-p9df6\" (UID: \"cc240119-b9e8-4033-8e7d-35cf4ddb7f18\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6"
Apr 28 19:19:37.255751 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.255728 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc240119-b9e8-4033-8e7d-35cf4ddb7f18-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-p9df6\" (UID: \"cc240119-b9e8-4033-8e7d-35cf4ddb7f18\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6"
Apr 28 19:19:37.258567 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.258541 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc240119-b9e8-4033-8e7d-35cf4ddb7f18-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-p9df6\" (UID: \"cc240119-b9e8-4033-8e7d-35cf4ddb7f18\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6"
Apr 28 19:19:37.376016 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.375933 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6"
Apr 28 19:19:37.437566 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.437529 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d2bf842-725c-468d-abc9-ea477e7ed9e8","Type":"ContainerStarted","Data":"502abad3711022c0d07b237e81575900acad7f25761274bd9ec78b405e28db90"}
Apr 28 19:19:37.437729 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.437586 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d2bf842-725c-468d-abc9-ea477e7ed9e8","Type":"ContainerStarted","Data":"4f42fbef8ea771f92f2cf0e1847c00b0cb5657eeb5a2f7405b81d45ec4b4f431"}
Apr 28 19:19:37.437729 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.437600 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d2bf842-725c-468d-abc9-ea477e7ed9e8","Type":"ContainerStarted","Data":"a0846760c6a174f574ef42b5669ee8975e46fe2e01153543d08350822a67eabd"}
Apr 28 19:19:37.437729 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.437624 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d2bf842-725c-468d-abc9-ea477e7ed9e8","Type":"ContainerStarted","Data":"954e808e8c517744bbd7a75417c91d6806aa652dbf31c23a5d645e5d3cd35ef2"}
Apr 28 19:19:37.437729 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.437641 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d2bf842-725c-468d-abc9-ea477e7ed9e8","Type":"ContainerStarted","Data":"68a9316956633440c2fb7adeb08c43fd1ca3f4fe81defe764d3f8e93d9b2e441"}
Apr 28 19:19:37.438772 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.438744 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64d997c947-qw75t" event={"ID":"0b855502-4514-43b3-83c4-664b35b6bddc","Type":"ContainerStarted","Data":"253457ca04f012fe2a765acc057a2489fed276484648fec7fc179b69246664c7"}
Apr 28 19:19:37.501069 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.501043 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6"]
Apr 28 19:19:37.503382 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:19:37.503353 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc240119_b9e8_4033_8e7d_35cf4ddb7f18.slice/crio-68746927fdc416def77d850c8b1a829f5385c98c598765096d9bb036f88e468b WatchSource:0}: Error finding container 68746927fdc416def77d850c8b1a829f5385c98c598765096d9bb036f88e468b: Status 404 returned error can't find the container with id 68746927fdc416def77d850c8b1a829f5385c98c598765096d9bb036f88e468b
Apr 28 19:19:37.953170 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:37.953102 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c6b857bd4-zkbhf"]
Apr 28 19:19:38.443004 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:38.442959 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6" event={"ID":"cc240119-b9e8-4033-8e7d-35cf4ddb7f18","Type":"ContainerStarted","Data":"68746927fdc416def77d850c8b1a829f5385c98c598765096d9bb036f88e468b"}
Apr 28 19:19:39.364466 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:39.364430 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c6b857bd4-zkbhf"
Apr 28 19:19:39.447335 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:39.447292 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6" event={"ID":"cc240119-b9e8-4033-8e7d-35cf4ddb7f18","Type":"ContainerStarted","Data":"3044d3d8a7b78b5d6260bd2c9b8d3f81affb70da7b3ea52c6bd7a772e32ee973"}
Apr 28 19:19:39.447511 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:39.447416 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6"
Apr 28 19:19:39.450585 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:39.450558 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d2bf842-725c-468d-abc9-ea477e7ed9e8","Type":"ContainerStarted","Data":"095709ed08d745bcfc28e3d1549c18e10c6894edbaadd9e5c24f0d9e368cae64"}
Apr 28 19:19:39.451777 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:39.451757 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64d997c947-qw75t" event={"ID":"0b855502-4514-43b3-83c4-664b35b6bddc","Type":"ContainerStarted","Data":"a39f7f6d61f71f1e0e91870971caa2f28694012f1ccb8cfa1a81ede6d017c579"}
Apr 28 19:19:39.452753 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:39.452733 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6"
Apr 28 19:19:39.471921 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:39.471873 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p9df6" podStartSLOduration=1.094058749 podStartE2EDuration="2.47185991s" podCreationTimestamp="2026-04-28 19:19:37 +0000 UTC" firstStartedPulling="2026-04-28 19:19:37.505185468 +0000 UTC m=+204.256734960" lastFinishedPulling="2026-04-28 19:19:38.882986628 +0000 UTC m=+205.634536121" observedRunningTime="2026-04-28 19:19:39.470528989 +0000 UTC m=+206.222078516" watchObservedRunningTime="2026-04-28 19:19:39.47185991 +0000 UTC m=+206.223409403"
Apr 28 19:19:39.539955 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:39.539898 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-64d997c947-qw75t" podStartSLOduration=1.779781545 podStartE2EDuration="3.539884428s" podCreationTimestamp="2026-04-28 19:19:36 +0000 UTC" firstStartedPulling="2026-04-28 19:19:37.121305108 +0000 UTC m=+203.872854614" lastFinishedPulling="2026-04-28 19:19:38.881408004 +0000 UTC m=+205.632957497" observedRunningTime="2026-04-28 19:19:39.4980872 +0000 UTC m=+206.249636726" watchObservedRunningTime="2026-04-28 19:19:39.539884428 +0000 UTC m=+206.291433944"
Apr 28 19:19:39.540686 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:39.540651 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.01967988 podStartE2EDuration="6.540641277s" podCreationTimestamp="2026-04-28 19:19:33 +0000 UTC" firstStartedPulling="2026-04-28 19:19:34.35914087 +0000 UTC m=+201.110690363"
lastFinishedPulling="2026-04-28 19:19:38.880102252 +0000 UTC m=+205.631651760" observedRunningTime="2026-04-28 19:19:39.538496798 +0000 UTC m=+206.290046313" watchObservedRunningTime="2026-04-28 19:19:39.540641277 +0000 UTC m=+206.292190791" Apr 28 19:19:40.126166 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.126133 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-745bf7dd44-hft5k"] Apr 28 19:19:40.126397 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:19:40.126378 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-745bf7dd44-hft5k" podUID="9622feba-01a7-434b-84bb-f677851aaa37" Apr 28 19:19:40.454851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.454763 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-745bf7dd44-hft5k" Apr 28 19:19:40.459245 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.459227 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-745bf7dd44-hft5k" Apr 28 19:19:40.588889 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.588863 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9622feba-01a7-434b-84bb-f677851aaa37-installation-pull-secrets\") pod \"9622feba-01a7-434b-84bb-f677851aaa37\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " Apr 28 19:19:40.589041 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.588905 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-bound-sa-token\") pod \"9622feba-01a7-434b-84bb-f677851aaa37\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " Apr 28 19:19:40.589041 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.588938 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smz6v\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-kube-api-access-smz6v\") pod \"9622feba-01a7-434b-84bb-f677851aaa37\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " Apr 28 19:19:40.589041 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.588969 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9622feba-01a7-434b-84bb-f677851aaa37-ca-trust-extracted\") pod \"9622feba-01a7-434b-84bb-f677851aaa37\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " Apr 28 19:19:40.589041 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.589030 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9622feba-01a7-434b-84bb-f677851aaa37-registry-certificates\") pod \"9622feba-01a7-434b-84bb-f677851aaa37\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " Apr 28 19:19:40.589246 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.589080 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9622feba-01a7-434b-84bb-f677851aaa37-image-registry-private-configuration\") pod \"9622feba-01a7-434b-84bb-f677851aaa37\" (UID: 
\"9622feba-01a7-434b-84bb-f677851aaa37\") " Apr 28 19:19:40.589246 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.589104 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9622feba-01a7-434b-84bb-f677851aaa37-trusted-ca\") pod \"9622feba-01a7-434b-84bb-f677851aaa37\" (UID: \"9622feba-01a7-434b-84bb-f677851aaa37\") " Apr 28 19:19:40.589341 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.589279 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9622feba-01a7-434b-84bb-f677851aaa37-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9622feba-01a7-434b-84bb-f677851aaa37" (UID: "9622feba-01a7-434b-84bb-f677851aaa37"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:19:40.591700 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.590274 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9622feba-01a7-434b-84bb-f677851aaa37-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9622feba-01a7-434b-84bb-f677851aaa37" (UID: "9622feba-01a7-434b-84bb-f677851aaa37"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:19:40.593571 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.593543 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9622feba-01a7-434b-84bb-f677851aaa37-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9622feba-01a7-434b-84bb-f677851aaa37" (UID: "9622feba-01a7-434b-84bb-f677851aaa37"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:19:40.595815 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.594129 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9622feba-01a7-434b-84bb-f677851aaa37-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9622feba-01a7-434b-84bb-f677851aaa37" (UID: "9622feba-01a7-434b-84bb-f677851aaa37"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:19:40.595815 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.594153 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9622feba-01a7-434b-84bb-f677851aaa37-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "9622feba-01a7-434b-84bb-f677851aaa37" (UID: "9622feba-01a7-434b-84bb-f677851aaa37"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:19:40.595815 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.594698 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9622feba-01a7-434b-84bb-f677851aaa37-ca-trust-extracted\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:19:40.595815 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.594720 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9622feba-01a7-434b-84bb-f677851aaa37-registry-certificates\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:19:40.596094 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.596062 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9622feba-01a7-434b-84bb-f677851aaa37" (UID: "9622feba-01a7-434b-84bb-f677851aaa37"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:19:40.596197 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.596169 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-kube-api-access-smz6v" (OuterVolumeSpecName: "kube-api-access-smz6v") pod "9622feba-01a7-434b-84bb-f677851aaa37" (UID: "9622feba-01a7-434b-84bb-f677851aaa37"). InnerVolumeSpecName "kube-api-access-smz6v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:19:40.695589 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.695558 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9622feba-01a7-434b-84bb-f677851aaa37-image-registry-private-configuration\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:19:40.695589 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.695586 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9622feba-01a7-434b-84bb-f677851aaa37-trusted-ca\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:19:40.695589 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.695596 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9622feba-01a7-434b-84bb-f677851aaa37-installation-pull-secrets\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:19:40.695806 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.695622 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-bound-sa-token\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:19:40.695806 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:40.695632 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-smz6v\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-kube-api-access-smz6v\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:19:41.457511 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:41.457479 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-745bf7dd44-hft5k" Apr 28 19:19:41.517339 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:41.517307 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-745bf7dd44-hft5k"] Apr 28 19:19:41.532230 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:41.532204 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-745bf7dd44-hft5k"] Apr 28 19:19:41.602913 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:41.602876 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9622feba-01a7-434b-84bb-f677851aaa37-registry-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:19:41.887533 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:41.887499 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9622feba-01a7-434b-84bb-f677851aaa37" path="/var/lib/kubelet/pods/9622feba-01a7-434b-84bb-f677851aaa37/volumes" Apr 28 19:19:56.842528 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:56.842488 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-64d997c947-qw75t" Apr 28 19:19:56.842528 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:19:56.842531 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-64d997c947-qw75t" Apr 28 19:20:02.982873 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:02.982832 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c6b857bd4-zkbhf" podUID="5ef5b7c5-142f-4343-9493-0b064c362185" containerName="console" containerID="cri-o://960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3" gracePeriod=15 Apr 28 19:20:03.219794 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.219771 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c6b857bd4-zkbhf_5ef5b7c5-142f-4343-9493-0b064c362185/console/0.log" Apr 28 19:20:03.219908 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.219830 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:20:03.391646 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.391599 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef5b7c5-142f-4343-9493-0b064c362185-console-serving-cert\") pod \"5ef5b7c5-142f-4343-9493-0b064c362185\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " Apr 28 19:20:03.391837 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.391666 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-trusted-ca-bundle\") pod \"5ef5b7c5-142f-4343-9493-0b064c362185\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " Apr 28 19:20:03.391837 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.391740 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-console-config\") pod \"5ef5b7c5-142f-4343-9493-0b064c362185\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " Apr 28 19:20:03.391837 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.391790 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-oauth-serving-cert\") pod \"5ef5b7c5-142f-4343-9493-0b064c362185\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " Apr 28 19:20:03.391998 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.391836 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-service-ca\") pod \"5ef5b7c5-142f-4343-9493-0b064c362185\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " Apr 28 19:20:03.391998 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.391860 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ef5b7c5-142f-4343-9493-0b064c362185-console-oauth-config\") pod \"5ef5b7c5-142f-4343-9493-0b064c362185\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " Apr 28 19:20:03.391998 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.391902 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlbdp\" (UniqueName: \"kubernetes.io/projected/5ef5b7c5-142f-4343-9493-0b064c362185-kube-api-access-mlbdp\") pod \"5ef5b7c5-142f-4343-9493-0b064c362185\" (UID: \"5ef5b7c5-142f-4343-9493-0b064c362185\") " Apr 28 19:20:03.392229 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.392128 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5ef5b7c5-142f-4343-9493-0b064c362185" (UID: "5ef5b7c5-142f-4343-9493-0b064c362185"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:03.392229 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.392188 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-console-config" (OuterVolumeSpecName: "console-config") pod "5ef5b7c5-142f-4343-9493-0b064c362185" (UID: "5ef5b7c5-142f-4343-9493-0b064c362185"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:03.392302 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.392234 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5ef5b7c5-142f-4343-9493-0b064c362185" (UID: "5ef5b7c5-142f-4343-9493-0b064c362185"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:03.392302 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.392241 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-service-ca" (OuterVolumeSpecName: "service-ca") pod "5ef5b7c5-142f-4343-9493-0b064c362185" (UID: "5ef5b7c5-142f-4343-9493-0b064c362185"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:03.394182 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.394155 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef5b7c5-142f-4343-9493-0b064c362185-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5ef5b7c5-142f-4343-9493-0b064c362185" (UID: "5ef5b7c5-142f-4343-9493-0b064c362185"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:03.394182 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.394175 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef5b7c5-142f-4343-9493-0b064c362185-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5ef5b7c5-142f-4343-9493-0b064c362185" (UID: "5ef5b7c5-142f-4343-9493-0b064c362185"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:03.394322 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.394196 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef5b7c5-142f-4343-9493-0b064c362185-kube-api-access-mlbdp" (OuterVolumeSpecName: "kube-api-access-mlbdp") pod "5ef5b7c5-142f-4343-9493-0b064c362185" (UID: "5ef5b7c5-142f-4343-9493-0b064c362185"). InnerVolumeSpecName "kube-api-access-mlbdp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:20:03.492999 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.492959 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef5b7c5-142f-4343-9493-0b064c362185-console-serving-cert\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:20:03.492999 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.492991 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-trusted-ca-bundle\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:20:03.492999 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.493002 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-console-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:20:03.492999 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.493012 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-oauth-serving-cert\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:20:03.493256 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.493022 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ef5b7c5-142f-4343-9493-0b064c362185-service-ca\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:20:03.493256 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.493031 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ef5b7c5-142f-4343-9493-0b064c362185-console-oauth-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:20:03.493256 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.493039 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mlbdp\" (UniqueName: \"kubernetes.io/projected/5ef5b7c5-142f-4343-9493-0b064c362185-kube-api-access-mlbdp\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:20:03.516310 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.516285 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c6b857bd4-zkbhf_5ef5b7c5-142f-4343-9493-0b064c362185/console/0.log" Apr 28 19:20:03.516449 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.516323 2570 generic.go:358] "Generic (PLEG): container finished" podID="5ef5b7c5-142f-4343-9493-0b064c362185" containerID="960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3" exitCode=2 Apr 28 19:20:03.516449 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.516356 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6b857bd4-zkbhf" event={"ID":"5ef5b7c5-142f-4343-9493-0b064c362185","Type":"ContainerDied","Data":"960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3"} Apr 28 19:20:03.516449 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.516390 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6b857bd4-zkbhf" Apr 28 19:20:03.516449 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.516399 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6b857bd4-zkbhf" event={"ID":"5ef5b7c5-142f-4343-9493-0b064c362185","Type":"ContainerDied","Data":"dcf3145ba0ae32d3d54a8ad486c0d6a7545e6573db6b7ee556892715be4b21e2"} Apr 28 19:20:03.516449 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.516417 2570 scope.go:117] "RemoveContainer" containerID="960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3" Apr 28 19:20:03.524814 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.524799 2570 scope.go:117] "RemoveContainer" containerID="960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3" Apr 28 19:20:03.525062 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:20:03.525040 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3\": container with ID starting with 960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3 not found: ID does not exist" containerID="960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3" Apr 28 19:20:03.525109 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.525070 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3"} err="failed to get container status \"960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3\": rpc error: code = NotFound desc = could not find container \"960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3\": container with ID starting with 960c7ed3c1f21122834a420487e221ddb029e31aec938ec91931515514a83da3 not found: ID does not exist" Apr 28 19:20:03.537774 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.537755 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c6b857bd4-zkbhf"] Apr 28 19:20:03.542589 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.542564 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c6b857bd4-zkbhf"] Apr 28 19:20:03.886991 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:03.886949 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ef5b7c5-142f-4343-9493-0b064c362185" path="/var/lib/kubelet/pods/5ef5b7c5-142f-4343-9493-0b064c362185/volumes" Apr 28 19:20:10.210779 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:10.210751 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pmpq5_68ecdbce-5bdd-408b-bef6-91e797899886/dns-node-resolver/0.log" Apr 28 19:20:16.848375 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:16.848341 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-64d997c947-qw75t" Apr 28 19:20:16.852348 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:16.852322 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-64d997c947-qw75t" Apr 28 19:20:23.576011 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:23.575955 2570 generic.go:358] "Generic (PLEG): container finished" podID="1c847697-21da-48aa-9af0-aa0cf114d47c" containerID="fc9e12b1db509a19c62237b47590685b7f9bb70a9653768a9bc88589fe8749ba" exitCode=0 Apr 28 19:20:23.576374 ip-10-0-138-34 kubenswrapper[2570]: I0428 
19:20:23.576025 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" event={"ID":"1c847697-21da-48aa-9af0-aa0cf114d47c","Type":"ContainerDied","Data":"fc9e12b1db509a19c62237b47590685b7f9bb70a9653768a9bc88589fe8749ba"} Apr 28 19:20:23.576374 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:23.576344 2570 scope.go:117] "RemoveContainer" containerID="fc9e12b1db509a19c62237b47590685b7f9bb70a9653768a9bc88589fe8749ba" Apr 28 19:20:24.580650 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:24.580595 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6bn75" event={"ID":"1c847697-21da-48aa-9af0-aa0cf114d47c","Type":"ContainerStarted","Data":"e18873a02cebacc26808897f25b15367b721fc41b77163b8ffba89c10545ab4e"} Apr 28 19:20:25.685980 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:25.685923 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:20:25.688337 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:25.688309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b961ce3-ed85-40f4-840c-df0e74d830dd-metrics-certs\") pod \"network-metrics-daemon-qgcjb\" (UID: \"0b961ce3-ed85-40f4-840c-df0e74d830dd\") " pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:20:25.787488 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:25.787454 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wz9tz\"" Apr 28 19:20:25.795564 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:25.795541 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgcjb" Apr 28 19:20:25.916059 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:25.916035 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qgcjb"] Apr 28 19:20:25.917935 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:20:25.917905 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b961ce3_ed85_40f4_840c_df0e74d830dd.slice/crio-7432e3987578c507bb7d9da801a4116fb1210560542e08baeb80fbfd1f7186e9 WatchSource:0}: Error finding container 7432e3987578c507bb7d9da801a4116fb1210560542e08baeb80fbfd1f7186e9: Status 404 returned error can't find the container with id 7432e3987578c507bb7d9da801a4116fb1210560542e08baeb80fbfd1f7186e9 Apr 28 19:20:26.587305 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:26.587273 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qgcjb" event={"ID":"0b961ce3-ed85-40f4-840c-df0e74d830dd","Type":"ContainerStarted","Data":"7432e3987578c507bb7d9da801a4116fb1210560542e08baeb80fbfd1f7186e9"} Apr 28 19:20:27.591662 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:27.591620 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qgcjb" event={"ID":"0b961ce3-ed85-40f4-840c-df0e74d830dd","Type":"ContainerStarted","Data":"e026e815870434cbba6138467c673c6e432e41079d76d56eab026dc9060bf620"} Apr 28 19:20:27.591662 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:27.591661 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qgcjb" event={"ID":"0b961ce3-ed85-40f4-840c-df0e74d830dd","Type":"ContainerStarted","Data":"11c8940d594ac97396f7bd94437073cbc5be53fc1172e199d3c949dd6e3a50a8"} Apr 28 19:20:27.623172 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:27.623121 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qgcjb" podStartSLOduration=253.685267268 podStartE2EDuration="4m14.623102701s" podCreationTimestamp="2026-04-28 19:16:13 +0000 UTC" firstStartedPulling="2026-04-28 19:20:25.919689025 +0000 UTC m=+252.671238519" lastFinishedPulling="2026-04-28 19:20:26.85752446 +0000 UTC m=+253.609073952" observedRunningTime="2026-04-28 19:20:27.623029431 +0000 UTC m=+254.374578946" watchObservedRunningTime="2026-04-28 19:20:27.623102701 +0000 UTC m=+254.374652222" Apr 28 19:20:53.280932 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:20:53.280885 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-98b4d" podUID="801789ef-975e-451d-9e18-0cb9acd739d6" Apr 28 19:20:53.280932 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:20:53.280893 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" podUID="7e4b5b64-ebf0-4ee2-a43b-35098459ff73" Apr 28 19:20:53.280932 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:20:53.280902 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-l8dqg" 
podUID="0d073d08-7217-4136-8485-03d574acfc52" Apr 28 19:20:53.662301 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:53.662222 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l8dqg" Apr 28 19:20:53.662459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:53.662223 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-98b4d" Apr 28 19:20:53.662459 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:53.662224 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" Apr 28 19:20:56.665851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.665791 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" Apr 28 19:20:56.665851 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.665855 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d" Apr 28 19:20:56.666481 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.665887 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg" Apr 28 19:20:56.668347 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.668320 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/801789ef-975e-451d-9e18-0cb9acd739d6-metrics-tls\") pod \"dns-default-98b4d\" (UID: \"801789ef-975e-451d-9e18-0cb9acd739d6\") " pod="openshift-dns/dns-default-98b4d" Apr 28 19:20:56.668490 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.668380 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d073d08-7217-4136-8485-03d574acfc52-cert\") pod \"ingress-canary-l8dqg\" (UID: \"0d073d08-7217-4136-8485-03d574acfc52\") " pod="openshift-ingress-canary/ingress-canary-l8dqg" Apr 28 19:20:56.668490 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.668401 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e4b5b64-ebf0-4ee2-a43b-35098459ff73-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-x865s\" (UID: \"7e4b5b64-ebf0-4ee2-a43b-35098459ff73\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" Apr 28 19:20:56.685792 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.685760 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-655b965ffc-gvwdf"] Apr 28 19:20:56.686122 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.686106 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ef5b7c5-142f-4343-9493-0b064c362185" 
containerName="console" Apr 28 19:20:56.686196 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.686128 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef5b7c5-142f-4343-9493-0b064c362185" containerName="console" Apr 28 19:20:56.686249 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.686195 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ef5b7c5-142f-4343-9493-0b064c362185" containerName="console" Apr 28 19:20:56.690748 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.690728 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.693453 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.693433 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 28 19:20:56.693551 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.693481 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 28 19:20:56.694392 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.694378 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-5qphr\"" Apr 28 19:20:56.694509 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.694493 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 28 19:20:56.696887 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.696867 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 28 19:20:56.696991 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.696958 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 28 19:20:56.707966 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.707938 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-655b965ffc-gvwdf"] Apr 28 19:20:56.710581 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.710564 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 28 19:20:56.766996 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.766964 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6771cfc5-48ff-4141-9251-f539854216fc-serving-certs-ca-bundle\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.767169 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.767014 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6771cfc5-48ff-4141-9251-f539854216fc-secret-telemeter-client\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.767169 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.767034 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/6771cfc5-48ff-4141-9251-f539854216fc-federate-client-tls\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.767169 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.767053 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6771cfc5-48ff-4141-9251-f539854216fc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.767169 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.767099 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4c5l\" (UniqueName: \"kubernetes.io/projected/6771cfc5-48ff-4141-9251-f539854216fc-kube-api-access-p4c5l\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.767169 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.767121 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6771cfc5-48ff-4141-9251-f539854216fc-telemeter-client-tls\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.767169 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.767146 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6771cfc5-48ff-4141-9251-f539854216fc-metrics-client-ca\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.767372 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.767201 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6771cfc5-48ff-4141-9251-f539854216fc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.868351 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.868318 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6771cfc5-48ff-4141-9251-f539854216fc-federate-client-tls\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.868532 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.868359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6771cfc5-48ff-4141-9251-f539854216fc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.868532 ip-10-0-138-34 kubenswrapper[2570]: I0428 
19:20:56.868386 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4c5l\" (UniqueName: \"kubernetes.io/projected/6771cfc5-48ff-4141-9251-f539854216fc-kube-api-access-p4c5l\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.868532 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.868408 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6771cfc5-48ff-4141-9251-f539854216fc-telemeter-client-tls\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.868532 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.868436 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6771cfc5-48ff-4141-9251-f539854216fc-metrics-client-ca\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.868532 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.868463 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6771cfc5-48ff-4141-9251-f539854216fc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.868532 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.868513 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6771cfc5-48ff-4141-9251-f539854216fc-serving-certs-ca-bundle\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.868882 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.868568 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6771cfc5-48ff-4141-9251-f539854216fc-secret-telemeter-client\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.869282 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.869253 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6771cfc5-48ff-4141-9251-f539854216fc-metrics-client-ca\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.869413 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.869389 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6771cfc5-48ff-4141-9251-f539854216fc-serving-certs-ca-bundle\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.869462 ip-10-0-138-34 kubenswrapper[2570]: I0428 
19:20:56.869388 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6771cfc5-48ff-4141-9251-f539854216fc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.871424 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.871388 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6771cfc5-48ff-4141-9251-f539854216fc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.871551 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.871536 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6771cfc5-48ff-4141-9251-f539854216fc-federate-client-tls\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.871656 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.871636 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6771cfc5-48ff-4141-9251-f539854216fc-telemeter-client-tls\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.871876 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.871861 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6771cfc5-48ff-4141-9251-f539854216fc-secret-telemeter-client\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.876904 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.876885 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4c5l\" (UniqueName: \"kubernetes.io/projected/6771cfc5-48ff-4141-9251-f539854216fc-kube-api-access-p4c5l\") pod \"telemeter-client-655b965ffc-gvwdf\" (UID: \"6771cfc5-48ff-4141-9251-f539854216fc\") " pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:56.965629 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.965523 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pzmft\"" Apr 28 19:20:56.965629 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.965523 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-lfxd4\"" Apr 28 19:20:56.965629 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.965527 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6qdcw\"" Apr 28 19:20:56.973795 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.973779 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l8dqg" Apr 28 19:20:56.973795 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.973789 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" Apr 28 19:20:56.973913 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:56.973830 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-98b4d" Apr 28 19:20:57.001268 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:57.000931 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" Apr 28 19:20:57.156868 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:57.156841 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l8dqg"] Apr 28 19:20:57.159230 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:20:57.159196 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d073d08_7217_4136_8485_03d574acfc52.slice/crio-7cbd9a593fbfd12eabad81650a12d4933d71be89d5e7a69079814dd4a46d594d WatchSource:0}: Error finding container 7cbd9a593fbfd12eabad81650a12d4933d71be89d5e7a69079814dd4a46d594d: Status 404 returned error can't find the container with id 7cbd9a593fbfd12eabad81650a12d4933d71be89d5e7a69079814dd4a46d594d Apr 28 19:20:57.170304 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:57.170186 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-98b4d"] Apr 28 19:20:57.172437 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:20:57.172408 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod801789ef_975e_451d_9e18_0cb9acd739d6.slice/crio-2beae5ed2175b94d903f02dfc66c957ff816d927ca34911b8900addb349b2533 WatchSource:0}: Error finding container 2beae5ed2175b94d903f02dfc66c957ff816d927ca34911b8900addb349b2533: Status 404 returned error can't find the container with id 2beae5ed2175b94d903f02dfc66c957ff816d927ca34911b8900addb349b2533 Apr 28 19:20:57.188977 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:57.188953 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-x865s"] Apr 28 19:20:57.191289 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:20:57.191250 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e4b5b64_ebf0_4ee2_a43b_35098459ff73.slice/crio-0a743351252faf809c46e050a7e34eb309a3479356db3ea1a27c807ae2cb7366 WatchSource:0}: Error finding container 0a743351252faf809c46e050a7e34eb309a3479356db3ea1a27c807ae2cb7366: Status 404 returned error can't find the container with id 0a743351252faf809c46e050a7e34eb309a3479356db3ea1a27c807ae2cb7366 Apr 28 19:20:57.223050 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:57.223030 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-655b965ffc-gvwdf"] Apr 28 19:20:57.224664 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:20:57.224635 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6771cfc5_48ff_4141_9251_f539854216fc.slice/crio-75f7e477724062a2762d781586c7a563803e390a1f46a008732c808afece8a66 WatchSource:0}: Error finding container 
75f7e477724062a2762d781586c7a563803e390a1f46a008732c808afece8a66: Status 404 returned error can't find the container with id 75f7e477724062a2762d781586c7a563803e390a1f46a008732c808afece8a66 Apr 28 19:20:57.674984 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:57.674876 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" event={"ID":"7e4b5b64-ebf0-4ee2-a43b-35098459ff73","Type":"ContainerStarted","Data":"0a743351252faf809c46e050a7e34eb309a3479356db3ea1a27c807ae2cb7366"} Apr 28 19:20:57.676135 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:57.676102 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-98b4d" event={"ID":"801789ef-975e-451d-9e18-0cb9acd739d6","Type":"ContainerStarted","Data":"2beae5ed2175b94d903f02dfc66c957ff816d927ca34911b8900addb349b2533"} Apr 28 19:20:57.677644 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:57.677590 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l8dqg" event={"ID":"0d073d08-7217-4136-8485-03d574acfc52","Type":"ContainerStarted","Data":"7cbd9a593fbfd12eabad81650a12d4933d71be89d5e7a69079814dd4a46d594d"} Apr 28 19:20:57.679057 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:20:57.679031 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" event={"ID":"6771cfc5-48ff-4141-9251-f539854216fc","Type":"ContainerStarted","Data":"75f7e477724062a2762d781586c7a563803e390a1f46a008732c808afece8a66"} Apr 28 19:21:00.690164 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:00.690122 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" event={"ID":"6771cfc5-48ff-4141-9251-f539854216fc","Type":"ContainerStarted","Data":"8504fa8c245a71a8d41b945e60ac31393eea49b21bbb0f7c69bb48cbe730d086"} Apr 28 19:21:00.690164 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:00.690167 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" event={"ID":"6771cfc5-48ff-4141-9251-f539854216fc","Type":"ContainerStarted","Data":"cf97f46ec6c0eca461b4102ad135f61adf23daeebd82da6b55186c23594b007e"} Apr 28 19:21:00.690634 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:00.690178 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" event={"ID":"6771cfc5-48ff-4141-9251-f539854216fc","Type":"ContainerStarted","Data":"4f9352147fecdaec9522849b8872820f7430b33a807b07101866cbac5573f964"} Apr 28 19:21:00.691377 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:00.691348 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" event={"ID":"7e4b5b64-ebf0-4ee2-a43b-35098459ff73","Type":"ContainerStarted","Data":"c7d3d2ab0c2eb21cbd4f2d278287d492173da3e307ade147fdb5bb3938275347"} Apr 28 19:21:00.692801 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:00.692781 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-98b4d" event={"ID":"801789ef-975e-451d-9e18-0cb9acd739d6","Type":"ContainerStarted","Data":"a4c9c278382b3dac33f5a0f5d9b30162926a7caa7cd422fc855af53753e83a33"} Apr 28 19:21:00.692859 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:00.692809 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-98b4d" 
event={"ID":"801789ef-975e-451d-9e18-0cb9acd739d6","Type":"ContainerStarted","Data":"8e5f1c19fdd898a7d917ea00b0a692984529096485c55651124677318676c301"} Apr 28 19:21:00.692903 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:00.692865 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-98b4d" Apr 28 19:21:00.693926 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:00.693901 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l8dqg" event={"ID":"0d073d08-7217-4136-8485-03d574acfc52","Type":"ContainerStarted","Data":"c5bf755e5ccba5ab8d6b033c7f8b305c30c816f300453c4eb37cc2d525c8eab4"} Apr 28 19:21:00.715825 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:00.715784 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-655b965ffc-gvwdf" podStartSLOduration=2.171992811 podStartE2EDuration="4.715770693s" podCreationTimestamp="2026-04-28 19:20:56 +0000 UTC" firstStartedPulling="2026-04-28 19:20:57.226464648 +0000 UTC m=+283.978014141" lastFinishedPulling="2026-04-28 19:20:59.770242515 +0000 UTC m=+286.521792023" observedRunningTime="2026-04-28 19:21:00.714411601 +0000 UTC m=+287.465961117" watchObservedRunningTime="2026-04-28 19:21:00.715770693 +0000 UTC m=+287.467320208" Apr 28 19:21:00.738533 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:00.738479 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-x865s" podStartSLOduration=284.166994144 podStartE2EDuration="4m46.738462014s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:20:57.193197485 +0000 UTC m=+283.944746982" lastFinishedPulling="2026-04-28 19:20:59.764665351 +0000 UTC m=+286.516214852" observedRunningTime="2026-04-28 19:21:00.737389498 +0000 UTC m=+287.488939043" watchObservedRunningTime="2026-04-28 19:21:00.738462014 +0000 UTC m=+287.490011529" Apr 28 19:21:00.783870 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:00.783815 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-l8dqg" podStartSLOduration=252.178978553 podStartE2EDuration="4m14.783796855s" podCreationTimestamp="2026-04-28 19:16:46 +0000 UTC" firstStartedPulling="2026-04-28 19:20:57.161691183 +0000 UTC m=+283.913240677" lastFinishedPulling="2026-04-28 19:20:59.766509471 +0000 UTC m=+286.518058979" observedRunningTime="2026-04-28 19:21:00.782501062 +0000 UTC m=+287.534050577" watchObservedRunningTime="2026-04-28 19:21:00.783796855 +0000 UTC m=+287.535346371" Apr 28 19:21:00.785087 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:00.785047 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-98b4d" podStartSLOduration=252.198012626 podStartE2EDuration="4m14.785032834s" podCreationTimestamp="2026-04-28 19:16:46 +0000 UTC" firstStartedPulling="2026-04-28 19:20:57.178235109 +0000 UTC m=+283.929784619" lastFinishedPulling="2026-04-28 19:20:59.765255328 +0000 UTC m=+286.516804827" observedRunningTime="2026-04-28 19:21:00.760995249 +0000 UTC m=+287.512544763" watchObservedRunningTime="2026-04-28 19:21:00.785032834 +0000 UTC m=+287.536582351" Apr 28 19:21:10.700177 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:10.700139 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-98b4d" Apr 28 19:21:13.801884 ip-10-0-138-34 
Apr 28 19:21:13.801884 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:13.801857 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:21:13.802363 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:13.802339 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:21:13.804419 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:21:13.804399 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 28 19:26:07.061620 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.061567 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jw2nc"] Apr 28 19:26:07.064782 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.064759 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:07.068116 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.068088 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 28 19:26:07.068245 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.068121 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 28 19:26:07.069009 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.068981 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 28 19:26:07.069387 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.069366 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 28 19:26:07.069489 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.069387 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 28 19:26:07.069489 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.069376 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-gkwcz\"" Apr 28 19:26:07.080192 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.080166 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jw2nc"] Apr 28 19:26:07.231705 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.231669 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwzqh\" (UniqueName: \"kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-kube-api-access-hwzqh\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:07.231705 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.231707 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:07.231922 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.231734 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName:
\"kubernetes.io/configmap/3d018771-bd5e-45d8-8c5a-439ad3b28285-cabundle0\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:07.332419 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.332341 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwzqh\" (UniqueName: \"kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-kube-api-access-hwzqh\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:07.332419 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.332377 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:07.332419 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.332403 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3d018771-bd5e-45d8-8c5a-439ad3b28285-cabundle0\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:07.332664 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:07.332504 2570 secret.go:281] references non-existent secret key: ca.crt Apr 28 19:26:07.332664 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:07.332522 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 28 19:26:07.332664 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:07.332534 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jw2nc: references non-existent secret key: ca.crt Apr 28 19:26:07.332664 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:07.332634 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates podName:3d018771-bd5e-45d8-8c5a-439ad3b28285 nodeName:}" failed. No retries permitted until 2026-04-28 19:26:07.832588639 +0000 UTC m=+594.584138146 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates") pod "keda-operator-ffbb595cb-jw2nc" (UID: "3d018771-bd5e-45d8-8c5a-439ad3b28285") : references non-existent secret key: ca.crt Apr 28 19:26:07.333094 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.333075 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3d018771-bd5e-45d8-8c5a-439ad3b28285-cabundle0\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:07.349096 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.349068 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwzqh\" (UniqueName: \"kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-kube-api-access-hwzqh\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:07.836944 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:07.836912 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:07.837106 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:07.837017 2570 secret.go:281] references non-existent secret key: ca.crt Apr 28 19:26:07.837106 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:07.837028 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 28 19:26:07.837106 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:07.837036 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jw2nc: references non-existent secret key: ca.crt Apr 28 19:26:07.837106 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:07.837082 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates podName:3d018771-bd5e-45d8-8c5a-439ad3b28285 nodeName:}" failed. No retries permitted until 2026-04-28 19:26:08.83707004 +0000 UTC m=+595.588619532 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates") pod "keda-operator-ffbb595cb-jw2nc" (UID: "3d018771-bd5e-45d8-8c5a-439ad3b28285") : references non-existent secret key: ca.crt Apr 28 19:26:08.847087 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:08.847052 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:08.847464 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:08.847208 2570 secret.go:281] references non-existent secret key: ca.crt Apr 28 19:26:08.847464 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:08.847231 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 28 19:26:08.847464 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:08.847240 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jw2nc: references non-existent secret key: ca.crt Apr 28 19:26:08.847464 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:08.847301 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates podName:3d018771-bd5e-45d8-8c5a-439ad3b28285 nodeName:}" failed. No retries permitted until 2026-04-28 19:26:10.847287268 +0000 UTC m=+597.598836760 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates") pod "keda-operator-ffbb595cb-jw2nc" (UID: "3d018771-bd5e-45d8-8c5a-439ad3b28285") : references non-existent secret key: ca.crt Apr 28 19:26:10.863343 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:10.863290 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:10.863741 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:10.863441 2570 secret.go:281] references non-existent secret key: ca.crt Apr 28 19:26:10.863741 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:10.863457 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 28 19:26:10.863741 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:10.863467 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jw2nc: references non-existent secret key: ca.crt Apr 28 19:26:10.863741 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:26:10.863532 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates podName:3d018771-bd5e-45d8-8c5a-439ad3b28285 nodeName:}" failed. No retries permitted until 2026-04-28 19:26:14.863516281 +0000 UTC m=+601.615065779 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates") pod "keda-operator-ffbb595cb-jw2nc" (UID: "3d018771-bd5e-45d8-8c5a-439ad3b28285") : references non-existent secret key: ca.crt Apr 28 19:26:13.826103 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:13.826072 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:26:13.826538 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:13.826370 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:26:14.895601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:14.895561 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:14.898010 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:14.897987 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3d018771-bd5e-45d8-8c5a-439ad3b28285-certificates\") pod \"keda-operator-ffbb595cb-jw2nc\" (UID: \"3d018771-bd5e-45d8-8c5a-439ad3b28285\") " pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:15.179865 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:15.179777 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-gkwcz\"" Apr 28 19:26:15.188210 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:15.188186 2570 util.go:30] "No sandbox for pod can be found. 
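The sequence above is the point of interest in this excerpt: the projected "certificates" volume for keda-operator-ffbb595cb-jw2nc fails MountVolume.SetUp four times because the openshift-keda/kedaorg-certs secret does not yet contain the referenced ca.crt key, and each failure reschedules the mount with double the previous delay (durationBeforeRetry 500ms, 1s, 2s, 4s) until the mount succeeds at 19:26:14.898, once the secret evidently carries the key (presumably populated in the meantime; the log does not show the writer). A minimal sketch of that retry schedule follows; the kubelet's real per-operation backoff lives in its nestedpendingoperations code and also caps the delay, which this excerpt never reaches and the sketch does not model:

```python
# Reproduces the durationBeforeRetry progression visible above: 500ms, 1s, 2s, 4s.
# Base delay and doubling factor are read off this log, not taken from kubelet source.
from datetime import timedelta

delay = timedelta(milliseconds=500)
for attempt in range(1, 5):
    print(f"attempt {attempt}: durationBeforeRetry {delay}")
    delay *= 2  # each failed MountVolume.SetUp doubles the wait before the next try
```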
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:15.319467 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:15.319337 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jw2nc"] Apr 28 19:26:15.322238 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:26:15.322209 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d018771_bd5e_45d8_8c5a_439ad3b28285.slice/crio-55ed521176703ca6446ede6f71f47a4aa63354521d74a8f329198b592cec15ec WatchSource:0}: Error finding container 55ed521176703ca6446ede6f71f47a4aa63354521d74a8f329198b592cec15ec: Status 404 returned error can't find the container with id 55ed521176703ca6446ede6f71f47a4aa63354521d74a8f329198b592cec15ec Apr 28 19:26:15.323469 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:15.323449 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:26:15.581165 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:15.581129 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" event={"ID":"3d018771-bd5e-45d8-8c5a-439ad3b28285","Type":"ContainerStarted","Data":"55ed521176703ca6446ede6f71f47a4aa63354521d74a8f329198b592cec15ec"} Apr 28 19:26:19.595688 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:19.595647 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" event={"ID":"3d018771-bd5e-45d8-8c5a-439ad3b28285","Type":"ContainerStarted","Data":"4d2c7ae8e23568f92e5ba9d588022dbbed37697b9b615be146d5e075c155727b"} Apr 28 19:26:19.596138 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:19.595811 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:26:19.617019 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:19.616963 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" podStartSLOduration=8.733812782 podStartE2EDuration="12.616949184s" podCreationTimestamp="2026-04-28 19:26:07 +0000 UTC" firstStartedPulling="2026-04-28 19:26:15.323583392 +0000 UTC m=+602.075132890" lastFinishedPulling="2026-04-28 19:26:19.206719799 +0000 UTC m=+605.958269292" observedRunningTime="2026-04-28 19:26:19.615441131 +0000 UTC m=+606.366990646" watchObservedRunningTime="2026-04-28 19:26:19.616949184 +0000 UTC m=+606.368498677" Apr 28 19:26:40.601149 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:26:40.601069 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-jw2nc" Apr 28 19:27:14.744782 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.744743 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-b85c69797-mbjnv"] Apr 28 19:27:14.748156 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.748135 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:14.750912 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.750888 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 28 19:27:14.751040 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.750932 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 28 19:27:14.751299 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.751281 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 28 19:27:14.751394 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.751299 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-4lzgj\"" Apr 28 19:27:14.751875 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.751856 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vrckr"] Apr 28 19:27:14.754879 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.754861 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" Apr 28 19:27:14.756963 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.756937 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-ttgf6\"" Apr 28 19:27:14.757121 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.757103 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 28 19:27:14.759177 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.759150 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-mbjnv"] Apr 28 19:27:14.762266 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.762248 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vrckr"] Apr 28 19:27:14.798099 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.798070 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr5h2\" (UniqueName: \"kubernetes.io/projected/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-kube-api-access-vr5h2\") pod \"kserve-controller-manager-b85c69797-mbjnv\" (UID: \"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c\") " pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:14.798257 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.798106 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-cert\") pod \"kserve-controller-manager-b85c69797-mbjnv\" (UID: \"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c\") " pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:14.798257 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.798146 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040ed491-9a87-4fdb-b568-1cc93ef448d9-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vrckr\" (UID: \"040ed491-9a87-4fdb-b568-1cc93ef448d9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" Apr 28 19:27:14.798257 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.798178 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cbr2\" (UniqueName: \"kubernetes.io/projected/040ed491-9a87-4fdb-b568-1cc93ef448d9-kube-api-access-9cbr2\") pod \"llmisvc-controller-manager-68cc5db7c4-vrckr\" (UID: \"040ed491-9a87-4fdb-b568-1cc93ef448d9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" Apr 28 19:27:14.898701 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.898661 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr5h2\" (UniqueName: \"kubernetes.io/projected/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-kube-api-access-vr5h2\") pod \"kserve-controller-manager-b85c69797-mbjnv\" (UID: \"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c\") " pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:14.898897 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.898710 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-cert\") pod \"kserve-controller-manager-b85c69797-mbjnv\" (UID: \"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c\") " pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:14.898897 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.898778 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040ed491-9a87-4fdb-b568-1cc93ef448d9-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vrckr\" (UID: \"040ed491-9a87-4fdb-b568-1cc93ef448d9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" Apr 28 19:27:14.898897 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.898823 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cbr2\" (UniqueName: \"kubernetes.io/projected/040ed491-9a87-4fdb-b568-1cc93ef448d9-kube-api-access-9cbr2\") pod \"llmisvc-controller-manager-68cc5db7c4-vrckr\" (UID: \"040ed491-9a87-4fdb-b568-1cc93ef448d9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" Apr 28 19:27:14.898897 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:27:14.898879 2570 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 28 19:27:14.899164 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:27:14.898939 2570 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 28 19:27:14.899164 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:27:14.898950 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-cert podName:b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c nodeName:}" failed. No retries permitted until 2026-04-28 19:27:15.39893279 +0000 UTC m=+662.150482291 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-cert") pod "kserve-controller-manager-b85c69797-mbjnv" (UID: "b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c") : secret "kserve-webhook-server-cert" not found Apr 28 19:27:14.899164 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:27:14.898984 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/040ed491-9a87-4fdb-b568-1cc93ef448d9-cert podName:040ed491-9a87-4fdb-b568-1cc93ef448d9 nodeName:}" failed. No retries permitted until 2026-04-28 19:27:15.398971926 +0000 UTC m=+662.150521422 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/040ed491-9a87-4fdb-b568-1cc93ef448d9-cert") pod "llmisvc-controller-manager-68cc5db7c4-vrckr" (UID: "040ed491-9a87-4fdb-b568-1cc93ef448d9") : secret "llmisvc-webhook-server-cert" not found Apr 28 19:27:14.907990 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.907960 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr5h2\" (UniqueName: \"kubernetes.io/projected/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-kube-api-access-vr5h2\") pod \"kserve-controller-manager-b85c69797-mbjnv\" (UID: \"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c\") " pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:14.908106 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:14.908090 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cbr2\" (UniqueName: \"kubernetes.io/projected/040ed491-9a87-4fdb-b568-1cc93ef448d9-kube-api-access-9cbr2\") pod \"llmisvc-controller-manager-68cc5db7c4-vrckr\" (UID: \"040ed491-9a87-4fdb-b568-1cc93ef448d9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" Apr 28 19:27:15.403123 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:15.403084 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-cert\") pod \"kserve-controller-manager-b85c69797-mbjnv\" (UID: \"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c\") " pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:15.403327 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:15.403164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040ed491-9a87-4fdb-b568-1cc93ef448d9-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vrckr\" (UID: \"040ed491-9a87-4fdb-b568-1cc93ef448d9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" Apr 28 19:27:15.405400 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:15.405376 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040ed491-9a87-4fdb-b568-1cc93ef448d9-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vrckr\" (UID: \"040ed491-9a87-4fdb-b568-1cc93ef448d9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" Apr 28 19:27:15.405533 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:15.405512 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-cert\") pod \"kserve-controller-manager-b85c69797-mbjnv\" (UID: \"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c\") " pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:15.658660 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:15.658557 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:15.667210 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:15.667187 2570 util.go:30] "No sandbox for pod can be found. 
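Contrast this failure mode with the earlier openshift-keda one: there the secret existed but lacked the referenced key ("references non-existent secret key: ca.crt"), while here the webhook cert secrets are absent outright ("secret "kserve-webhook-server-cert" not found" and "secret "llmisvc-webhook-server-cert" not found"), and both cert volumes mount on the first 500ms retry at 19:27:15.405 once the secrets exist. When triaging either message, a check along the following lines separates the two cases. This is a sketch using the official Kubernetes Python client (pip install kubernetes); the secret names come from this log, while tls.crt is only a typical webhook-cert key used for illustration, not one named in the log:

```python
# Distinguish "secret not found" from "secret exists but lacks the referenced key".
from kubernetes import client, config
from kubernetes.client.exceptions import ApiException

def check_secret_key(namespace: str, name: str, key: str) -> None:
    v1 = client.CoreV1Api()
    try:
        secret = v1.read_namespaced_secret(name, namespace)
    except ApiException as exc:
        if exc.status == 404:
            print(f"{namespace}/{name}: secret not found")  # the kserve case above
            return
        raise
    if key in (secret.data or {}):
        print(f"{namespace}/{name}: key {key} present")
    else:
        # the openshift-keda ca.crt case earlier in this log
        print(f"{namespace}/{name}: secret exists but lacks key {key}")

config.load_kube_config()  # assumes a kubeconfig with access to these namespaces
check_secret_key("kserve", "kserve-webhook-server-cert", "tls.crt")
check_secret_key("openshift-keda", "kedaorg-certs", "ca.crt")
```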
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" Apr 28 19:27:15.788672 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:15.788638 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-mbjnv"] Apr 28 19:27:15.791016 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:27:15.790984 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9a8c7f1_8b9e_44b5_9ef8_e5f3f02fd69c.slice/crio-a1b083028ce9744ca0933475d02cdd8909a898bfb0b1cf7c6880426a3084a180 WatchSource:0}: Error finding container a1b083028ce9744ca0933475d02cdd8909a898bfb0b1cf7c6880426a3084a180: Status 404 returned error can't find the container with id a1b083028ce9744ca0933475d02cdd8909a898bfb0b1cf7c6880426a3084a180 Apr 28 19:27:15.807273 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:15.807250 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vrckr"] Apr 28 19:27:15.809252 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:27:15.809224 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod040ed491_9a87_4fdb_b568_1cc93ef448d9.slice/crio-6882d638993f83323c563e8fa1ff14964b5e4eaf0743b45348f0a76885296959 WatchSource:0}: Error finding container 6882d638993f83323c563e8fa1ff14964b5e4eaf0743b45348f0a76885296959: Status 404 returned error can't find the container with id 6882d638993f83323c563e8fa1ff14964b5e4eaf0743b45348f0a76885296959 Apr 28 19:27:16.760582 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:16.760543 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-mbjnv" event={"ID":"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c","Type":"ContainerStarted","Data":"a1b083028ce9744ca0933475d02cdd8909a898bfb0b1cf7c6880426a3084a180"} Apr 28 19:27:16.762811 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:16.762778 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" event={"ID":"040ed491-9a87-4fdb-b568-1cc93ef448d9","Type":"ContainerStarted","Data":"6882d638993f83323c563e8fa1ff14964b5e4eaf0743b45348f0a76885296959"} Apr 28 19:27:19.773455 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:19.773412 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-mbjnv" event={"ID":"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c","Type":"ContainerStarted","Data":"a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df"} Apr 28 19:27:19.773947 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:19.773663 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:19.774726 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:19.774695 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" event={"ID":"040ed491-9a87-4fdb-b568-1cc93ef448d9","Type":"ContainerStarted","Data":"def203301147535f695492e91ab559e217af1bf68ed6e5bde70123dd66c9a2fa"} Apr 28 19:27:19.774854 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:19.774783 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" Apr 28 19:27:19.790569 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:19.790511 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-b85c69797-mbjnv" podStartSLOduration=2.413251605 podStartE2EDuration="5.790491576s" podCreationTimestamp="2026-04-28 19:27:14 +0000 UTC" firstStartedPulling="2026-04-28 19:27:15.792566685 +0000 UTC m=+662.544116177" lastFinishedPulling="2026-04-28 19:27:19.169806651 +0000 UTC m=+665.921356148" observedRunningTime="2026-04-28 19:27:19.789836461 +0000 UTC m=+666.541386000" watchObservedRunningTime="2026-04-28 19:27:19.790491576 +0000 UTC m=+666.542041092" Apr 28 19:27:19.805553 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:19.805501 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" podStartSLOduration=2.45150501 podStartE2EDuration="5.805481068s" podCreationTimestamp="2026-04-28 19:27:14 +0000 UTC" firstStartedPulling="2026-04-28 19:27:15.810519062 +0000 UTC m=+662.562068555" lastFinishedPulling="2026-04-28 19:27:19.16449512 +0000 UTC m=+665.916044613" observedRunningTime="2026-04-28 19:27:19.80397566 +0000 UTC m=+666.555525177" watchObservedRunningTime="2026-04-28 19:27:19.805481068 +0000 UTC m=+666.557030584" Apr 28 19:27:50.780630 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:50.780584 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vrckr" Apr 28 19:27:50.783815 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:50.783787 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:52.132076 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.132037 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-mbjnv"] Apr 28 19:27:52.132521 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.132265 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-b85c69797-mbjnv" podUID="b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c" containerName="manager" containerID="cri-o://a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df" gracePeriod=10 Apr 28 19:27:52.152526 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.152486 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-b85c69797-hj7b5"] Apr 28 19:27:52.155744 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.155720 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-hj7b5" Apr 28 19:27:52.164336 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.164302 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-hj7b5"] Apr 28 19:27:52.321157 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.321127 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f426977-d622-4e4f-af1c-2db016314ae7-cert\") pod \"kserve-controller-manager-b85c69797-hj7b5\" (UID: \"3f426977-d622-4e4f-af1c-2db016314ae7\") " pod="kserve/kserve-controller-manager-b85c69797-hj7b5" Apr 28 19:27:52.321319 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.321190 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5l7s\" (UniqueName: \"kubernetes.io/projected/3f426977-d622-4e4f-af1c-2db016314ae7-kube-api-access-j5l7s\") pod \"kserve-controller-manager-b85c69797-hj7b5\" (UID: \"3f426977-d622-4e4f-af1c-2db016314ae7\") " pod="kserve/kserve-controller-manager-b85c69797-hj7b5" Apr 28 19:27:52.373575 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.373552 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:52.421984 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.421893 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5l7s\" (UniqueName: \"kubernetes.io/projected/3f426977-d622-4e4f-af1c-2db016314ae7-kube-api-access-j5l7s\") pod \"kserve-controller-manager-b85c69797-hj7b5\" (UID: \"3f426977-d622-4e4f-af1c-2db016314ae7\") " pod="kserve/kserve-controller-manager-b85c69797-hj7b5" Apr 28 19:27:52.421984 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.421964 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f426977-d622-4e4f-af1c-2db016314ae7-cert\") pod \"kserve-controller-manager-b85c69797-hj7b5\" (UID: \"3f426977-d622-4e4f-af1c-2db016314ae7\") " pod="kserve/kserve-controller-manager-b85c69797-hj7b5" Apr 28 19:27:52.424512 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.424486 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f426977-d622-4e4f-af1c-2db016314ae7-cert\") pod \"kserve-controller-manager-b85c69797-hj7b5\" (UID: \"3f426977-d622-4e4f-af1c-2db016314ae7\") " pod="kserve/kserve-controller-manager-b85c69797-hj7b5" Apr 28 19:27:52.430798 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.430777 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5l7s\" (UniqueName: \"kubernetes.io/projected/3f426977-d622-4e4f-af1c-2db016314ae7-kube-api-access-j5l7s\") pod \"kserve-controller-manager-b85c69797-hj7b5\" (UID: \"3f426977-d622-4e4f-af1c-2db016314ae7\") " pod="kserve/kserve-controller-manager-b85c69797-hj7b5" Apr 28 19:27:52.511655 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.511595 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-hj7b5" Apr 28 19:27:52.522582 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.522559 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-cert\") pod \"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c\" (UID: \"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c\") " Apr 28 19:27:52.522701 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.522594 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr5h2\" (UniqueName: \"kubernetes.io/projected/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-kube-api-access-vr5h2\") pod \"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c\" (UID: \"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c\") " Apr 28 19:27:52.524807 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.524776 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-kube-api-access-vr5h2" (OuterVolumeSpecName: "kube-api-access-vr5h2") pod "b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c" (UID: "b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c"). InnerVolumeSpecName "kube-api-access-vr5h2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:27:52.524908 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.524861 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-cert" (OuterVolumeSpecName: "cert") pod "b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c" (UID: "b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:27:52.624124 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.624081 2570 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-cert\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:27:52.624124 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.624109 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vr5h2\" (UniqueName: \"kubernetes.io/projected/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c-kube-api-access-vr5h2\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:27:52.630928 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.630903 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-hj7b5"] Apr 28 19:27:52.633580 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:27:52.633551 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f426977_d622_4e4f_af1c_2db016314ae7.slice/crio-f9153cbd2df4a009a981afd9b38bcc32b20b66dd706b397aa245e0107722a0d3 WatchSource:0}: Error finding container f9153cbd2df4a009a981afd9b38bcc32b20b66dd706b397aa245e0107722a0d3: Status 404 returned error can't find the container with id f9153cbd2df4a009a981afd9b38bcc32b20b66dd706b397aa245e0107722a0d3 Apr 28 19:27:52.874585 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.874543 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-hj7b5" event={"ID":"3f426977-d622-4e4f-af1c-2db016314ae7","Type":"ContainerStarted","Data":"f9153cbd2df4a009a981afd9b38bcc32b20b66dd706b397aa245e0107722a0d3"} Apr 28 19:27:52.875753 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.875726 2570 generic.go:358] "Generic (PLEG): container 
finished" podID="b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c" containerID="a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df" exitCode=0 Apr 28 19:27:52.875895 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.875809 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-mbjnv" Apr 28 19:27:52.875895 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.875811 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-mbjnv" event={"ID":"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c","Type":"ContainerDied","Data":"a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df"} Apr 28 19:27:52.875895 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.875851 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-mbjnv" event={"ID":"b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c","Type":"ContainerDied","Data":"a1b083028ce9744ca0933475d02cdd8909a898bfb0b1cf7c6880426a3084a180"} Apr 28 19:27:52.875895 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.875870 2570 scope.go:117] "RemoveContainer" containerID="a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df" Apr 28 19:27:52.884189 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.884168 2570 scope.go:117] "RemoveContainer" containerID="a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df" Apr 28 19:27:52.884458 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:27:52.884436 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df\": container with ID starting with a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df not found: ID does not exist" containerID="a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df" Apr 28 19:27:52.884559 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.884467 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df"} err="failed to get container status \"a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df\": rpc error: code = NotFound desc = could not find container \"a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df\": container with ID starting with a4a22c06b762e2dfd00c5d0dd0b0976fb9d0b558d29353c05563417fc0ba02df not found: ID does not exist" Apr 28 19:27:52.896388 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.896356 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-mbjnv"] Apr 28 19:27:52.899903 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:52.899877 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-mbjnv"] Apr 28 19:27:53.880505 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:53.880463 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-hj7b5" event={"ID":"3f426977-d622-4e4f-af1c-2db016314ae7","Type":"ContainerStarted","Data":"63c9c2ee779c3b399a2593e347d93ca7d76d476fbe4f3207d9ae0d890f18bb8a"} Apr 28 19:27:53.880975 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:53.880569 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-b85c69797-hj7b5" Apr 28 19:27:53.887468 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:53.887438 
2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c" path="/var/lib/kubelet/pods/b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c/volumes" Apr 28 19:27:53.898234 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:27:53.898188 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-b85c69797-hj7b5" podStartSLOduration=1.618798275 podStartE2EDuration="1.898174321s" podCreationTimestamp="2026-04-28 19:27:52 +0000 UTC" firstStartedPulling="2026-04-28 19:27:52.634970055 +0000 UTC m=+699.386519548" lastFinishedPulling="2026-04-28 19:27:52.914346102 +0000 UTC m=+699.665895594" observedRunningTime="2026-04-28 19:27:53.896129139 +0000 UTC m=+700.647678654" watchObservedRunningTime="2026-04-28 19:27:53.898174321 +0000 UTC m=+700.649723848" Apr 28 19:28:24.888849 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:24.888815 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b85c69797-hj7b5" Apr 28 19:28:42.477306 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.477266 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-f792c"] Apr 28 19:28:42.477796 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.477758 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c" containerName="manager" Apr 28 19:28:42.477796 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.477777 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c" containerName="manager" Apr 28 19:28:42.477910 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.477871 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9a8c7f1-8b9e-44b5-9ef8-e5f3f02fd69c" containerName="manager" Apr 28 19:28:42.480796 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.480775 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-f792c"
Apr 28 19:28:42.483564 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.483545 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-z8xsc\""
Apr 28 19:28:42.483673 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.483574 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 28 19:28:42.488729 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.488709 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-f792c"]
Apr 28 19:28:42.544641 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.544600 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7vdf\" (UniqueName: \"kubernetes.io/projected/3bd0ebf6-fcf4-4f72-825f-be5e630230a3-kube-api-access-l7vdf\") pod \"s3-init-f792c\" (UID: \"3bd0ebf6-fcf4-4f72-825f-be5e630230a3\") " pod="kserve/s3-init-f792c"
Apr 28 19:28:42.645292 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.645262 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7vdf\" (UniqueName: \"kubernetes.io/projected/3bd0ebf6-fcf4-4f72-825f-be5e630230a3-kube-api-access-l7vdf\") pod \"s3-init-f792c\" (UID: \"3bd0ebf6-fcf4-4f72-825f-be5e630230a3\") " pod="kserve/s3-init-f792c"
Apr 28 19:28:42.653987 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.653960 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7vdf\" (UniqueName: \"kubernetes.io/projected/3bd0ebf6-fcf4-4f72-825f-be5e630230a3-kube-api-access-l7vdf\") pod \"s3-init-f792c\" (UID: \"3bd0ebf6-fcf4-4f72-825f-be5e630230a3\") " pod="kserve/s3-init-f792c"
Apr 28 19:28:42.789936 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.789901 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-f792c"
Apr 28 19:28:42.907298 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:42.907241 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-f792c"]
Apr 28 19:28:42.910110 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:28:42.910084 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bd0ebf6_fcf4_4f72_825f_be5e630230a3.slice/crio-156680a9d557e325b678d9e3c31111175acc74e41ad95420e71f2011d5101aef WatchSource:0}: Error finding container 156680a9d557e325b678d9e3c31111175acc74e41ad95420e71f2011d5101aef: Status 404 returned error can't find the container with id 156680a9d557e325b678d9e3c31111175acc74e41ad95420e71f2011d5101aef
Apr 28 19:28:43.022772 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:43.022731 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-f792c" event={"ID":"3bd0ebf6-fcf4-4f72-825f-be5e630230a3","Type":"ContainerStarted","Data":"156680a9d557e325b678d9e3c31111175acc74e41ad95420e71f2011d5101aef"}
Apr 28 19:28:48.041057 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:48.041018 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-f792c" event={"ID":"3bd0ebf6-fcf4-4f72-825f-be5e630230a3","Type":"ContainerStarted","Data":"6fe820cd2641a15ecf905912648d14d892f72cd58b4e4006310a6be3451d5d7d"}
Apr 28 19:28:48.057046 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:48.056997 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-f792c" podStartSLOduration=1.695305952 podStartE2EDuration="6.05698277s" podCreationTimestamp="2026-04-28 19:28:42 +0000 UTC" firstStartedPulling="2026-04-28 19:28:42.9118876 +0000 UTC m=+749.663437093" lastFinishedPulling="2026-04-28 19:28:47.273564406 +0000 UTC m=+754.025113911" observedRunningTime="2026-04-28 19:28:48.055298633 +0000 UTC m=+754.806848148" watchObservedRunningTime="2026-04-28 19:28:48.05698277 +0000 UTC m=+754.808532285"
Apr 28 19:28:51.051342 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:51.051309 2570 generic.go:358] "Generic (PLEG): container finished" podID="3bd0ebf6-fcf4-4f72-825f-be5e630230a3" containerID="6fe820cd2641a15ecf905912648d14d892f72cd58b4e4006310a6be3451d5d7d" exitCode=0
Apr 28 19:28:51.051816 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:51.051385 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-f792c" event={"ID":"3bd0ebf6-fcf4-4f72-825f-be5e630230a3","Type":"ContainerDied","Data":"6fe820cd2641a15ecf905912648d14d892f72cd58b4e4006310a6be3451d5d7d"}
Apr 28 19:28:52.179030 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:52.179004 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-f792c"
Apr 28 19:28:52.230092 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:52.230058 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7vdf\" (UniqueName: \"kubernetes.io/projected/3bd0ebf6-fcf4-4f72-825f-be5e630230a3-kube-api-access-l7vdf\") pod \"3bd0ebf6-fcf4-4f72-825f-be5e630230a3\" (UID: \"3bd0ebf6-fcf4-4f72-825f-be5e630230a3\") "
Apr 28 19:28:52.232049 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:52.232022 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd0ebf6-fcf4-4f72-825f-be5e630230a3-kube-api-access-l7vdf" (OuterVolumeSpecName: "kube-api-access-l7vdf") pod "3bd0ebf6-fcf4-4f72-825f-be5e630230a3" (UID: "3bd0ebf6-fcf4-4f72-825f-be5e630230a3"). InnerVolumeSpecName "kube-api-access-l7vdf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:28:52.330780 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:52.330691 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l7vdf\" (UniqueName: \"kubernetes.io/projected/3bd0ebf6-fcf4-4f72-825f-be5e630230a3-kube-api-access-l7vdf\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\""
Apr 28 19:28:53.058462 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:53.058427 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-f792c" event={"ID":"3bd0ebf6-fcf4-4f72-825f-be5e630230a3","Type":"ContainerDied","Data":"156680a9d557e325b678d9e3c31111175acc74e41ad95420e71f2011d5101aef"}
Apr 28 19:28:53.058462 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:53.058461 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="156680a9d557e325b678d9e3c31111175acc74e41ad95420e71f2011d5101aef"
Apr 28 19:28:53.058462 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:28:53.058442 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-f792c"
Apr 28 19:29:02.408226 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.408184 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"]
Apr 28 19:29:02.408887 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.408538 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bd0ebf6-fcf4-4f72-825f-be5e630230a3" containerName="s3-init"
Apr 28 19:29:02.408887 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.408549 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd0ebf6-fcf4-4f72-825f-be5e630230a3" containerName="s3-init"
Apr 28 19:29:02.408887 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.408618 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bd0ebf6-fcf4-4f72-825f-be5e630230a3" containerName="s3-init"
Apr 28 19:29:02.411996 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.411968 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
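The records above are a complete, healthy lifecycle for the short-lived kserve/s3-init-f792c pod: API ADD/UPDATE, projected-volume mount, sandbox and container start, a roughly 4.4 s image pull (firstStartedPulling 19:28:42.91 to lastFinishedPulling 19:28:47.27, consistent with the ~6.06 s podStartE2EDuration), exit code 0, then unmount and teardown within a second of completion. A minimal client-go sketch that would observe the same lifecycle from the API side; the kubeconfig source and the flat error handling are assumptions for illustration, not anything this log shows:

package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Out-of-cluster config from the default kubeconfig location (assumption;
	// code running inside a pod would use rest.InClusterConfig instead).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Watch the same pod the kubelet is reconciling in the records above.
	w, err := cs.CoreV1().Pods("kserve").Watch(context.Background(), metav1.ListOptions{
		FieldSelector: "metadata.name=s3-init-f792c",
	})
	if err != nil {
		panic(err)
	}
	defer w.Stop()
	for ev := range w.ResultChan() {
		if pod, ok := ev.Object.(*corev1.Pod); ok {
			// Prints ADDED/MODIFIED/DELETED as the phase moves Pending -> Running -> Succeeded.
			fmt.Printf("%s phase=%s\n", ev.Type, pod.Status.Phase)
		}
	}
}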
Apr 28 19:29:02.414891 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.414862 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 28 19:29:02.415031 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.414911 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 28 19:29:02.415031 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.414972 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e1324-kube-rbac-proxy-sar-config\""
Apr 28 19:29:02.415812 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.415790 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e1324-predictor-serving-cert\""
Apr 28 19:29:02.415903 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.415826 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-fkxbx\""
Apr 28 19:29:02.424355 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.424327 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"]
Apr 28 19:29:02.516356 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.516320 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-e1324-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56a93064-6013-47ba-8543-0becf0fb0fb4-error-404-isvc-e1324-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e1324-predictor-69874b497-dbzwh\" (UID: \"56a93064-6013-47ba-8543-0becf0fb0fb4\") " pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:02.516531 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.516375 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfxmm\" (UniqueName: \"kubernetes.io/projected/56a93064-6013-47ba-8543-0becf0fb0fb4-kube-api-access-bfxmm\") pod \"error-404-isvc-e1324-predictor-69874b497-dbzwh\" (UID: \"56a93064-6013-47ba-8543-0becf0fb0fb4\") " pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:02.516586 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.516548 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56a93064-6013-47ba-8543-0becf0fb0fb4-proxy-tls\") pod \"error-404-isvc-e1324-predictor-69874b497-dbzwh\" (UID: \"56a93064-6013-47ba-8543-0becf0fb0fb4\") " pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:02.617229 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.617195 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-e1324-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56a93064-6013-47ba-8543-0becf0fb0fb4-error-404-isvc-e1324-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e1324-predictor-69874b497-dbzwh\" (UID: \"56a93064-6013-47ba-8543-0becf0fb0fb4\") " pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:02.617409 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.617239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfxmm\" (UniqueName: \"kubernetes.io/projected/56a93064-6013-47ba-8543-0becf0fb0fb4-kube-api-access-bfxmm\") pod \"error-404-isvc-e1324-predictor-69874b497-dbzwh\" (UID: \"56a93064-6013-47ba-8543-0becf0fb0fb4\") " pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:02.617409 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.617301 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56a93064-6013-47ba-8543-0becf0fb0fb4-proxy-tls\") pod \"error-404-isvc-e1324-predictor-69874b497-dbzwh\" (UID: \"56a93064-6013-47ba-8543-0becf0fb0fb4\") " pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:02.617957 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.617937 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-e1324-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56a93064-6013-47ba-8543-0becf0fb0fb4-error-404-isvc-e1324-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e1324-predictor-69874b497-dbzwh\" (UID: \"56a93064-6013-47ba-8543-0becf0fb0fb4\") " pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:02.619621 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.619590 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56a93064-6013-47ba-8543-0becf0fb0fb4-proxy-tls\") pod \"error-404-isvc-e1324-predictor-69874b497-dbzwh\" (UID: \"56a93064-6013-47ba-8543-0becf0fb0fb4\") " pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:02.625225 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.625199 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfxmm\" (UniqueName: \"kubernetes.io/projected/56a93064-6013-47ba-8543-0becf0fb0fb4-kube-api-access-bfxmm\") pod \"error-404-isvc-e1324-predictor-69874b497-dbzwh\" (UID: \"56a93064-6013-47ba-8543-0becf0fb0fb4\") " pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:02.655102 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.655071 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"]
Apr 28 19:29:02.658914 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.658867 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
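Every pod in these records carries an automatically injected kube-api-access-* projected volume (the bfxmm/l7vdf/4qdtq suffixes are random per pod), which bundles the service-account token, the cluster CA, and the namespace into one mount. Inside the container it lands at the conventional in-cluster path; a small stdlib sketch, assuming only that standard layout:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// Standard mount point of the kube-api-access-* projected volume inside a
// container; token, ca.crt, and namespace are its projected sources.
const saDir = "/var/run/secrets/kubernetes.io/serviceaccount"

func main() {
	for _, name := range []string{"token", "ca.crt", "namespace"} {
		b, err := os.ReadFile(filepath.Join(saDir, name))
		if err != nil {
			fmt.Printf("%s: unavailable (not running in a pod?): %v\n", name, err)
			continue
		}
		fmt.Printf("%s: %d bytes\n", name, len(b))
	}
}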
Apr 28 19:29:02.661305 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.661281 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\""
Apr 28 19:29:02.661406 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.661368 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\""
Apr 28 19:29:02.674865 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.674822 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"]
Apr 28 19:29:02.717952 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.717913 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/273c4306-c7c4-4a66-98bd-d68e8649ae68-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-pcrdj\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:02.717952 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.717955 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/273c4306-c7c4-4a66-98bd-d68e8649ae68-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-pcrdj\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:02.718151 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.718038 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/273c4306-c7c4-4a66-98bd-d68e8649ae68-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-pcrdj\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:02.718151 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.718076 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qdtq\" (UniqueName: \"kubernetes.io/projected/273c4306-c7c4-4a66-98bd-d68e8649ae68-kube-api-access-4qdtq\") pod \"isvc-xgboost-graph-predictor-669d8d6456-pcrdj\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:02.724826 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.724785 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:02.819264 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.819219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/273c4306-c7c4-4a66-98bd-d68e8649ae68-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-pcrdj\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:02.819511 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.819319 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qdtq\" (UniqueName: \"kubernetes.io/projected/273c4306-c7c4-4a66-98bd-d68e8649ae68-kube-api-access-4qdtq\") pod \"isvc-xgboost-graph-predictor-669d8d6456-pcrdj\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:02.819511 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.819351 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/273c4306-c7c4-4a66-98bd-d68e8649ae68-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-pcrdj\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:02.819511 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.819385 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/273c4306-c7c4-4a66-98bd-d68e8649ae68-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-pcrdj\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:02.819798 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.819773 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/273c4306-c7c4-4a66-98bd-d68e8649ae68-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-pcrdj\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:02.819987 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.819966 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/273c4306-c7c4-4a66-98bd-d68e8649ae68-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-pcrdj\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:02.823036 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.823007 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/273c4306-c7c4-4a66-98bd-d68e8649ae68-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-pcrdj\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:02.826987 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.826962 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qdtq\" (UniqueName: \"kubernetes.io/projected/273c4306-c7c4-4a66-98bd-d68e8649ae68-kube-api-access-4qdtq\") pod \"isvc-xgboost-graph-predictor-669d8d6456-pcrdj\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:02.850590 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.850564 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"]
Apr 28 19:29:02.853224 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:29:02.853187 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56a93064_6013_47ba_8543_0becf0fb0fb4.slice/crio-af804606776ebfa20ef4978ccc9f43b0c395378b69f42f89b21ffe264fa3295b WatchSource:0}: Error finding container af804606776ebfa20ef4978ccc9f43b0c395378b69f42f89b21ffe264fa3295b: Status 404 returned error can't find the container with id af804606776ebfa20ef4978ccc9f43b0c395378b69f42f89b21ffe264fa3295b
Apr 28 19:29:02.971963 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:02.971867 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:03.089515 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.089479 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" event={"ID":"56a93064-6013-47ba-8543-0becf0fb0fb4","Type":"ContainerStarted","Data":"af804606776ebfa20ef4978ccc9f43b0c395378b69f42f89b21ffe264fa3295b"}
Apr 28 19:29:03.101258 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.101232 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"]
Apr 28 19:29:03.103415 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:29:03.103388 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod273c4306_c7c4_4a66_98bd_d68e8649ae68.slice/crio-860371165f71bfd85a004622ebff9c5ae25ee5bebe172e5be2f5e7d4d5647d8c WatchSource:0}: Error finding container 860371165f71bfd85a004622ebff9c5ae25ee5bebe172e5be2f5e7d4d5647d8c: Status 404 returned error can't find the container with id 860371165f71bfd85a004622ebff9c5ae25ee5bebe172e5be2f5e7d4d5647d8c
Apr 28 19:29:03.255591 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.255565 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"]
Apr 28 19:29:03.260399 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.260381 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
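The manager.go:1169 warnings above (and the earlier one for the s3-init pod) look like cAdvisor racing container creation: the cgroup watch fires before the runtime can answer a status query for the brand-new container ID, so the lookup returns 404. In each case here a PLEG ContainerStarted event for the same ID follows within moments, which is the usual sign the warning was transient rather than a real failure. A quick stdlib cross-check over a journal dump like this one, pairing "Error finding container" IDs with later ContainerStarted events (reads stdin; the matching heuristics are assumptions):

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
)

// For each container ID that cadvisor reported as "Error finding container",
// check whether a PLEG ContainerStarted event for the same ID appears later;
// if every 404 has a matching start, the warnings were only startup races.
func main() {
	notFound := map[string]bool{}
	started := map[string]bool{}
	hexID := regexp.MustCompile(`[0-9a-f]{64}`)
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 64*1024), 4*1024*1024) // journal lines are long
	for sc.Scan() {
		line := sc.Text()
		switch {
		case strings.Contains(line, "Error finding container"):
			if id := hexID.FindString(line); id != "" {
				notFound[id] = true
			}
		case strings.Contains(line, `"Type":"ContainerStarted"`):
			if id := hexID.FindString(line); id != "" {
				started[id] = true
			}
		}
	}
	for id := range notFound {
		fmt.Printf("%s... started-later=%v\n", id[:12], started[id])
	}
}

Piping this section through the program would report started-later=true for every warned-about ID, including af804606... and 86037116... above.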
Apr 28 19:29:03.262766 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.262745 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\""
Apr 28 19:29:03.262873 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.262749 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\""
Apr 28 19:29:03.269256 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.269232 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"]
Apr 28 19:29:03.323691 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.323645 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a5b296b-494e-46cf-8143-4b52f1b06ad2-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-67687994d4-tgjwl\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:03.323882 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.323714 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a5b296b-494e-46cf-8143-4b52f1b06ad2-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-67687994d4-tgjwl\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:03.323882 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.323776 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a5b296b-494e-46cf-8143-4b52f1b06ad2-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-67687994d4-tgjwl\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:03.323882 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.323800 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvlcr\" (UniqueName: \"kubernetes.io/projected/6a5b296b-494e-46cf-8143-4b52f1b06ad2-kube-api-access-qvlcr\") pod \"isvc-sklearn-graph-2-predictor-67687994d4-tgjwl\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:03.425190 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.425149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a5b296b-494e-46cf-8143-4b52f1b06ad2-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-67687994d4-tgjwl\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:03.425691 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.425216 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a5b296b-494e-46cf-8143-4b52f1b06ad2-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-67687994d4-tgjwl\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:03.425691 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.425277 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a5b296b-494e-46cf-8143-4b52f1b06ad2-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-67687994d4-tgjwl\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:03.425691 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.425306 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvlcr\" (UniqueName: \"kubernetes.io/projected/6a5b296b-494e-46cf-8143-4b52f1b06ad2-kube-api-access-qvlcr\") pod \"isvc-sklearn-graph-2-predictor-67687994d4-tgjwl\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:03.425867 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.425751 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a5b296b-494e-46cf-8143-4b52f1b06ad2-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-67687994d4-tgjwl\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:03.426201 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.426174 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a5b296b-494e-46cf-8143-4b52f1b06ad2-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-67687994d4-tgjwl\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:03.428207 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.428179 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a5b296b-494e-46cf-8143-4b52f1b06ad2-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-67687994d4-tgjwl\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:03.434556 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.434531 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvlcr\" (UniqueName: \"kubernetes.io/projected/6a5b296b-494e-46cf-8143-4b52f1b06ad2-kube-api-access-qvlcr\") pod \"isvc-sklearn-graph-2-predictor-67687994d4-tgjwl\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:03.572065 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.571977 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
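The volume records for isvc-sklearn-graph-2 show kubelet's two-phase handling: reconciler_common.go:251 first verifies the controller reports each volume attached (VerifyControllerAttachedVolume), and only then do reconciler_common.go:224 and operation_generator.go:615 perform the local MountVolume.SetUp. The pod spec itself is not in this log, but the mounts are consistent with a volume set like the following sketch (volume names copied from the log; the sources are inferred from the UniqueName prefixes, and the secret name is an assumption based on the serving-cert reflector line):

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// Illustrative volume set for the sklearn-graph-2 predictor pod; treat it as
// a reconstruction, not the actual manifest.
func main() {
	vols := []corev1.Volume{
		{Name: "kserve-provision-location", VolumeSource: corev1.VolumeSource{
			EmptyDir: &corev1.EmptyDirVolumeSource{},
		}},
		{Name: "proxy-tls", VolumeSource: corev1.VolumeSource{
			Secret: &corev1.SecretVolumeSource{SecretName: "isvc-sklearn-graph-2-predictor-serving-cert"},
		}},
		{Name: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config", VolumeSource: corev1.VolumeSource{
			ConfigMap: &corev1.ConfigMapVolumeSource{
				LocalObjectReference: corev1.LocalObjectReference{Name: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config"},
			},
		}},
		// kube-api-access-qvlcr is a projected volume injected automatically;
		// it never appears in the pod spec the user writes.
	}
	for _, v := range vols {
		fmt.Println(v.Name)
	}
}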
Apr 28 19:29:03.779295 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:03.779270 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"]
Apr 28 19:29:03.783487 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:29:03.783449 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a5b296b_494e_46cf_8143_4b52f1b06ad2.slice/crio-4ae5978e5df2803e65e878a07aedfba7b423d4daa55c11e90c019bdb5da8835b WatchSource:0}: Error finding container 4ae5978e5df2803e65e878a07aedfba7b423d4daa55c11e90c019bdb5da8835b: Status 404 returned error can't find the container with id 4ae5978e5df2803e65e878a07aedfba7b423d4daa55c11e90c019bdb5da8835b
Apr 28 19:29:04.100163 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:04.100102 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" event={"ID":"273c4306-c7c4-4a66-98bd-d68e8649ae68","Type":"ContainerStarted","Data":"860371165f71bfd85a004622ebff9c5ae25ee5bebe172e5be2f5e7d4d5647d8c"}
Apr 28 19:29:04.105714 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:04.105650 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" event={"ID":"6a5b296b-494e-46cf-8143-4b52f1b06ad2","Type":"ContainerStarted","Data":"4ae5978e5df2803e65e878a07aedfba7b423d4daa55c11e90c019bdb5da8835b"}
Apr 28 19:29:17.158775 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:17.158731 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" event={"ID":"6a5b296b-494e-46cf-8143-4b52f1b06ad2","Type":"ContainerStarted","Data":"20bcf93d3e97893491fabbd9858fe840a3986f45e7a31cd75aa1e3464221b1cd"}
Apr 28 19:29:17.160231 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:17.160199 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" event={"ID":"273c4306-c7c4-4a66-98bd-d68e8649ae68","Type":"ContainerStarted","Data":"5942a624de4f899d6af85694659db38092a644e08f73bb3131f350bd79333172"}
Apr 28 19:29:18.166562 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:18.166519 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" event={"ID":"56a93064-6013-47ba-8543-0becf0fb0fb4","Type":"ContainerStarted","Data":"60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91"}
Apr 28 19:29:20.177294 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:20.177200 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" event={"ID":"56a93064-6013-47ba-8543-0becf0fb0fb4","Type":"ContainerStarted","Data":"f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b"}
Apr 28 19:29:20.177742 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:20.177421 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:20.177742 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:20.177451 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:20.178742 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:20.178697 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 28 19:29:20.194725 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:20.194671 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" podStartSLOduration=1.130259037 podStartE2EDuration="18.194656128s" podCreationTimestamp="2026-04-28 19:29:02 +0000 UTC" firstStartedPulling="2026-04-28 19:29:02.855406848 +0000 UTC m=+769.606956341" lastFinishedPulling="2026-04-28 19:29:19.919803932 +0000 UTC m=+786.671353432" observedRunningTime="2026-04-28 19:29:20.194008186 +0000 UTC m=+786.945557698" watchObservedRunningTime="2026-04-28 19:29:20.194656128 +0000 UTC m=+786.946205644"
Apr 28 19:29:21.181505 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:21.181473 2570 generic.go:358] "Generic (PLEG): container finished" podID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerID="20bcf93d3e97893491fabbd9858fe840a3986f45e7a31cd75aa1e3464221b1cd" exitCode=0
Apr 28 19:29:21.182046 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:21.181552 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" event={"ID":"6a5b296b-494e-46cf-8143-4b52f1b06ad2","Type":"ContainerDied","Data":"20bcf93d3e97893491fabbd9858fe840a3986f45e7a31cd75aa1e3464221b1cd"}
Apr 28 19:29:21.188479 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:21.188455 2570 generic.go:358] "Generic (PLEG): container finished" podID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerID="5942a624de4f899d6af85694659db38092a644e08f73bb3131f350bd79333172" exitCode=0
Apr 28 19:29:21.188576 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:21.188507 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" event={"ID":"273c4306-c7c4-4a66-98bd-d68e8649ae68","Type":"ContainerDied","Data":"5942a624de4f899d6af85694659db38092a644e08f73bb3131f350bd79333172"}
Apr 28 19:29:21.189134 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:21.189103 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 28 19:29:26.195249 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:26.194638 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:29:26.195249 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:26.195089 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 28 19:29:28.222460 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:28.222376 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" event={"ID":"6a5b296b-494e-46cf-8143-4b52f1b06ad2","Type":"ContainerStarted","Data":"2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce"}
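The pattern from 19:29:20 onward repeats for every predictor: PLEG reports kserve-container started, but the readiness probe dials <podIP>:8080 and gets connection refused until the model server actually binds the port, so the pod stays not-ready. The latency tracker line above puts the e1324 predictor's end-to-end start at ~18.2 s, of which ~17.1 s was the image pull (19:29:02.855 to 19:29:19.919). A TCP readiness probe consistent with what prober.go logs here would look roughly like the sketch below; PeriodSeconds and FailureThreshold are assumptions, since the probe spec is not part of this log:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

// TCP readiness probe shaped like what the prober output suggests: a plain
// dial against the container's port 8080 on a fixed period.
func main() {
	probe := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			TCPSocket: &corev1.TCPSocketAction{Port: intstr.FromInt(8080)},
		},
		PeriodSeconds:    10, // assumption, inferred from the failure cadence below
		FailureThreshold: 3,  // assumption
	}
	fmt.Printf("%+v\n", probe)
}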
Apr 28 19:29:28.222460 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:28.222428 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" event={"ID":"6a5b296b-494e-46cf-8143-4b52f1b06ad2","Type":"ContainerStarted","Data":"d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133"}
Apr 28 19:29:28.222950 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:28.222771 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:28.222950 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:28.222915 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:28.224589 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:28.224561 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 28 19:29:28.243948 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:28.243897 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podStartSLOduration=1.176818768 podStartE2EDuration="25.24388515s" podCreationTimestamp="2026-04-28 19:29:03 +0000 UTC" firstStartedPulling="2026-04-28 19:29:03.785920021 +0000 UTC m=+770.537469520" lastFinishedPulling="2026-04-28 19:29:27.852986396 +0000 UTC m=+794.604535902" observedRunningTime="2026-04-28 19:29:28.241374189 +0000 UTC m=+794.992923716" watchObservedRunningTime="2026-04-28 19:29:28.24388515 +0000 UTC m=+794.995434665"
Apr 28 19:29:29.226405 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:29.226354 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 28 19:29:34.231651 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:34.231555 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"
Apr 28 19:29:34.232225 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:34.232192 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 28 19:29:36.195735 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:36.195691 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 28 19:29:41.275750 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:41.275707 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" event={"ID":"273c4306-c7c4-4a66-98bd-d68e8649ae68","Type":"ContainerStarted","Data":"28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea"}
Apr 28 19:29:41.275750 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:41.275749 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" event={"ID":"273c4306-c7c4-4a66-98bd-d68e8649ae68","Type":"ContainerStarted","Data":"4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04"}
Apr 28 19:29:41.276300 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:41.275979 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:41.298033 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:41.297971 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" podStartSLOduration=1.533421854 podStartE2EDuration="39.297953417s" podCreationTimestamp="2026-04-28 19:29:02 +0000 UTC" firstStartedPulling="2026-04-28 19:29:03.105353036 +0000 UTC m=+769.856902528" lastFinishedPulling="2026-04-28 19:29:40.869884594 +0000 UTC m=+807.621434091" observedRunningTime="2026-04-28 19:29:41.296597439 +0000 UTC m=+808.048146955" watchObservedRunningTime="2026-04-28 19:29:41.297953417 +0000 UTC m=+808.049502934"
Apr 28 19:29:42.278917 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:42.278885 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:42.280222 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:42.280183 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 28 19:29:43.281782 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:43.281744 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 28 19:29:44.232983 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:44.232945 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 28 19:29:46.196160 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:46.196114 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 28 19:29:48.290002 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:48.289973 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"
Apr 28 19:29:48.290599 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:48.290569 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
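Note the cadence: for a given pod, the "Probe failed" records arrive roughly every ten seconds (the xgboost predictor at 19:29:42, :48, :58, 19:30:08, and so on), consistent with a ten-second probe period. What the prober does on each tick is essentially a timed TCP dial, sketched below with a pod IP taken from the log; running this outside the cluster's pod network would fail with a timeout rather than a refusal, so treat it purely as an illustration:

package main

import (
	"fmt"
	"net"
	"time"
)

// The essence of a kubelet TCP readiness check: one timed dial per period.
func main() {
	for i := 0; i < 3; i++ {
		conn, err := net.DialTimeout("tcp", "10.132.0.29:8080", time.Second)
		if err != nil {
			fmt.Println("probe failed:", err) // e.g. "connect: connection refused"
		} else {
			conn.Close()
			fmt.Println("probe succeeded")
		}
		time.Sleep(10 * time.Second)
	}
}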
Apr 28 19:29:54.233011 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:54.232973 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 28 19:29:56.195889 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:56.195848 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 28 19:29:58.291052 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:29:58.291012 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 28 19:30:04.232277 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:04.232234 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 28 19:30:06.195630 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:06.195574 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 28 19:30:08.290659 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:08.290596 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 28 19:30:14.232837 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:14.232791 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 28 19:30:16.195767 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:16.195736 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:30:18.291421 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:18.291379 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 28 19:30:24.232600 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:24.232559 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 28 19:30:28.291421 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:28.291375 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 28 19:30:32.533546 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.533509 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"]
Apr 28 19:30:32.533950 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.533804 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kserve-container" containerID="cri-o://60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91" gracePeriod=30
Apr 28 19:30:32.533950 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.533838 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kube-rbac-proxy" containerID="cri-o://f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b" gracePeriod=30
Apr 28 19:30:32.693588 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.693549 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"]
Apr 28 19:30:32.697205 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.697177 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
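The SyncLoop DELETE at 19:30:32.533 turns into two "Killing container with a grace period" records with gracePeriod=30: kubelet asks CRI-O to deliver SIGTERM and only escalates to SIGKILL if the container outlives the 30 s window. The records further below show both containers gone well inside that window (kube-rbac-proxy with exitCode=2, kserve-container with exitCode=0 about four seconds later). A container that wants to use the grace period cleanly just has to handle SIGTERM, e.g. (a minimal sketch; the 2 s sleep stands in for real drain work):

package main

import (
	"context"
	"fmt"
	"os/signal"
	"syscall"
	"time"
)

// Block until SIGTERM, drain, then return 0 before the kill deadline.
func main() {
	ctx, stop := signal.NotifyContext(context.Background(), syscall.SIGTERM)
	defer stop()
	<-ctx.Done()
	fmt.Println("SIGTERM received; draining in-flight requests")
	time.Sleep(2 * time.Second)
	fmt.Println("drained; exiting 0")
}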
Apr 28 19:30:32.700262 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.700235 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-0600b-kube-rbac-proxy-sar-config\""
Apr 28 19:30:32.700398 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.700271 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-0600b-predictor-serving-cert\""
Apr 28 19:30:32.708892 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.708869 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"]
Apr 28 19:30:32.749574 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.749543 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-0600b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8a144135-6c0c-4cde-9faa-c8676d399381-error-404-isvc-0600b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-0600b-predictor-cf96ff6c-ccj24\" (UID: \"8a144135-6c0c-4cde-9faa-c8676d399381\") " pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
Apr 28 19:30:32.749741 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.749685 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffknb\" (UniqueName: \"kubernetes.io/projected/8a144135-6c0c-4cde-9faa-c8676d399381-kube-api-access-ffknb\") pod \"error-404-isvc-0600b-predictor-cf96ff6c-ccj24\" (UID: \"8a144135-6c0c-4cde-9faa-c8676d399381\") " pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
Apr 28 19:30:32.749795 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.749774 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a144135-6c0c-4cde-9faa-c8676d399381-proxy-tls\") pod \"error-404-isvc-0600b-predictor-cf96ff6c-ccj24\" (UID: \"8a144135-6c0c-4cde-9faa-c8676d399381\") " pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
Apr 28 19:30:32.850476 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.850386 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a144135-6c0c-4cde-9faa-c8676d399381-proxy-tls\") pod \"error-404-isvc-0600b-predictor-cf96ff6c-ccj24\" (UID: \"8a144135-6c0c-4cde-9faa-c8676d399381\") " pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
Apr 28 19:30:32.850476 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.850438 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-0600b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8a144135-6c0c-4cde-9faa-c8676d399381-error-404-isvc-0600b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-0600b-predictor-cf96ff6c-ccj24\" (UID: \"8a144135-6c0c-4cde-9faa-c8676d399381\") " pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
Apr 28 19:30:32.850692 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.850493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffknb\" (UniqueName: \"kubernetes.io/projected/8a144135-6c0c-4cde-9faa-c8676d399381-kube-api-access-ffknb\") pod \"error-404-isvc-0600b-predictor-cf96ff6c-ccj24\" (UID: \"8a144135-6c0c-4cde-9faa-c8676d399381\") " pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
Apr 28 19:30:32.851179 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.851157 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-0600b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8a144135-6c0c-4cde-9faa-c8676d399381-error-404-isvc-0600b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-0600b-predictor-cf96ff6c-ccj24\" (UID: \"8a144135-6c0c-4cde-9faa-c8676d399381\") " pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
Apr 28 19:30:32.852878 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.852858 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a144135-6c0c-4cde-9faa-c8676d399381-proxy-tls\") pod \"error-404-isvc-0600b-predictor-cf96ff6c-ccj24\" (UID: \"8a144135-6c0c-4cde-9faa-c8676d399381\") " pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
Apr 28 19:30:32.859123 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:32.859102 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffknb\" (UniqueName: \"kubernetes.io/projected/8a144135-6c0c-4cde-9faa-c8676d399381-kube-api-access-ffknb\") pod \"error-404-isvc-0600b-predictor-cf96ff6c-ccj24\" (UID: \"8a144135-6c0c-4cde-9faa-c8676d399381\") " pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
Apr 28 19:30:33.008397 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:33.008351 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
Apr 28 19:30:33.141787 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:33.141759 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"]
Apr 28 19:30:33.143538 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:30:33.143514 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a144135_6c0c_4cde_9faa_c8676d399381.slice/crio-19f27febe927e96f71313cb45c7d35385e6a3c7aa43f28257e4ae82e558344ac WatchSource:0}: Error finding container 19f27febe927e96f71313cb45c7d35385e6a3c7aa43f28257e4ae82e558344ac: Status 404 returned error can't find the container with id 19f27febe927e96f71313cb45c7d35385e6a3c7aa43f28257e4ae82e558344ac
Apr 28 19:30:33.461193 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:33.461088 2570 generic.go:358] "Generic (PLEG): container finished" podID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerID="f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b" exitCode=2
Apr 28 19:30:33.461193 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:33.461170 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" event={"ID":"56a93064-6013-47ba-8543-0becf0fb0fb4","Type":"ContainerDied","Data":"f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b"}
Apr 28 19:30:33.462685 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:33.462659 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" event={"ID":"8a144135-6c0c-4cde-9faa-c8676d399381","Type":"ContainerStarted","Data":"39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a"}
Apr 28 19:30:33.462685 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:33.462688 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" event={"ID":"8a144135-6c0c-4cde-9faa-c8676d399381","Type":"ContainerStarted","Data":"9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8"}
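The replacement revision's pod (error-404-isvc-0600b) starts in about 1.5 s, versus ~18 s for e1324: the latency tracker record just below reports firstStartedPulling and lastFinishedPulling as zero-value timestamps, i.e. no pull happened because the image was already on the node. The reported podStartE2EDuration is simply watchObservedRunningTime minus podCreationTimestamp, which is easy to verify with the values copied from that record:

package main

import (
	"fmt"
	"time"
)

// Reproduce the 0600b pod's E2E startup duration from the tracker record.
func main() {
	created, _ := time.Parse(time.RFC3339, "2026-04-28T19:30:32Z")
	running, _ := time.Parse(time.RFC3339, "2026-04-28T19:30:33.481152792Z")
	fmt.Println(running.Sub(created)) // 1.481152792s, matching podStartE2EDuration
}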
Apr 28 19:30:33.462871 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:33.462697 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" event={"ID":"8a144135-6c0c-4cde-9faa-c8676d399381","Type":"ContainerStarted","Data":"19f27febe927e96f71313cb45c7d35385e6a3c7aa43f28257e4ae82e558344ac"}
Apr 28 19:30:33.462871 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:33.462845 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
Apr 28 19:30:33.481215 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:33.481165 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" podStartSLOduration=1.481152792 podStartE2EDuration="1.481152792s" podCreationTimestamp="2026-04-28 19:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:30:33.478882108 +0000 UTC m=+860.230431634" watchObservedRunningTime="2026-04-28 19:30:33.481152792 +0000 UTC m=+860.232702307"
Apr 28 19:30:34.232703 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:34.232663 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 28 19:30:34.466091 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:34.466055 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"
Apr 28 19:30:34.467657 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:34.467601 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 28 19:30:35.469077 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:35.469036 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 28 19:30:36.189329 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.189283 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.28:8643/healthz\": dial tcp 10.132.0.28:8643: connect: connection refused"
Apr 28 19:30:36.195642 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.195591 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 28 19:30:36.384041 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.384013 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:30:36.473946 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.473864 2570 generic.go:358] "Generic (PLEG): container finished" podID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerID="60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91" exitCode=0
Apr 28 19:30:36.473946 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.473935 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"
Apr 28 19:30:36.474365 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.473943 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" event={"ID":"56a93064-6013-47ba-8543-0becf0fb0fb4","Type":"ContainerDied","Data":"60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91"}
Apr 28 19:30:36.474365 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.473980 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh" event={"ID":"56a93064-6013-47ba-8543-0becf0fb0fb4","Type":"ContainerDied","Data":"af804606776ebfa20ef4978ccc9f43b0c395378b69f42f89b21ffe264fa3295b"}
Apr 28 19:30:36.474365 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.473997 2570 scope.go:117] "RemoveContainer" containerID="f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b"
Apr 28 19:30:36.482534 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.482521 2570 scope.go:117] "RemoveContainer" containerID="60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91"
Apr 28 19:30:36.483427 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.483408 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56a93064-6013-47ba-8543-0becf0fb0fb4-proxy-tls\") pod \"56a93064-6013-47ba-8543-0becf0fb0fb4\" (UID: \"56a93064-6013-47ba-8543-0becf0fb0fb4\") "
Apr 28 19:30:36.483495 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.483457 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-e1324-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56a93064-6013-47ba-8543-0becf0fb0fb4-error-404-isvc-e1324-kube-rbac-proxy-sar-config\") pod \"56a93064-6013-47ba-8543-0becf0fb0fb4\" (UID: \"56a93064-6013-47ba-8543-0becf0fb0fb4\") "
Apr 28 19:30:36.483542 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.483496 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfxmm\" (UniqueName: \"kubernetes.io/projected/56a93064-6013-47ba-8543-0becf0fb0fb4-kube-api-access-bfxmm\") pod \"56a93064-6013-47ba-8543-0becf0fb0fb4\" (UID: \"56a93064-6013-47ba-8543-0becf0fb0fb4\") "
Apr 28 19:30:36.483923 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.483891 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a93064-6013-47ba-8543-0becf0fb0fb4-error-404-isvc-e1324-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-e1324-kube-rbac-proxy-sar-config") pod "56a93064-6013-47ba-8543-0becf0fb0fb4" (UID: "56a93064-6013-47ba-8543-0becf0fb0fb4"). InnerVolumeSpecName "error-404-isvc-e1324-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:30:36.485577 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.485551 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a93064-6013-47ba-8543-0becf0fb0fb4-kube-api-access-bfxmm" (OuterVolumeSpecName: "kube-api-access-bfxmm") pod "56a93064-6013-47ba-8543-0becf0fb0fb4" (UID: "56a93064-6013-47ba-8543-0becf0fb0fb4"). InnerVolumeSpecName "kube-api-access-bfxmm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:30:36.485724 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.485700 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a93064-6013-47ba-8543-0becf0fb0fb4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "56a93064-6013-47ba-8543-0becf0fb0fb4" (UID: "56a93064-6013-47ba-8543-0becf0fb0fb4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:30:36.502494 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.502475 2570 scope.go:117] "RemoveContainer" containerID="f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b"
Apr 28 19:30:36.502834 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:30:36.502812 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b\": container with ID starting with f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b not found: ID does not exist" containerID="f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b"
Apr 28 19:30:36.502913 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.502844 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b"} err="failed to get container status \"f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b\": rpc error: code = NotFound desc = could not find container \"f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b\": container with ID starting with f02ad7ede69c55fe40f6f2cde5fc0091af46545a3ba1d1a6f2f7696cabe1512b not found: ID does not exist"
Apr 28 19:30:36.502913 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.502867 2570 scope.go:117] "RemoveContainer" containerID="60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91"
Apr 28 19:30:36.503159 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:30:36.503132 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91\": container with ID starting with 60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91 not found: ID does not exist" containerID="60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91"
Apr 28 19:30:36.503265 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.503163 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91"} err="failed to get container status \"60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91\": rpc error: code = NotFound desc = could not find container \"60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91\": container with ID starting with 60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91 not found: ID does not exist"
with ID starting with 60872bbf20672b163f9c9fed0e66ddc535de6c2fe59f3a379caacd26aa706f91 not found: ID does not exist" Apr 28 19:30:36.584235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.584194 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56a93064-6013-47ba-8543-0becf0fb0fb4-proxy-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:30:36.584235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.584226 2570 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-e1324-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/56a93064-6013-47ba-8543-0becf0fb0fb4-error-404-isvc-e1324-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:30:36.584235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.584241 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bfxmm\" (UniqueName: \"kubernetes.io/projected/56a93064-6013-47ba-8543-0becf0fb0fb4-kube-api-access-bfxmm\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:30:36.794667 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.794636 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"] Apr 28 19:30:36.801051 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:36.801023 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e1324-predictor-69874b497-dbzwh"] Apr 28 19:30:37.889485 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:37.889446 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" path="/var/lib/kubelet/pods/56a93064-6013-47ba-8543-0becf0fb0fb4/volumes" Apr 28 19:30:38.291533 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:38.291488 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 28 19:30:40.473455 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:40.473427 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" Apr 28 19:30:40.473991 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:40.473966 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 28 19:30:44.232754 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:44.232725 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" Apr 28 19:30:48.291660 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:48.291599 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" Apr 28 19:30:50.474023 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:30:50.473985 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.31:8080: connect: connection refused" Apr 28 19:31:00.473997 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:00.473957 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 28 19:31:10.474408 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:10.474366 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 28 19:31:12.226160 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.226125 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"] Apr 28 19:31:12.227009 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.226974 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" containerID="cri-o://d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133" gracePeriod=30 Apr 28 19:31:12.227139 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.227023 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kube-rbac-proxy" containerID="cri-o://2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce" gracePeriod=30 Apr 28 19:31:12.379910 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.379870 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"] Apr 28 19:31:12.380279 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.380243 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kserve-container" containerID="cri-o://4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04" gracePeriod=30 Apr 28 19:31:12.380492 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.380305 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kube-rbac-proxy" containerID="cri-o://28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea" gracePeriod=30 Apr 28 19:31:12.394440 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.394414 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7"] Apr 28 19:31:12.394871 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.394854 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kube-rbac-proxy" Apr 28 19:31:12.394961 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.394874 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kube-rbac-proxy" Apr 28 19:31:12.394961 ip-10-0-138-34 
kubenswrapper[2570]: I0428 19:31:12.394928 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kserve-container" Apr 28 19:31:12.394961 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.394938 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kserve-container" Apr 28 19:31:12.395165 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.395018 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kube-rbac-proxy" Apr 28 19:31:12.395165 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.395035 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="56a93064-6013-47ba-8543-0becf0fb0fb4" containerName="kserve-container" Apr 28 19:31:12.398169 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.398151 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:12.402003 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.401976 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6c7b3-predictor-serving-cert\"" Apr 28 19:31:12.402003 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.401976 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6c7b3-kube-rbac-proxy-sar-config\"" Apr 28 19:31:12.409854 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.409827 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7"] Apr 28 19:31:12.483638 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.483538 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-6c7b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad932821-ff2d-48e0-bc3e-f385f539f10c-error-404-isvc-6c7b3-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:12.483638 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.483577 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld685\" (UniqueName: \"kubernetes.io/projected/ad932821-ff2d-48e0-bc3e-f385f539f10c-kube-api-access-ld685\") pod \"error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:12.483638 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.483626 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad932821-ff2d-48e0-bc3e-f385f539f10c-proxy-tls\") pod \"error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:12.584974 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.584939 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad932821-ff2d-48e0-bc3e-f385f539f10c-proxy-tls\") pod 
\"error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:12.585146 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.585037 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-6c7b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad932821-ff2d-48e0-bc3e-f385f539f10c-error-404-isvc-6c7b3-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:12.585146 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.585061 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ld685\" (UniqueName: \"kubernetes.io/projected/ad932821-ff2d-48e0-bc3e-f385f539f10c-kube-api-access-ld685\") pod \"error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:12.585146 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:31:12.585091 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-serving-cert: secret "error-404-isvc-6c7b3-predictor-serving-cert" not found Apr 28 19:31:12.585292 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:31:12.585159 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad932821-ff2d-48e0-bc3e-f385f539f10c-proxy-tls podName:ad932821-ff2d-48e0-bc3e-f385f539f10c nodeName:}" failed. No retries permitted until 2026-04-28 19:31:13.085142705 +0000 UTC m=+899.836692197 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ad932821-ff2d-48e0-bc3e-f385f539f10c-proxy-tls") pod "error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" (UID: "ad932821-ff2d-48e0-bc3e-f385f539f10c") : secret "error-404-isvc-6c7b3-predictor-serving-cert" not found Apr 28 19:31:12.585708 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.585687 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-6c7b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad932821-ff2d-48e0-bc3e-f385f539f10c-error-404-isvc-6c7b3-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:12.595421 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.595394 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld685\" (UniqueName: \"kubernetes.io/projected/ad932821-ff2d-48e0-bc3e-f385f539f10c-kube-api-access-ld685\") pod \"error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:12.603801 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.603770 2570 generic.go:358] "Generic (PLEG): container finished" podID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerID="28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea" exitCode=2 Apr 28 19:31:12.603960 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.603849 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" event={"ID":"273c4306-c7c4-4a66-98bd-d68e8649ae68","Type":"ContainerDied","Data":"28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea"} Apr 28 19:31:12.606511 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.606483 2570 generic.go:358] "Generic (PLEG): container finished" podID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerID="2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce" exitCode=2 Apr 28 19:31:12.606647 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:12.606519 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" event={"ID":"6a5b296b-494e-46cf-8143-4b52f1b06ad2","Type":"ContainerDied","Data":"2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce"} Apr 28 19:31:13.091015 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:13.090973 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad932821-ff2d-48e0-bc3e-f385f539f10c-proxy-tls\") pod \"error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:13.093373 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:13.093354 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad932821-ff2d-48e0-bc3e-f385f539f10c-proxy-tls\") pod \"error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:13.283033 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:13.282986 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.29:8643/healthz\": dial tcp 10.132.0.29:8643: connect: connection refused" Apr 28 19:31:13.309448 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:13.309412 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:13.431473 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:13.431447 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7"] Apr 28 19:31:13.434004 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:31:13.433971 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad932821_ff2d_48e0_bc3e_f385f539f10c.slice/crio-dc8b108e5e4e0b0e6ed23672acbbfab67f235e8f0c4be07d58e87375c1671a8c WatchSource:0}: Error finding container dc8b108e5e4e0b0e6ed23672acbbfab67f235e8f0c4be07d58e87375c1671a8c: Status 404 returned error can't find the container with id dc8b108e5e4e0b0e6ed23672acbbfab67f235e8f0c4be07d58e87375c1671a8c Apr 28 19:31:13.612216 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:13.612126 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" event={"ID":"ad932821-ff2d-48e0-bc3e-f385f539f10c","Type":"ContainerStarted","Data":"7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8"} Apr 28 19:31:13.612216 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:13.612170 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" event={"ID":"ad932821-ff2d-48e0-bc3e-f385f539f10c","Type":"ContainerStarted","Data":"34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5"} Apr 28 19:31:13.612216 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:13.612185 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" event={"ID":"ad932821-ff2d-48e0-bc3e-f385f539f10c","Type":"ContainerStarted","Data":"dc8b108e5e4e0b0e6ed23672acbbfab67f235e8f0c4be07d58e87375c1671a8c"} Apr 28 19:31:13.612442 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:13.612295 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:13.631078 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:13.631028 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" podStartSLOduration=1.631014956 podStartE2EDuration="1.631014956s" podCreationTimestamp="2026-04-28 19:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:31:13.63025591 +0000 UTC m=+900.381805429" watchObservedRunningTime="2026-04-28 19:31:13.631014956 +0000 UTC m=+900.382564471" Apr 28 19:31:13.851785 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:13.851762 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:31:13.851877 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:13.851772 2570 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:31:14.227433 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:14.227390 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 28 19:31:14.232859 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:14.232827 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 28 19:31:14.615411 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:14.615375 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:14.616687 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:14.616658 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 28 19:31:15.618780 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:15.618743 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 28 19:31:16.317048 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.317023 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" Apr 28 19:31:16.416949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.416860 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/273c4306-c7c4-4a66-98bd-d68e8649ae68-kserve-provision-location\") pod \"273c4306-c7c4-4a66-98bd-d68e8649ae68\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " Apr 28 19:31:16.416949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.416928 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/273c4306-c7c4-4a66-98bd-d68e8649ae68-proxy-tls\") pod \"273c4306-c7c4-4a66-98bd-d68e8649ae68\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " Apr 28 19:31:16.417176 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.416959 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/273c4306-c7c4-4a66-98bd-d68e8649ae68-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"273c4306-c7c4-4a66-98bd-d68e8649ae68\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " Apr 28 19:31:16.417176 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.416994 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qdtq\" (UniqueName: \"kubernetes.io/projected/273c4306-c7c4-4a66-98bd-d68e8649ae68-kube-api-access-4qdtq\") pod \"273c4306-c7c4-4a66-98bd-d68e8649ae68\" (UID: \"273c4306-c7c4-4a66-98bd-d68e8649ae68\") " Apr 28 19:31:16.417284 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.417182 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/273c4306-c7c4-4a66-98bd-d68e8649ae68-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "273c4306-c7c4-4a66-98bd-d68e8649ae68" (UID: "273c4306-c7c4-4a66-98bd-d68e8649ae68"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:31:16.417340 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.417313 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/273c4306-c7c4-4a66-98bd-d68e8649ae68-kserve-provision-location\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:31:16.417392 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.417359 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/273c4306-c7c4-4a66-98bd-d68e8649ae68-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "273c4306-c7c4-4a66-98bd-d68e8649ae68" (UID: "273c4306-c7c4-4a66-98bd-d68e8649ae68"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:31:16.418965 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.418942 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/273c4306-c7c4-4a66-98bd-d68e8649ae68-kube-api-access-4qdtq" (OuterVolumeSpecName: "kube-api-access-4qdtq") pod "273c4306-c7c4-4a66-98bd-d68e8649ae68" (UID: "273c4306-c7c4-4a66-98bd-d68e8649ae68"). InnerVolumeSpecName "kube-api-access-4qdtq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:31:16.419045 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.418975 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273c4306-c7c4-4a66-98bd-d68e8649ae68-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "273c4306-c7c4-4a66-98bd-d68e8649ae68" (UID: "273c4306-c7c4-4a66-98bd-d68e8649ae68"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:31:16.518336 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.518302 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/273c4306-c7c4-4a66-98bd-d68e8649ae68-proxy-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:31:16.518336 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.518331 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/273c4306-c7c4-4a66-98bd-d68e8649ae68-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:31:16.518336 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.518342 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4qdtq\" (UniqueName: \"kubernetes.io/projected/273c4306-c7c4-4a66-98bd-d68e8649ae68-kube-api-access-4qdtq\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:31:16.623625 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.623578 2570 generic.go:358] "Generic (PLEG): container finished" podID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerID="4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04" exitCode=0 Apr 28 19:31:16.624001 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.623638 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" event={"ID":"273c4306-c7c4-4a66-98bd-d68e8649ae68","Type":"ContainerDied","Data":"4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04"} Apr 28 19:31:16.624001 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.623685 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" event={"ID":"273c4306-c7c4-4a66-98bd-d68e8649ae68","Type":"ContainerDied","Data":"860371165f71bfd85a004622ebff9c5ae25ee5bebe172e5be2f5e7d4d5647d8c"} Apr 28 19:31:16.624001 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.623707 2570 scope.go:117] "RemoveContainer" containerID="28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea" Apr 28 19:31:16.624001 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.623715 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj" Apr 28 19:31:16.665205 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.665179 2570 scope.go:117] "RemoveContainer" containerID="4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04" Apr 28 19:31:16.673423 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.673367 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"] Apr 28 19:31:16.674088 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.674054 2570 scope.go:117] "RemoveContainer" containerID="5942a624de4f899d6af85694659db38092a644e08f73bb3131f350bd79333172" Apr 28 19:31:16.679494 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.679470 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-pcrdj"] Apr 28 19:31:16.682457 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.682440 2570 scope.go:117] "RemoveContainer" containerID="28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea" Apr 28 19:31:16.682747 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:31:16.682730 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea\": container with ID starting with 28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea not found: ID does not exist" containerID="28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea" Apr 28 19:31:16.682804 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.682755 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea"} err="failed to get container status \"28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea\": rpc error: code = NotFound desc = could not find container \"28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea\": container with ID starting with 28eb2a8feda494a75efdae70542246bd9f27279bc5c61044af6087dc4666afea not found: ID does not exist" Apr 28 19:31:16.682804 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.682774 2570 scope.go:117] "RemoveContainer" containerID="4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04" Apr 28 19:31:16.683031 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:31:16.683006 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04\": container with ID starting with 4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04 not found: ID does not exist" containerID="4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04" Apr 28 19:31:16.683078 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.683041 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04"} err="failed to get container status \"4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04\": rpc error: code = NotFound desc = could not find container \"4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04\": container with ID starting with 4b02871b7bdf538a6a7bbbcb698586cbdffd843c666915cb5e4aefd0e65a0f04 not found: ID does not exist" Apr 28 19:31:16.683078 ip-10-0-138-34 kubenswrapper[2570]: I0428 
19:31:16.683064 2570 scope.go:117] "RemoveContainer" containerID="5942a624de4f899d6af85694659db38092a644e08f73bb3131f350bd79333172" Apr 28 19:31:16.683330 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:31:16.683306 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5942a624de4f899d6af85694659db38092a644e08f73bb3131f350bd79333172\": container with ID starting with 5942a624de4f899d6af85694659db38092a644e08f73bb3131f350bd79333172 not found: ID does not exist" containerID="5942a624de4f899d6af85694659db38092a644e08f73bb3131f350bd79333172" Apr 28 19:31:16.683382 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.683340 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5942a624de4f899d6af85694659db38092a644e08f73bb3131f350bd79333172"} err="failed to get container status \"5942a624de4f899d6af85694659db38092a644e08f73bb3131f350bd79333172\": rpc error: code = NotFound desc = could not find container \"5942a624de4f899d6af85694659db38092a644e08f73bb3131f350bd79333172\": container with ID starting with 5942a624de4f899d6af85694659db38092a644e08f73bb3131f350bd79333172 not found: ID does not exist" Apr 28 19:31:16.754745 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.754721 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" Apr 28 19:31:16.821089 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.821052 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a5b296b-494e-46cf-8143-4b52f1b06ad2-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " Apr 28 19:31:16.821282 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.821168 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a5b296b-494e-46cf-8143-4b52f1b06ad2-proxy-tls\") pod \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " Apr 28 19:31:16.821282 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.821220 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvlcr\" (UniqueName: \"kubernetes.io/projected/6a5b296b-494e-46cf-8143-4b52f1b06ad2-kube-api-access-qvlcr\") pod \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " Apr 28 19:31:16.821282 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.821253 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a5b296b-494e-46cf-8143-4b52f1b06ad2-kserve-provision-location\") pod \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\" (UID: \"6a5b296b-494e-46cf-8143-4b52f1b06ad2\") " Apr 28 19:31:16.821446 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.821432 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a5b296b-494e-46cf-8143-4b52f1b06ad2-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "6a5b296b-494e-46cf-8143-4b52f1b06ad2" (UID: "6a5b296b-494e-46cf-8143-4b52f1b06ad2"). InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:31:16.821536 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.821511 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a5b296b-494e-46cf-8143-4b52f1b06ad2-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:31:16.821705 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.821677 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a5b296b-494e-46cf-8143-4b52f1b06ad2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6a5b296b-494e-46cf-8143-4b52f1b06ad2" (UID: "6a5b296b-494e-46cf-8143-4b52f1b06ad2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:31:16.823302 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.823282 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a5b296b-494e-46cf-8143-4b52f1b06ad2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6a5b296b-494e-46cf-8143-4b52f1b06ad2" (UID: "6a5b296b-494e-46cf-8143-4b52f1b06ad2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:31:16.823362 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.823283 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a5b296b-494e-46cf-8143-4b52f1b06ad2-kube-api-access-qvlcr" (OuterVolumeSpecName: "kube-api-access-qvlcr") pod "6a5b296b-494e-46cf-8143-4b52f1b06ad2" (UID: "6a5b296b-494e-46cf-8143-4b52f1b06ad2"). InnerVolumeSpecName "kube-api-access-qvlcr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:31:16.922212 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.922163 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a5b296b-494e-46cf-8143-4b52f1b06ad2-proxy-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:31:16.922212 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.922204 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qvlcr\" (UniqueName: \"kubernetes.io/projected/6a5b296b-494e-46cf-8143-4b52f1b06ad2-kube-api-access-qvlcr\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:31:16.922212 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:16.922214 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a5b296b-494e-46cf-8143-4b52f1b06ad2-kserve-provision-location\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:31:17.629748 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.629714 2570 generic.go:358] "Generic (PLEG): container finished" podID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerID="d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133" exitCode=0 Apr 28 19:31:17.630158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.629770 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" event={"ID":"6a5b296b-494e-46cf-8143-4b52f1b06ad2","Type":"ContainerDied","Data":"d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133"} Apr 28 19:31:17.630158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.629790 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" Apr 28 19:31:17.630158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.629805 2570 scope.go:117] "RemoveContainer" containerID="2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce" Apr 28 19:31:17.630158 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.629795 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl" event={"ID":"6a5b296b-494e-46cf-8143-4b52f1b06ad2","Type":"ContainerDied","Data":"4ae5978e5df2803e65e878a07aedfba7b423d4daa55c11e90c019bdb5da8835b"} Apr 28 19:31:17.638380 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.638360 2570 scope.go:117] "RemoveContainer" containerID="d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133" Apr 28 19:31:17.645501 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.645481 2570 scope.go:117] "RemoveContainer" containerID="20bcf93d3e97893491fabbd9858fe840a3986f45e7a31cd75aa1e3464221b1cd" Apr 28 19:31:17.651581 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.651556 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"] Apr 28 19:31:17.653429 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.653407 2570 scope.go:117] "RemoveContainer" containerID="2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce" Apr 28 19:31:17.653759 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:31:17.653738 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce\": container with ID starting with 2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce not found: ID does not exist" containerID="2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce" Apr 28 19:31:17.653838 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.653768 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce"} err="failed to get container status \"2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce\": rpc error: code = NotFound desc = could not find container \"2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce\": container with ID starting with 2f970832bf83cd595446054f59594c584f14c8268707eb0ba0b8088d106959ce not found: ID does not exist" Apr 28 19:31:17.653838 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.653786 2570 scope.go:117] "RemoveContainer" containerID="d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133" Apr 28 19:31:17.654068 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:31:17.654047 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133\": container with ID starting with d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133 not found: ID does not exist" containerID="d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133" Apr 28 19:31:17.654148 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.654090 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133"} err="failed to get container status 
\"d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133\": rpc error: code = NotFound desc = could not find container \"d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133\": container with ID starting with d4b4080ab6d8617bb9b76cc1ecf8f3fb3af61ec3664a3adfb2378b09c5749133 not found: ID does not exist" Apr 28 19:31:17.654148 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.654115 2570 scope.go:117] "RemoveContainer" containerID="20bcf93d3e97893491fabbd9858fe840a3986f45e7a31cd75aa1e3464221b1cd" Apr 28 19:31:17.654402 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:31:17.654355 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20bcf93d3e97893491fabbd9858fe840a3986f45e7a31cd75aa1e3464221b1cd\": container with ID starting with 20bcf93d3e97893491fabbd9858fe840a3986f45e7a31cd75aa1e3464221b1cd not found: ID does not exist" containerID="20bcf93d3e97893491fabbd9858fe840a3986f45e7a31cd75aa1e3464221b1cd" Apr 28 19:31:17.654478 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.654412 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20bcf93d3e97893491fabbd9858fe840a3986f45e7a31cd75aa1e3464221b1cd"} err="failed to get container status \"20bcf93d3e97893491fabbd9858fe840a3986f45e7a31cd75aa1e3464221b1cd\": rpc error: code = NotFound desc = could not find container \"20bcf93d3e97893491fabbd9858fe840a3986f45e7a31cd75aa1e3464221b1cd\": container with ID starting with 20bcf93d3e97893491fabbd9858fe840a3986f45e7a31cd75aa1e3464221b1cd not found: ID does not exist" Apr 28 19:31:17.655213 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.655194 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-67687994d4-tgjwl"] Apr 28 19:31:17.887309 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.887234 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" path="/var/lib/kubelet/pods/273c4306-c7c4-4a66-98bd-d68e8649ae68/volumes" Apr 28 19:31:17.888012 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:17.887990 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" path="/var/lib/kubelet/pods/6a5b296b-494e-46cf-8143-4b52f1b06ad2/volumes" Apr 28 19:31:20.475290 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:20.475255 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" Apr 28 19:31:20.623507 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:20.623477 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:31:20.623989 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:20.623960 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 28 19:31:30.624531 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:30.624487 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: 
connection refused" Apr 28 19:31:40.624735 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:40.624693 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 28 19:31:50.624411 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:31:50.624371 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 28 19:32:00.624600 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:32:00.624565 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:36:13.878214 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:36:13.878178 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:36:13.879979 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:36:13.879958 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:39:47.399042 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.399011 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"] Apr 28 19:39:47.401531 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.399272 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kserve-container" containerID="cri-o://9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8" gracePeriod=30 Apr 28 19:39:47.401531 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.399317 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kube-rbac-proxy" containerID="cri-o://39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a" gracePeriod=30 Apr 28 19:39:47.472678 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.472647 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt"] Apr 28 19:39:47.473011 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473000 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kserve-container" Apr 28 19:39:47.473064 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473013 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kserve-container" Apr 28 19:39:47.473064 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473022 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="storage-initializer" Apr 28 19:39:47.473064 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473028 2570 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="storage-initializer" Apr 28 19:39:47.473064 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473039 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kube-rbac-proxy" Apr 28 19:39:47.473064 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473045 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kube-rbac-proxy" Apr 28 19:39:47.473064 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473058 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kube-rbac-proxy" Apr 28 19:39:47.473064 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473065 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kube-rbac-proxy" Apr 28 19:39:47.473272 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473079 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="storage-initializer" Apr 28 19:39:47.473272 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473084 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="storage-initializer" Apr 28 19:39:47.473272 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473090 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" Apr 28 19:39:47.473272 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473095 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" Apr 28 19:39:47.473272 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473142 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kserve-container" Apr 28 19:39:47.473272 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473151 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="273c4306-c7c4-4a66-98bd-d68e8649ae68" containerName="kube-rbac-proxy" Apr 28 19:39:47.473272 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473160 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kserve-container" Apr 28 19:39:47.473272 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.473166 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a5b296b-494e-46cf-8143-4b52f1b06ad2" containerName="kube-rbac-proxy" Apr 28 19:39:47.476293 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.476275 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:47.478645 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.478623 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-2a210-kube-rbac-proxy-sar-config\"" Apr 28 19:39:47.478777 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.478750 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-2a210-predictor-serving-cert\"" Apr 28 19:39:47.486484 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.486455 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt"] Apr 28 19:39:47.545944 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.545911 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88dd0647-a02e-49f5-8d04-2a2022ae3095-proxy-tls\") pod \"error-404-isvc-2a210-predictor-7bb85759bc-hqfwt\" (UID: \"88dd0647-a02e-49f5-8d04-2a2022ae3095\") " pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:47.546099 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.545960 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwgmr\" (UniqueName: \"kubernetes.io/projected/88dd0647-a02e-49f5-8d04-2a2022ae3095-kube-api-access-vwgmr\") pod \"error-404-isvc-2a210-predictor-7bb85759bc-hqfwt\" (UID: \"88dd0647-a02e-49f5-8d04-2a2022ae3095\") " pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:47.546099 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.545989 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-2a210-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/88dd0647-a02e-49f5-8d04-2a2022ae3095-error-404-isvc-2a210-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2a210-predictor-7bb85759bc-hqfwt\" (UID: \"88dd0647-a02e-49f5-8d04-2a2022ae3095\") " pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:47.646392 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.646355 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwgmr\" (UniqueName: \"kubernetes.io/projected/88dd0647-a02e-49f5-8d04-2a2022ae3095-kube-api-access-vwgmr\") pod \"error-404-isvc-2a210-predictor-7bb85759bc-hqfwt\" (UID: \"88dd0647-a02e-49f5-8d04-2a2022ae3095\") " pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:47.646392 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.646396 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-2a210-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/88dd0647-a02e-49f5-8d04-2a2022ae3095-error-404-isvc-2a210-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2a210-predictor-7bb85759bc-hqfwt\" (UID: \"88dd0647-a02e-49f5-8d04-2a2022ae3095\") " pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:47.646597 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.646469 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88dd0647-a02e-49f5-8d04-2a2022ae3095-proxy-tls\") pod 
\"error-404-isvc-2a210-predictor-7bb85759bc-hqfwt\" (UID: \"88dd0647-a02e-49f5-8d04-2a2022ae3095\") " pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:47.647177 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.647156 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-2a210-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/88dd0647-a02e-49f5-8d04-2a2022ae3095-error-404-isvc-2a210-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2a210-predictor-7bb85759bc-hqfwt\" (UID: \"88dd0647-a02e-49f5-8d04-2a2022ae3095\") " pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:47.648848 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.648829 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88dd0647-a02e-49f5-8d04-2a2022ae3095-proxy-tls\") pod \"error-404-isvc-2a210-predictor-7bb85759bc-hqfwt\" (UID: \"88dd0647-a02e-49f5-8d04-2a2022ae3095\") " pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:47.654437 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.654378 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwgmr\" (UniqueName: \"kubernetes.io/projected/88dd0647-a02e-49f5-8d04-2a2022ae3095-kube-api-access-vwgmr\") pod \"error-404-isvc-2a210-predictor-7bb85759bc-hqfwt\" (UID: \"88dd0647-a02e-49f5-8d04-2a2022ae3095\") " pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:47.787769 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.787734 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:47.909553 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.909530 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt"] Apr 28 19:39:47.911420 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:39:47.911395 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88dd0647_a02e_49f5_8d04_2a2022ae3095.slice/crio-8ea3aa85cd70e9b1634ae71334d48aba110aa348f296a5d577c7895a6fde1855 WatchSource:0}: Error finding container 8ea3aa85cd70e9b1634ae71334d48aba110aa348f296a5d577c7895a6fde1855: Status 404 returned error can't find the container with id 8ea3aa85cd70e9b1634ae71334d48aba110aa348f296a5d577c7895a6fde1855 Apr 28 19:39:47.913027 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:47.913010 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:39:48.349656 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:48.349584 2570 generic.go:358] "Generic (PLEG): container finished" podID="8a144135-6c0c-4cde-9faa-c8676d399381" containerID="39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a" exitCode=2 Apr 28 19:39:48.349656 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:48.349648 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" event={"ID":"8a144135-6c0c-4cde-9faa-c8676d399381","Type":"ContainerDied","Data":"39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a"} Apr 28 19:39:48.353968 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:48.353927 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" event={"ID":"88dd0647-a02e-49f5-8d04-2a2022ae3095","Type":"ContainerStarted","Data":"c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06"} Apr 28 19:39:48.353968 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:48.353968 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" event={"ID":"88dd0647-a02e-49f5-8d04-2a2022ae3095","Type":"ContainerStarted","Data":"653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2"} Apr 28 19:39:48.354180 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:48.353983 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" event={"ID":"88dd0647-a02e-49f5-8d04-2a2022ae3095","Type":"ContainerStarted","Data":"8ea3aa85cd70e9b1634ae71334d48aba110aa348f296a5d577c7895a6fde1855"} Apr 28 19:39:48.354180 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:48.354108 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:48.370642 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:48.370579 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" podStartSLOduration=1.3705642359999999 podStartE2EDuration="1.370564236s" podCreationTimestamp="2026-04-28 19:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:39:48.369983602 +0000 UTC m=+1415.121533156" watchObservedRunningTime="2026-04-28 19:39:48.370564236 +0000 UTC m=+1415.122113752" Apr 28 19:39:49.357955 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:49.357924 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:49.359235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:49.359205 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 28 19:39:50.360969 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.360929 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 28 19:39:50.470120 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.470078 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.31:8643/healthz\": dial tcp 10.132.0.31:8643: connect: connection refused" Apr 28 19:39:50.474551 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.474523 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: 
connection refused" Apr 28 19:39:50.640187 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.640161 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" Apr 28 19:39:50.773634 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.773566 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a144135-6c0c-4cde-9faa-c8676d399381-proxy-tls\") pod \"8a144135-6c0c-4cde-9faa-c8676d399381\" (UID: \"8a144135-6c0c-4cde-9faa-c8676d399381\") " Apr 28 19:39:50.773804 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.773663 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffknb\" (UniqueName: \"kubernetes.io/projected/8a144135-6c0c-4cde-9faa-c8676d399381-kube-api-access-ffknb\") pod \"8a144135-6c0c-4cde-9faa-c8676d399381\" (UID: \"8a144135-6c0c-4cde-9faa-c8676d399381\") " Apr 28 19:39:50.773804 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.773724 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-0600b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8a144135-6c0c-4cde-9faa-c8676d399381-error-404-isvc-0600b-kube-rbac-proxy-sar-config\") pod \"8a144135-6c0c-4cde-9faa-c8676d399381\" (UID: \"8a144135-6c0c-4cde-9faa-c8676d399381\") " Apr 28 19:39:50.774118 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.774087 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a144135-6c0c-4cde-9faa-c8676d399381-error-404-isvc-0600b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-0600b-kube-rbac-proxy-sar-config") pod "8a144135-6c0c-4cde-9faa-c8676d399381" (UID: "8a144135-6c0c-4cde-9faa-c8676d399381"). InnerVolumeSpecName "error-404-isvc-0600b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:39:50.775741 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.775713 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a144135-6c0c-4cde-9faa-c8676d399381-kube-api-access-ffknb" (OuterVolumeSpecName: "kube-api-access-ffknb") pod "8a144135-6c0c-4cde-9faa-c8676d399381" (UID: "8a144135-6c0c-4cde-9faa-c8676d399381"). InnerVolumeSpecName "kube-api-access-ffknb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:39:50.775823 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.775761 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a144135-6c0c-4cde-9faa-c8676d399381-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8a144135-6c0c-4cde-9faa-c8676d399381" (UID: "8a144135-6c0c-4cde-9faa-c8676d399381"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:39:50.874714 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.874622 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ffknb\" (UniqueName: \"kubernetes.io/projected/8a144135-6c0c-4cde-9faa-c8676d399381-kube-api-access-ffknb\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:39:50.874714 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.874659 2570 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-0600b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8a144135-6c0c-4cde-9faa-c8676d399381-error-404-isvc-0600b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:39:50.874714 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:50.874670 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a144135-6c0c-4cde-9faa-c8676d399381-proxy-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:39:51.365738 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.365700 2570 generic.go:358] "Generic (PLEG): container finished" podID="8a144135-6c0c-4cde-9faa-c8676d399381" containerID="9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8" exitCode=0 Apr 28 19:39:51.366166 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.365748 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" event={"ID":"8a144135-6c0c-4cde-9faa-c8676d399381","Type":"ContainerDied","Data":"9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8"} Apr 28 19:39:51.366166 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.365771 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" Apr 28 19:39:51.366166 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.365785 2570 scope.go:117] "RemoveContainer" containerID="39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a" Apr 28 19:39:51.366166 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.365774 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24" event={"ID":"8a144135-6c0c-4cde-9faa-c8676d399381","Type":"ContainerDied","Data":"19f27febe927e96f71313cb45c7d35385e6a3c7aa43f28257e4ae82e558344ac"} Apr 28 19:39:51.374830 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.374809 2570 scope.go:117] "RemoveContainer" containerID="9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8" Apr 28 19:39:51.381659 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.381639 2570 scope.go:117] "RemoveContainer" containerID="39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a" Apr 28 19:39:51.381904 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:39:51.381886 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a\": container with ID starting with 39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a not found: ID does not exist" containerID="39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a" Apr 28 19:39:51.381946 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.381914 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a"} err="failed to get container status \"39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a\": rpc error: code = NotFound desc = could not find container \"39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a\": container with ID starting with 39e8dd7ba5e0e7a342494ff5f6cd58bf99bac0040fdf909f1d2b84700a75f58a not found: ID does not exist" Apr 28 19:39:51.381946 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.381937 2570 scope.go:117] "RemoveContainer" containerID="9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8" Apr 28 19:39:51.382155 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:39:51.382140 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8\": container with ID starting with 9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8 not found: ID does not exist" containerID="9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8" Apr 28 19:39:51.382198 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.382160 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8"} err="failed to get container status \"9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8\": rpc error: code = NotFound desc = could not find container \"9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8\": container with ID starting with 9d9246cdb3e4b883e07802323a461940968d1333eb785578b29814a943a96ba8 not found: ID does not exist" Apr 28 19:39:51.387619 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.387588 2570 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"] Apr 28 19:39:51.390735 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.390715 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0600b-predictor-cf96ff6c-ccj24"] Apr 28 19:39:51.887323 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:51.887286 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" path="/var/lib/kubelet/pods/8a144135-6c0c-4cde-9faa-c8676d399381/volumes" Apr 28 19:39:55.365535 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:55.365508 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:39:55.366030 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:39:55.365957 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 28 19:40:05.366889 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:05.366806 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 28 19:40:15.366586 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:15.366541 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 28 19:40:25.366925 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:25.366884 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 28 19:40:27.301189 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.301150 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7"] Apr 28 19:40:27.301783 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.301680 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kserve-container" containerID="cri-o://34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5" gracePeriod=30 Apr 28 19:40:27.301854 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.301778 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kube-rbac-proxy" containerID="cri-o://7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8" gracePeriod=30 Apr 28 19:40:27.358961 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.358920 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs"] Apr 28 19:40:27.359293 
ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.359281 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kube-rbac-proxy" Apr 28 19:40:27.359338 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.359295 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kube-rbac-proxy" Apr 28 19:40:27.359338 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.359307 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kserve-container" Apr 28 19:40:27.359338 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.359312 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kserve-container" Apr 28 19:40:27.359433 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.359370 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kserve-container" Apr 28 19:40:27.359433 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.359378 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a144135-6c0c-4cde-9faa-c8676d399381" containerName="kube-rbac-proxy" Apr 28 19:40:27.363840 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.363814 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:27.366517 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.366489 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-26ebe-kube-rbac-proxy-sar-config\"" Apr 28 19:40:27.369765 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.369743 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-26ebe-predictor-serving-cert\"" Apr 28 19:40:27.376450 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.376420 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs"] Apr 28 19:40:27.378169 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.378145 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c09e77c-dde3-4787-8002-d63f394b155c-proxy-tls\") pod \"error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs\" (UID: \"4c09e77c-dde3-4787-8002-d63f394b155c\") " pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:27.378300 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.378230 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-26ebe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c09e77c-dde3-4787-8002-d63f394b155c-error-404-isvc-26ebe-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs\" (UID: \"4c09e77c-dde3-4787-8002-d63f394b155c\") " pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:27.378367 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.378303 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7szg6\" (UniqueName: \"kubernetes.io/projected/4c09e77c-dde3-4787-8002-d63f394b155c-kube-api-access-7szg6\") pod 
\"error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs\" (UID: \"4c09e77c-dde3-4787-8002-d63f394b155c\") " pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:27.479062 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.479025 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c09e77c-dde3-4787-8002-d63f394b155c-proxy-tls\") pod \"error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs\" (UID: \"4c09e77c-dde3-4787-8002-d63f394b155c\") " pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:27.479243 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.479107 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-26ebe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c09e77c-dde3-4787-8002-d63f394b155c-error-404-isvc-26ebe-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs\" (UID: \"4c09e77c-dde3-4787-8002-d63f394b155c\") " pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:27.479243 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.479176 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7szg6\" (UniqueName: \"kubernetes.io/projected/4c09e77c-dde3-4787-8002-d63f394b155c-kube-api-access-7szg6\") pod \"error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs\" (UID: \"4c09e77c-dde3-4787-8002-d63f394b155c\") " pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:27.486491 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.485286 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-26ebe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c09e77c-dde3-4787-8002-d63f394b155c-error-404-isvc-26ebe-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs\" (UID: \"4c09e77c-dde3-4787-8002-d63f394b155c\") " pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:27.487654 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.487576 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c09e77c-dde3-4787-8002-d63f394b155c-proxy-tls\") pod \"error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs\" (UID: \"4c09e77c-dde3-4787-8002-d63f394b155c\") " pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:27.490120 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.490079 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7szg6\" (UniqueName: \"kubernetes.io/projected/4c09e77c-dde3-4787-8002-d63f394b155c-kube-api-access-7szg6\") pod \"error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs\" (UID: \"4c09e77c-dde3-4787-8002-d63f394b155c\") " pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:27.492453 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.492421 2570 generic.go:358] "Generic (PLEG): container finished" podID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerID="7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8" exitCode=2 Apr 28 19:40:27.492565 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.492463 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" 
event={"ID":"ad932821-ff2d-48e0-bc3e-f385f539f10c","Type":"ContainerDied","Data":"7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8"} Apr 28 19:40:27.674828 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.674743 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:27.805236 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:27.805175 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs"] Apr 28 19:40:27.807620 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:40:27.807576 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c09e77c_dde3_4787_8002_d63f394b155c.slice/crio-86c0bc5e679df0683abbf3f7edaa74bd6ca5a22a4218ab25942a280950049ead WatchSource:0}: Error finding container 86c0bc5e679df0683abbf3f7edaa74bd6ca5a22a4218ab25942a280950049ead: Status 404 returned error can't find the container with id 86c0bc5e679df0683abbf3f7edaa74bd6ca5a22a4218ab25942a280950049ead Apr 28 19:40:28.499074 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:28.499031 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" event={"ID":"4c09e77c-dde3-4787-8002-d63f394b155c","Type":"ContainerStarted","Data":"7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0"} Apr 28 19:40:28.499553 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:28.499082 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" event={"ID":"4c09e77c-dde3-4787-8002-d63f394b155c","Type":"ContainerStarted","Data":"aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622"} Apr 28 19:40:28.499553 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:28.499098 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" event={"ID":"4c09e77c-dde3-4787-8002-d63f394b155c","Type":"ContainerStarted","Data":"86c0bc5e679df0683abbf3f7edaa74bd6ca5a22a4218ab25942a280950049ead"} Apr 28 19:40:28.499553 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:28.499173 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:28.517946 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:28.517894 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" podStartSLOduration=1.51787833 podStartE2EDuration="1.51787833s" podCreationTimestamp="2026-04-28 19:40:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:40:28.516515633 +0000 UTC m=+1455.268065173" watchObservedRunningTime="2026-04-28 19:40:28.51787833 +0000 UTC m=+1455.269427844" Apr 28 19:40:29.504033 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:29.503998 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:29.505428 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:29.505393 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" 
podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 28 19:40:30.509347 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:30.509308 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 28 19:40:30.660427 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:30.660399 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:40:30.704600 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:30.704567 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-6c7b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad932821-ff2d-48e0-bc3e-f385f539f10c-error-404-isvc-6c7b3-kube-rbac-proxy-sar-config\") pod \"ad932821-ff2d-48e0-bc3e-f385f539f10c\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " Apr 28 19:40:30.704767 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:30.704692 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld685\" (UniqueName: \"kubernetes.io/projected/ad932821-ff2d-48e0-bc3e-f385f539f10c-kube-api-access-ld685\") pod \"ad932821-ff2d-48e0-bc3e-f385f539f10c\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " Apr 28 19:40:30.704767 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:30.704715 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad932821-ff2d-48e0-bc3e-f385f539f10c-proxy-tls\") pod \"ad932821-ff2d-48e0-bc3e-f385f539f10c\" (UID: \"ad932821-ff2d-48e0-bc3e-f385f539f10c\") " Apr 28 19:40:30.705040 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:30.705007 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad932821-ff2d-48e0-bc3e-f385f539f10c-error-404-isvc-6c7b3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-6c7b3-kube-rbac-proxy-sar-config") pod "ad932821-ff2d-48e0-bc3e-f385f539f10c" (UID: "ad932821-ff2d-48e0-bc3e-f385f539f10c"). InnerVolumeSpecName "error-404-isvc-6c7b3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:40:30.706853 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:30.706822 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad932821-ff2d-48e0-bc3e-f385f539f10c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ad932821-ff2d-48e0-bc3e-f385f539f10c" (UID: "ad932821-ff2d-48e0-bc3e-f385f539f10c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:40:30.706853 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:30.706837 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad932821-ff2d-48e0-bc3e-f385f539f10c-kube-api-access-ld685" (OuterVolumeSpecName: "kube-api-access-ld685") pod "ad932821-ff2d-48e0-bc3e-f385f539f10c" (UID: "ad932821-ff2d-48e0-bc3e-f385f539f10c"). InnerVolumeSpecName "kube-api-access-ld685". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:40:30.805825 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:30.805792 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ld685\" (UniqueName: \"kubernetes.io/projected/ad932821-ff2d-48e0-bc3e-f385f539f10c-kube-api-access-ld685\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:40:30.805825 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:30.805824 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad932821-ff2d-48e0-bc3e-f385f539f10c-proxy-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:40:30.806028 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:30.805837 2570 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-6c7b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad932821-ff2d-48e0-bc3e-f385f539f10c-error-404-isvc-6c7b3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:40:31.513933 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.513893 2570 generic.go:358] "Generic (PLEG): container finished" podID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerID="34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5" exitCode=0 Apr 28 19:40:31.514333 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.513948 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" event={"ID":"ad932821-ff2d-48e0-bc3e-f385f539f10c","Type":"ContainerDied","Data":"34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5"} Apr 28 19:40:31.514333 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.513967 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" Apr 28 19:40:31.514333 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.513984 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" event={"ID":"ad932821-ff2d-48e0-bc3e-f385f539f10c","Type":"ContainerDied","Data":"dc8b108e5e4e0b0e6ed23672acbbfab67f235e8f0c4be07d58e87375c1671a8c"} Apr 28 19:40:31.514333 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.514000 2570 scope.go:117] "RemoveContainer" containerID="7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8" Apr 28 19:40:31.522953 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.522931 2570 scope.go:117] "RemoveContainer" containerID="34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5" Apr 28 19:40:31.530191 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.530173 2570 scope.go:117] "RemoveContainer" containerID="7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8" Apr 28 19:40:31.530429 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:40:31.530412 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8\": container with ID starting with 7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8 not found: ID does not exist" containerID="7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8" Apr 28 19:40:31.530468 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.530438 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8"} err="failed to get container status \"7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8\": rpc error: code = NotFound desc = could not find container \"7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8\": container with ID starting with 7ae526714e52efb8ec0a87caaa34f82f84218f59653783e48f9bc8cfd52a7ef8 not found: ID does not exist" Apr 28 19:40:31.530468 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.530457 2570 scope.go:117] "RemoveContainer" containerID="34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5" Apr 28 19:40:31.530679 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:40:31.530665 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5\": container with ID starting with 34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5 not found: ID does not exist" containerID="34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5" Apr 28 19:40:31.530715 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.530684 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5"} err="failed to get container status \"34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5\": rpc error: code = NotFound desc = could not find container \"34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5\": container with ID starting with 34895054bfa426e39bc570015153177095e5d1465582b62b24017ff45aff02f5 not found: ID does not exist" Apr 28 19:40:31.535204 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.535181 2570 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7"] Apr 28 19:40:31.543261 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.543241 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7"] Apr 28 19:40:31.620004 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.619954 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": context deadline exceeded" Apr 28 19:40:31.624555 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.624533 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6c7b3-predictor-5b9464bf9c-l89s7" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: i/o timeout" Apr 28 19:40:31.887673 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:31.887567 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" path="/var/lib/kubelet/pods/ad932821-ff2d-48e0-bc3e-f385f539f10c/volumes" Apr 28 19:40:35.366763 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:35.366734 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:40:35.514198 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:35.514167 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:40:35.514601 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:35.514574 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 28 19:40:45.515007 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:45.514967 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 28 19:40:55.515190 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:55.515150 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 28 19:40:57.701838 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.701801 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt"] Apr 28 19:40:57.702248 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.702054 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kserve-container" containerID="cri-o://653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2" gracePeriod=30 Apr 28 19:40:57.702248 
ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.702115 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kube-rbac-proxy" containerID="cri-o://c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06" gracePeriod=30 Apr 28 19:40:57.767229 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.767196 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd"] Apr 28 19:40:57.767565 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.767552 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kserve-container" Apr 28 19:40:57.767638 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.767567 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kserve-container" Apr 28 19:40:57.767638 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.767580 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kube-rbac-proxy" Apr 28 19:40:57.767638 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.767586 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kube-rbac-proxy" Apr 28 19:40:57.767734 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.767662 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kube-rbac-proxy" Apr 28 19:40:57.767734 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.767674 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad932821-ff2d-48e0-bc3e-f385f539f10c" containerName="kserve-container" Apr 28 19:40:57.770747 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.770729 2570 util.go:30] "No sandbox for pod can be found. 
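The kuberuntime_container.go:864 entries just above ("Killing container with a grace period ... gracePeriod=30") record the standard termination sequence: the runtime delivers SIGTERM, waits up to the pod's terminationGracePeriodSeconds for a clean exit, and only then force-kills. An illustrative Go sketch of that pattern (not kubelet code; the child process and the 30s value are just examples):

package main

import (
	"os/exec"
	"syscall"
	"time"
)

// stopWithGrace sends SIGTERM, waits up to the grace period for a
// clean exit, then falls back to SIGKILL -- the behavior behind
// "Killing container with a grace period" with gracePeriod=30.
func stopWithGrace(cmd *exec.Cmd, grace time.Duration) error {
	_ = cmd.Process.Signal(syscall.SIGTERM) // polite request to shut down

	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	select {
	case err := <-done:
		return err // exited on its own within the grace period
	case <-time.After(grace):
		return cmd.Process.Kill() // grace period expired: SIGKILL
	}
}

func main() {
	cmd := exec.Command("sleep", "60")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	_ = stopWithGrace(cmd, 30*time.Second)
}

The exit codes seen in the PLEG events fit this: kube-rbac-proxy dies on the signal (exitCode=2), while the kserve-container shuts down cleanly (exitCode=0) before the 30s budget runs out.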
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:57.773213 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.773192 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-0ca59-predictor-serving-cert\"" Apr 28 19:40:57.773537 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.773519 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-0ca59-kube-rbac-proxy-sar-config\"" Apr 28 19:40:57.779273 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.779242 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd"] Apr 28 19:40:57.834332 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.834295 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-0ca59-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-error-404-isvc-0ca59-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-0ca59-predictor-546755f487-skcxd\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:57.834477 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.834422 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-proxy-tls\") pod \"error-404-isvc-0ca59-predictor-546755f487-skcxd\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:57.834477 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.834455 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7nlv\" (UniqueName: \"kubernetes.io/projected/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-kube-api-access-t7nlv\") pod \"error-404-isvc-0ca59-predictor-546755f487-skcxd\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:57.935598 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.935560 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-0ca59-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-error-404-isvc-0ca59-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-0ca59-predictor-546755f487-skcxd\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:57.935807 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.935697 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-proxy-tls\") pod \"error-404-isvc-0ca59-predictor-546755f487-skcxd\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:57.935807 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.935726 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7nlv\" (UniqueName: \"kubernetes.io/projected/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-kube-api-access-t7nlv\") pod 
\"error-404-isvc-0ca59-predictor-546755f487-skcxd\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:57.935930 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:40:57.935812 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-serving-cert: secret "error-404-isvc-0ca59-predictor-serving-cert" not found Apr 28 19:40:57.935930 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:40:57.935891 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-proxy-tls podName:f95b93a4-6369-4e6c-a06a-60d166c6ce2a nodeName:}" failed. No retries permitted until 2026-04-28 19:40:58.435869374 +0000 UTC m=+1485.187418884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-proxy-tls") pod "error-404-isvc-0ca59-predictor-546755f487-skcxd" (UID: "f95b93a4-6369-4e6c-a06a-60d166c6ce2a") : secret "error-404-isvc-0ca59-predictor-serving-cert" not found Apr 28 19:40:57.936302 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.936281 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-0ca59-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-error-404-isvc-0ca59-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-0ca59-predictor-546755f487-skcxd\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:57.946751 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:57.946731 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7nlv\" (UniqueName: \"kubernetes.io/projected/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-kube-api-access-t7nlv\") pod \"error-404-isvc-0ca59-predictor-546755f487-skcxd\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:58.440570 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:58.440530 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-proxy-tls\") pod \"error-404-isvc-0ca59-predictor-546755f487-skcxd\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:58.440882 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:40:58.440731 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-serving-cert: secret "error-404-isvc-0ca59-predictor-serving-cert" not found Apr 28 19:40:58.440882 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:40:58.440830 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-proxy-tls podName:f95b93a4-6369-4e6c-a06a-60d166c6ce2a nodeName:}" failed. No retries permitted until 2026-04-28 19:40:59.440806896 +0000 UTC m=+1486.192356403 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-proxy-tls") pod "error-404-isvc-0ca59-predictor-546755f487-skcxd" (UID: "f95b93a4-6369-4e6c-a06a-60d166c6ce2a") : secret "error-404-isvc-0ca59-predictor-serving-cert" not found Apr 28 19:40:58.607285 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:58.607254 2570 generic.go:358] "Generic (PLEG): container finished" podID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerID="c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06" exitCode=2 Apr 28 19:40:58.607445 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:58.607317 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" event={"ID":"88dd0647-a02e-49f5-8d04-2a2022ae3095","Type":"ContainerDied","Data":"c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06"} Apr 28 19:40:59.451121 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:59.451084 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-proxy-tls\") pod \"error-404-isvc-0ca59-predictor-546755f487-skcxd\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:59.453532 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:59.453505 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-proxy-tls\") pod \"error-404-isvc-0ca59-predictor-546755f487-skcxd\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:59.582503 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:59.582463 2570 util.go:30] "No sandbox for pod can be found. 
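The proxy-tls failures above resolve themselves: the pod was scheduled before its serving-cert secret existed, so MountVolume.SetUp fails and nestedpendingoperations backs off with a doubling durationBeforeRetry (500ms, then 1s) until the secret appears and the mount succeeds at 19:40:59.453505. A minimal Go sketch of that doubling retry (illustrative only; setUpVolume is a hypothetical stand-in for the mount step, and the success condition and cap are arbitrary):

package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New(`secret "error-404-isvc-0ca59-predictor-serving-cert" not found`)

// setUpVolume stands in for MountVolume.SetUp: it fails until the
// secret's controller has caught up (here, after two attempts).
func setUpVolume(attempt int) error {
	if attempt < 2 {
		return errNotFound
	}
	return nil
}

func main() {
	backoff := 500 * time.Millisecond // matches the first durationBeforeRetry above
	for attempt := 0; ; attempt++ {
		if err := setUpVolume(attempt); err == nil {
			fmt.Println("MountVolume.SetUp succeeded")
			return
		}
		fmt.Printf("failed; no retries permitted for %v (durationBeforeRetry %v)\n", backoff, backoff)
		time.Sleep(backoff)
		backoff *= 2 // 500ms -> 1s -> 2s ..., as in the log; the real kubelet caps this growth
	}
}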
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:40:59.706759 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:40:59.706733 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd"] Apr 28 19:40:59.708364 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:40:59.708337 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf95b93a4_6369_4e6c_a06a_60d166c6ce2a.slice/crio-191867f6898fe26f550e4eeb009001e8aa194607b53da055a5279e216ed34b93 WatchSource:0}: Error finding container 191867f6898fe26f550e4eeb009001e8aa194607b53da055a5279e216ed34b93: Status 404 returned error can't find the container with id 191867f6898fe26f550e4eeb009001e8aa194607b53da055a5279e216ed34b93 Apr 28 19:41:00.361791 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:00.361741 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused" Apr 28 19:41:00.616436 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:00.616352 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" event={"ID":"f95b93a4-6369-4e6c-a06a-60d166c6ce2a","Type":"ContainerStarted","Data":"369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871"} Apr 28 19:41:00.616436 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:00.616392 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" event={"ID":"f95b93a4-6369-4e6c-a06a-60d166c6ce2a","Type":"ContainerStarted","Data":"57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac"} Apr 28 19:41:00.616436 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:00.616409 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" event={"ID":"f95b93a4-6369-4e6c-a06a-60d166c6ce2a","Type":"ContainerStarted","Data":"191867f6898fe26f550e4eeb009001e8aa194607b53da055a5279e216ed34b93"} Apr 28 19:41:00.616876 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:00.616499 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:41:00.634240 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:00.634183 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" podStartSLOduration=3.634167141 podStartE2EDuration="3.634167141s" podCreationTimestamp="2026-04-28 19:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:41:00.633544062 +0000 UTC m=+1487.385093612" watchObservedRunningTime="2026-04-28 19:41:00.634167141 +0000 UTC m=+1487.385716670" Apr 28 19:41:00.950701 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:00.950678 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:41:01.065084 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.065046 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-2a210-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/88dd0647-a02e-49f5-8d04-2a2022ae3095-error-404-isvc-2a210-kube-rbac-proxy-sar-config\") pod \"88dd0647-a02e-49f5-8d04-2a2022ae3095\" (UID: \"88dd0647-a02e-49f5-8d04-2a2022ae3095\") " Apr 28 19:41:01.065258 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.065107 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwgmr\" (UniqueName: \"kubernetes.io/projected/88dd0647-a02e-49f5-8d04-2a2022ae3095-kube-api-access-vwgmr\") pod \"88dd0647-a02e-49f5-8d04-2a2022ae3095\" (UID: \"88dd0647-a02e-49f5-8d04-2a2022ae3095\") " Apr 28 19:41:01.065258 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.065139 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88dd0647-a02e-49f5-8d04-2a2022ae3095-proxy-tls\") pod \"88dd0647-a02e-49f5-8d04-2a2022ae3095\" (UID: \"88dd0647-a02e-49f5-8d04-2a2022ae3095\") " Apr 28 19:41:01.065442 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.065415 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88dd0647-a02e-49f5-8d04-2a2022ae3095-error-404-isvc-2a210-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-2a210-kube-rbac-proxy-sar-config") pod "88dd0647-a02e-49f5-8d04-2a2022ae3095" (UID: "88dd0647-a02e-49f5-8d04-2a2022ae3095"). InnerVolumeSpecName "error-404-isvc-2a210-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:41:01.067246 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.067215 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88dd0647-a02e-49f5-8d04-2a2022ae3095-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "88dd0647-a02e-49f5-8d04-2a2022ae3095" (UID: "88dd0647-a02e-49f5-8d04-2a2022ae3095"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:41:01.067358 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.067336 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88dd0647-a02e-49f5-8d04-2a2022ae3095-kube-api-access-vwgmr" (OuterVolumeSpecName: "kube-api-access-vwgmr") pod "88dd0647-a02e-49f5-8d04-2a2022ae3095" (UID: "88dd0647-a02e-49f5-8d04-2a2022ae3095"). InnerVolumeSpecName "kube-api-access-vwgmr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:41:01.166156 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.166067 2570 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-2a210-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/88dd0647-a02e-49f5-8d04-2a2022ae3095-error-404-isvc-2a210-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:41:01.166156 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.166099 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwgmr\" (UniqueName: \"kubernetes.io/projected/88dd0647-a02e-49f5-8d04-2a2022ae3095-kube-api-access-vwgmr\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:41:01.166156 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.166110 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/88dd0647-a02e-49f5-8d04-2a2022ae3095-proxy-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:41:01.620836 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.620798 2570 generic.go:358] "Generic (PLEG): container finished" podID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerID="653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2" exitCode=0 Apr 28 19:41:01.621255 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.620878 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" Apr 28 19:41:01.621255 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.620877 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" event={"ID":"88dd0647-a02e-49f5-8d04-2a2022ae3095","Type":"ContainerDied","Data":"653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2"} Apr 28 19:41:01.621255 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.620918 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt" event={"ID":"88dd0647-a02e-49f5-8d04-2a2022ae3095","Type":"ContainerDied","Data":"8ea3aa85cd70e9b1634ae71334d48aba110aa348f296a5d577c7895a6fde1855"} Apr 28 19:41:01.621255 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.620936 2570 scope.go:117] "RemoveContainer" containerID="c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06" Apr 28 19:41:01.621450 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.621433 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:41:01.622979 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.622950 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 28 19:41:01.630002 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.629987 2570 scope.go:117] "RemoveContainer" containerID="653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2" Apr 28 19:41:01.639134 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.639115 2570 scope.go:117] "RemoveContainer" containerID="c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06" Apr 28 19:41:01.639556 ip-10-0-138-34 kubenswrapper[2570]: E0428 
19:41:01.639530 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06\": container with ID starting with c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06 not found: ID does not exist" containerID="c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06" Apr 28 19:41:01.639641 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.639566 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06"} err="failed to get container status \"c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06\": rpc error: code = NotFound desc = could not find container \"c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06\": container with ID starting with c34f3fdbacfcd91f8820202c340e58e36a4750baba17949ee7fb33ff47a1de06 not found: ID does not exist" Apr 28 19:41:01.639641 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.639583 2570 scope.go:117] "RemoveContainer" containerID="653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2" Apr 28 19:41:01.639876 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:41:01.639855 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2\": container with ID starting with 653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2 not found: ID does not exist" containerID="653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2" Apr 28 19:41:01.639928 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.639885 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2"} err="failed to get container status \"653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2\": rpc error: code = NotFound desc = could not find container \"653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2\": container with ID starting with 653b1060446934089aa7a37afd6cdaa4073a978e4460aabd1d05cbb2cadb43f2 not found: ID does not exist" Apr 28 19:41:01.642955 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.642928 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt"] Apr 28 19:41:01.644494 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.644468 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2a210-predictor-7bb85759bc-hqfwt"] Apr 28 19:41:01.887514 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:01.887436 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" path="/var/lib/kubelet/pods/88dd0647-a02e-49f5-8d04-2a2022ae3095/volumes" Apr 28 19:41:02.625727 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:02.625684 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 28 19:41:05.514731 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:05.514692 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 28 19:41:07.629841 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:07.629799 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:41:07.630355 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:07.630325 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 28 19:41:13.902100 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:13.902051 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:41:13.905309 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:13.905286 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:41:15.516161 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:15.516125 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:41:17.631094 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:17.631057 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 28 19:41:27.630381 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:27.630335 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 28 19:41:37.530977 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.530896 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs"] Apr 28 19:41:37.531487 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.531178 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kserve-container" containerID="cri-o://aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622" gracePeriod=30 Apr 28 19:41:37.531487 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.531215 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kube-rbac-proxy" containerID="cri-o://7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0" gracePeriod=30 Apr 28 19:41:37.602252 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.602218 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj"] Apr 28 19:41:37.602598 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.602585 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kube-rbac-proxy" Apr 28 19:41:37.602716 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.602615 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kube-rbac-proxy" Apr 28 19:41:37.602716 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.602632 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kserve-container" Apr 28 19:41:37.602716 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.602638 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kserve-container" Apr 28 19:41:37.602716 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.602694 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kserve-container" Apr 28 19:41:37.602716 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.602712 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="88dd0647-a02e-49f5-8d04-2a2022ae3095" containerName="kube-rbac-proxy" Apr 28 19:41:37.607267 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.607248 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:37.609814 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.609790 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-7809f-predictor-serving-cert\"" Apr 28 19:41:37.609918 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.609829 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-7809f-kube-rbac-proxy-sar-config\"" Apr 28 19:41:37.615902 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.615881 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj"] Apr 28 19:41:37.630374 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.630333 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 28 19:41:37.669537 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.669500 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24dd9\" (UniqueName: \"kubernetes.io/projected/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-kube-api-access-24dd9\") pod \"error-404-isvc-7809f-predictor-858cb9b479-84qdj\" (UID: \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\") " pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:37.669702 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.669546 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-proxy-tls\") pod \"error-404-isvc-7809f-predictor-858cb9b479-84qdj\" (UID: 
\"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\") " pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:37.669702 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.669592 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-7809f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-error-404-isvc-7809f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7809f-predictor-858cb9b479-84qdj\" (UID: \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\") " pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:37.743747 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.743713 2570 generic.go:358] "Generic (PLEG): container finished" podID="4c09e77c-dde3-4787-8002-d63f394b155c" containerID="7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0" exitCode=2 Apr 28 19:41:37.743889 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.743782 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" event={"ID":"4c09e77c-dde3-4787-8002-d63f394b155c","Type":"ContainerDied","Data":"7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0"} Apr 28 19:41:37.770467 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.770434 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24dd9\" (UniqueName: \"kubernetes.io/projected/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-kube-api-access-24dd9\") pod \"error-404-isvc-7809f-predictor-858cb9b479-84qdj\" (UID: \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\") " pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:37.770686 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.770479 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-proxy-tls\") pod \"error-404-isvc-7809f-predictor-858cb9b479-84qdj\" (UID: \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\") " pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:37.770686 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.770503 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-7809f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-error-404-isvc-7809f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7809f-predictor-858cb9b479-84qdj\" (UID: \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\") " pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:37.771132 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.771109 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-7809f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-error-404-isvc-7809f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7809f-predictor-858cb9b479-84qdj\" (UID: \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\") " pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:37.772909 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.772885 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-proxy-tls\") pod \"error-404-isvc-7809f-predictor-858cb9b479-84qdj\" (UID: 
\"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\") " pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:37.778837 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.778814 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24dd9\" (UniqueName: \"kubernetes.io/projected/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-kube-api-access-24dd9\") pod \"error-404-isvc-7809f-predictor-858cb9b479-84qdj\" (UID: \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\") " pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:37.918491 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:37.918406 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:38.044940 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:38.044908 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj"] Apr 28 19:41:38.047529 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:41:38.047500 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf672cba_15ab_4de9_a2ce_8605fce8f1ec.slice/crio-06e26d2745ecb623418f48a2963b86852b861df715fcca40872203690437451c WatchSource:0}: Error finding container 06e26d2745ecb623418f48a2963b86852b861df715fcca40872203690437451c: Status 404 returned error can't find the container with id 06e26d2745ecb623418f48a2963b86852b861df715fcca40872203690437451c Apr 28 19:41:38.748836 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:38.748797 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" event={"ID":"bf672cba-15ab-4de9-a2ce-8605fce8f1ec","Type":"ContainerStarted","Data":"0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb"} Apr 28 19:41:38.748836 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:38.748840 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" event={"ID":"bf672cba-15ab-4de9-a2ce-8605fce8f1ec","Type":"ContainerStarted","Data":"c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab"} Apr 28 19:41:38.749235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:38.748855 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" event={"ID":"bf672cba-15ab-4de9-a2ce-8605fce8f1ec","Type":"ContainerStarted","Data":"06e26d2745ecb623418f48a2963b86852b861df715fcca40872203690437451c"} Apr 28 19:41:38.749235 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:38.748927 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:38.767218 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:38.767173 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" podStartSLOduration=1.767158744 podStartE2EDuration="1.767158744s" podCreationTimestamp="2026-04-28 19:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:41:38.765918601 +0000 UTC m=+1525.517468119" watchObservedRunningTime="2026-04-28 19:41:38.767158744 +0000 UTC m=+1525.518708256" Apr 28 19:41:39.751953 ip-10-0-138-34 
kubenswrapper[2570]: I0428 19:41:39.751922 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:39.753224 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:39.753194 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 28 19:41:40.509658 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:40.509588 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 28 19:41:40.755624 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:40.755574 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 28 19:41:40.980480 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:40.980456 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:41:41.100088 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.099998 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-26ebe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c09e77c-dde3-4787-8002-d63f394b155c-error-404-isvc-26ebe-kube-rbac-proxy-sar-config\") pod \"4c09e77c-dde3-4787-8002-d63f394b155c\" (UID: \"4c09e77c-dde3-4787-8002-d63f394b155c\") " Apr 28 19:41:41.100088 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.100066 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c09e77c-dde3-4787-8002-d63f394b155c-proxy-tls\") pod \"4c09e77c-dde3-4787-8002-d63f394b155c\" (UID: \"4c09e77c-dde3-4787-8002-d63f394b155c\") " Apr 28 19:41:41.100323 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.100109 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7szg6\" (UniqueName: \"kubernetes.io/projected/4c09e77c-dde3-4787-8002-d63f394b155c-kube-api-access-7szg6\") pod \"4c09e77c-dde3-4787-8002-d63f394b155c\" (UID: \"4c09e77c-dde3-4787-8002-d63f394b155c\") " Apr 28 19:41:41.100475 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.100445 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c09e77c-dde3-4787-8002-d63f394b155c-error-404-isvc-26ebe-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-26ebe-kube-rbac-proxy-sar-config") pod "4c09e77c-dde3-4787-8002-d63f394b155c" (UID: "4c09e77c-dde3-4787-8002-d63f394b155c"). InnerVolumeSpecName "error-404-isvc-26ebe-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:41:41.102324 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.102294 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c09e77c-dde3-4787-8002-d63f394b155c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4c09e77c-dde3-4787-8002-d63f394b155c" (UID: "4c09e77c-dde3-4787-8002-d63f394b155c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:41:41.102424 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.102312 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c09e77c-dde3-4787-8002-d63f394b155c-kube-api-access-7szg6" (OuterVolumeSpecName: "kube-api-access-7szg6") pod "4c09e77c-dde3-4787-8002-d63f394b155c" (UID: "4c09e77c-dde3-4787-8002-d63f394b155c"). InnerVolumeSpecName "kube-api-access-7szg6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:41:41.200817 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.200785 2570 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-26ebe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c09e77c-dde3-4787-8002-d63f394b155c-error-404-isvc-26ebe-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:41:41.200817 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.200812 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c09e77c-dde3-4787-8002-d63f394b155c-proxy-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:41:41.200817 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.200824 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7szg6\" (UniqueName: \"kubernetes.io/projected/4c09e77c-dde3-4787-8002-d63f394b155c-kube-api-access-7szg6\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:41:41.760192 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.760158 2570 generic.go:358] "Generic (PLEG): container finished" podID="4c09e77c-dde3-4787-8002-d63f394b155c" containerID="aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622" exitCode=0 Apr 28 19:41:41.760595 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.760209 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" event={"ID":"4c09e77c-dde3-4787-8002-d63f394b155c","Type":"ContainerDied","Data":"aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622"} Apr 28 19:41:41.760595 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.760232 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" event={"ID":"4c09e77c-dde3-4787-8002-d63f394b155c","Type":"ContainerDied","Data":"86c0bc5e679df0683abbf3f7edaa74bd6ca5a22a4218ab25942a280950049ead"} Apr 28 19:41:41.760595 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.760232 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs" Apr 28 19:41:41.760595 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.760308 2570 scope.go:117] "RemoveContainer" containerID="7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0" Apr 28 19:41:41.768990 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.768968 2570 scope.go:117] "RemoveContainer" containerID="aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622" Apr 28 19:41:41.775820 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.775800 2570 scope.go:117] "RemoveContainer" containerID="7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0" Apr 28 19:41:41.776065 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:41:41.776047 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0\": container with ID starting with 7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0 not found: ID does not exist" containerID="7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0" Apr 28 19:41:41.776121 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.776076 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0"} err="failed to get container status \"7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0\": rpc error: code = NotFound desc = could not find container \"7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0\": container with ID starting with 7b3c3ad5e1e9b42e7e2e0843a38efbcf93bcd07d407e714381d8cd5d5bb191a0 not found: ID does not exist" Apr 28 19:41:41.776121 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.776092 2570 scope.go:117] "RemoveContainer" containerID="aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622" Apr 28 19:41:41.776320 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:41:41.776299 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622\": container with ID starting with aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622 not found: ID does not exist" containerID="aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622" Apr 28 19:41:41.776394 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.776324 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622"} err="failed to get container status \"aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622\": rpc error: code = NotFound desc = could not find container \"aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622\": container with ID starting with aa1f923718a3c663fa3cfa4a1712a499e10581378e4b070ae75e16dbb7c5b622 not found: ID does not exist" Apr 28 19:41:41.781339 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.781319 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs"] Apr 28 19:41:41.787086 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:41.787064 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-26ebe-predictor-7d49cd9f75-qjrrs"] Apr 28 19:41:41.887727 ip-10-0-138-34 kubenswrapper[2570]: 
I0428 19:41:41.887698 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" path="/var/lib/kubelet/pods/4c09e77c-dde3-4787-8002-d63f394b155c/volumes" Apr 28 19:41:45.760148 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:45.760115 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:41:45.760693 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:45.760665 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 28 19:41:47.630753 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:47.630723 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:41:55.761616 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:41:55.761557 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 28 19:42:05.760919 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:42:05.760878 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 28 19:42:15.761021 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:42:15.760983 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 28 19:42:25.761493 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:42:25.761464 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:46:13.925192 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:46:13.925163 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:46:13.930446 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:46:13.930425 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:50:12.684328 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.684297 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd"] Apr 28 19:50:12.685043 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.684572 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kserve-container" containerID="cri-o://57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac" gracePeriod=30 Apr 28 19:50:12.685043 ip-10-0-138-34 
kubenswrapper[2570]: I0428 19:50:12.684631 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kube-rbac-proxy" containerID="cri-o://369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871" gracePeriod=30 Apr 28 19:50:12.775290 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.775245 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7"] Apr 28 19:50:12.775690 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.775673 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kserve-container" Apr 28 19:50:12.775783 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.775692 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kserve-container" Apr 28 19:50:12.775783 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.775709 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kube-rbac-proxy" Apr 28 19:50:12.775783 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.775716 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kube-rbac-proxy" Apr 28 19:50:12.775944 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.775824 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kube-rbac-proxy" Apr 28 19:50:12.775944 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.775839 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c09e77c-dde3-4787-8002-d63f394b155c" containerName="kserve-container" Apr 28 19:50:12.779025 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.779002 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:12.781280 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.781259 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6bad9-predictor-serving-cert\"" Apr 28 19:50:12.781362 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.781310 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6bad9-kube-rbac-proxy-sar-config\"" Apr 28 19:50:12.788748 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.788720 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7"] Apr 28 19:50:12.877674 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.877631 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-6bad9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f62e4801-d6f5-4881-b82f-c90cae86b77c-error-404-isvc-6bad9-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6bad9-predictor-6c4b946b59-klxc7\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:12.877858 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.877744 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f62e4801-d6f5-4881-b82f-c90cae86b77c-proxy-tls\") pod \"error-404-isvc-6bad9-predictor-6c4b946b59-klxc7\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:12.877858 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.877773 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqdb8\" (UniqueName: \"kubernetes.io/projected/f62e4801-d6f5-4881-b82f-c90cae86b77c-kube-api-access-jqdb8\") pod \"error-404-isvc-6bad9-predictor-6c4b946b59-klxc7\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:12.978502 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.978409 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f62e4801-d6f5-4881-b82f-c90cae86b77c-proxy-tls\") pod \"error-404-isvc-6bad9-predictor-6c4b946b59-klxc7\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:12.978502 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.978471 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqdb8\" (UniqueName: \"kubernetes.io/projected/f62e4801-d6f5-4881-b82f-c90cae86b77c-kube-api-access-jqdb8\") pod \"error-404-isvc-6bad9-predictor-6c4b946b59-klxc7\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:12.978809 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:50:12.978576 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-serving-cert: secret "error-404-isvc-6bad9-predictor-serving-cert" not found Apr 28 19:50:12.978809 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.978647 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"error-404-isvc-6bad9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f62e4801-d6f5-4881-b82f-c90cae86b77c-error-404-isvc-6bad9-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6bad9-predictor-6c4b946b59-klxc7\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:12.978809 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:50:12.978687 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f62e4801-d6f5-4881-b82f-c90cae86b77c-proxy-tls podName:f62e4801-d6f5-4881-b82f-c90cae86b77c nodeName:}" failed. No retries permitted until 2026-04-28 19:50:13.478662691 +0000 UTC m=+2040.230212210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f62e4801-d6f5-4881-b82f-c90cae86b77c-proxy-tls") pod "error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" (UID: "f62e4801-d6f5-4881-b82f-c90cae86b77c") : secret "error-404-isvc-6bad9-predictor-serving-cert" not found Apr 28 19:50:12.979300 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.979278 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-6bad9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f62e4801-d6f5-4881-b82f-c90cae86b77c-error-404-isvc-6bad9-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6bad9-predictor-6c4b946b59-klxc7\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:12.987869 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:12.987844 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqdb8\" (UniqueName: \"kubernetes.io/projected/f62e4801-d6f5-4881-b82f-c90cae86b77c-kube-api-access-jqdb8\") pod \"error-404-isvc-6bad9-predictor-6c4b946b59-klxc7\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:13.483680 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:13.483640 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f62e4801-d6f5-4881-b82f-c90cae86b77c-proxy-tls\") pod \"error-404-isvc-6bad9-predictor-6c4b946b59-klxc7\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:13.486257 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:13.486231 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f62e4801-d6f5-4881-b82f-c90cae86b77c-proxy-tls\") pod \"error-404-isvc-6bad9-predictor-6c4b946b59-klxc7\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:13.490588 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:13.490553 2570 generic.go:358] "Generic (PLEG): container finished" podID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerID="369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871" exitCode=2 Apr 28 19:50:13.490752 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:13.490627 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" 
event={"ID":"f95b93a4-6369-4e6c-a06a-60d166c6ce2a","Type":"ContainerDied","Data":"369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871"} Apr 28 19:50:13.690484 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:13.690451 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:13.815525 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:13.815471 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7"] Apr 28 19:50:13.818018 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:50:13.817988 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf62e4801_d6f5_4881_b82f_c90cae86b77c.slice/crio-a80330ce4c86c6dcc8877f51d324f72b31358dce87124f0a8ee65707bcd9fdb2 WatchSource:0}: Error finding container a80330ce4c86c6dcc8877f51d324f72b31358dce87124f0a8ee65707bcd9fdb2: Status 404 returned error can't find the container with id a80330ce4c86c6dcc8877f51d324f72b31358dce87124f0a8ee65707bcd9fdb2 Apr 28 19:50:13.819781 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:13.819761 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:50:14.496525 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:14.496485 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" event={"ID":"f62e4801-d6f5-4881-b82f-c90cae86b77c","Type":"ContainerStarted","Data":"64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a"} Apr 28 19:50:14.496772 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:14.496535 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:14.496772 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:14.496556 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" event={"ID":"f62e4801-d6f5-4881-b82f-c90cae86b77c","Type":"ContainerStarted","Data":"6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911"} Apr 28 19:50:14.496772 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:14.496570 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" event={"ID":"f62e4801-d6f5-4881-b82f-c90cae86b77c","Type":"ContainerStarted","Data":"a80330ce4c86c6dcc8877f51d324f72b31358dce87124f0a8ee65707bcd9fdb2"} Apr 28 19:50:14.496772 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:14.496584 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:14.498102 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:14.498073 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 28 19:50:14.538049 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:14.537988 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" podStartSLOduration=2.537971189 podStartE2EDuration="2.537971189s" 
podCreationTimestamp="2026-04-28 19:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:50:14.535640244 +0000 UTC m=+2041.287189758" watchObservedRunningTime="2026-04-28 19:50:14.537971189 +0000 UTC m=+2041.289520748" Apr 28 19:50:15.500461 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:15.500420 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 28 19:50:16.033417 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.033387 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:50:16.108042 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.107943 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-0ca59-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-error-404-isvc-0ca59-kube-rbac-proxy-sar-config\") pod \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " Apr 28 19:50:16.108042 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.108007 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-proxy-tls\") pod \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " Apr 28 19:50:16.108042 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.108035 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7nlv\" (UniqueName: \"kubernetes.io/projected/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-kube-api-access-t7nlv\") pod \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\" (UID: \"f95b93a4-6369-4e6c-a06a-60d166c6ce2a\") " Apr 28 19:50:16.108447 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.108420 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-error-404-isvc-0ca59-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-0ca59-kube-rbac-proxy-sar-config") pod "f95b93a4-6369-4e6c-a06a-60d166c6ce2a" (UID: "f95b93a4-6369-4e6c-a06a-60d166c6ce2a"). InnerVolumeSpecName "error-404-isvc-0ca59-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:50:16.110196 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.110162 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f95b93a4-6369-4e6c-a06a-60d166c6ce2a" (UID: "f95b93a4-6369-4e6c-a06a-60d166c6ce2a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:50:16.110196 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.110171 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-kube-api-access-t7nlv" (OuterVolumeSpecName: "kube-api-access-t7nlv") pod "f95b93a4-6369-4e6c-a06a-60d166c6ce2a" (UID: "f95b93a4-6369-4e6c-a06a-60d166c6ce2a"). InnerVolumeSpecName "kube-api-access-t7nlv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:50:16.208743 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.208702 2570 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-0ca59-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-error-404-isvc-0ca59-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:50:16.208743 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.208735 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-proxy-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:50:16.208743 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.208746 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t7nlv\" (UniqueName: \"kubernetes.io/projected/f95b93a4-6369-4e6c-a06a-60d166c6ce2a-kube-api-access-t7nlv\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:50:16.506035 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.506002 2570 generic.go:358] "Generic (PLEG): container finished" podID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerID="57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac" exitCode=0 Apr 28 19:50:16.506487 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.506047 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" event={"ID":"f95b93a4-6369-4e6c-a06a-60d166c6ce2a","Type":"ContainerDied","Data":"57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac"} Apr 28 19:50:16.506487 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.506080 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" event={"ID":"f95b93a4-6369-4e6c-a06a-60d166c6ce2a","Type":"ContainerDied","Data":"191867f6898fe26f550e4eeb009001e8aa194607b53da055a5279e216ed34b93"} Apr 28 19:50:16.506487 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.506080 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd" Apr 28 19:50:16.506487 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.506146 2570 scope.go:117] "RemoveContainer" containerID="369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871" Apr 28 19:50:16.514535 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.514508 2570 scope.go:117] "RemoveContainer" containerID="57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac" Apr 28 19:50:16.522086 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.522066 2570 scope.go:117] "RemoveContainer" containerID="369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871" Apr 28 19:50:16.522364 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:50:16.522344 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871\": container with ID starting with 369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871 not found: ID does not exist" containerID="369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871" Apr 28 19:50:16.522439 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.522377 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871"} err="failed to get container status \"369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871\": rpc error: code = NotFound desc = could not find container \"369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871\": container with ID starting with 369e13e040b5ad725136879f3a65bb4be19a9f462648e904ec45640f67181871 not found: ID does not exist" Apr 28 19:50:16.522439 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.522404 2570 scope.go:117] "RemoveContainer" containerID="57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac" Apr 28 19:50:16.522669 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:50:16.522652 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac\": container with ID starting with 57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac not found: ID does not exist" containerID="57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac" Apr 28 19:50:16.522717 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.522676 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac"} err="failed to get container status \"57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac\": rpc error: code = NotFound desc = could not find container \"57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac\": container with ID starting with 57412f19e48ba0cae104e45c09e211feb4fb8d43f3c78f887281d51f5357f0ac not found: ID does not exist" Apr 28 19:50:16.527083 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.527058 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd"] Apr 28 19:50:16.530435 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:16.530412 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0ca59-predictor-546755f487-skcxd"] Apr 28 19:50:17.887498 ip-10-0-138-34 kubenswrapper[2570]: 
I0428 19:50:17.887455 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" path="/var/lib/kubelet/pods/f95b93a4-6369-4e6c-a06a-60d166c6ce2a/volumes" Apr 28 19:50:20.505498 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:20.505464 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:50:20.506027 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:20.505999 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 28 19:50:30.506689 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:30.506648 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 28 19:50:40.506562 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:40.506474 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 28 19:50:50.506290 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:50.506242 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 28 19:50:52.473323 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.473291 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj"] Apr 28 19:50:52.473758 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.473691 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kserve-container" containerID="cri-o://c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab" gracePeriod=30 Apr 28 19:50:52.473758 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.473721 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kube-rbac-proxy" containerID="cri-o://0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb" gracePeriod=30 Apr 28 19:50:52.523950 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.523917 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7"] Apr 28 19:50:52.524260 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.524249 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kserve-container" Apr 28 19:50:52.524303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.524262 2570 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kserve-container" Apr 28 19:50:52.524303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.524281 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kube-rbac-proxy" Apr 28 19:50:52.524303 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.524286 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kube-rbac-proxy" Apr 28 19:50:52.524394 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.524342 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kserve-container" Apr 28 19:50:52.524394 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.524354 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f95b93a4-6369-4e6c-a06a-60d166c6ce2a" containerName="kube-rbac-proxy" Apr 28 19:50:52.527485 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.527459 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:52.529785 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.529764 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e5e5a-predictor-serving-cert\"" Apr 28 19:50:52.529895 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.529803 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e5e5a-kube-rbac-proxy-sar-config\"" Apr 28 19:50:52.538490 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.538457 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7"] Apr 28 19:50:52.615273 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.615236 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-e5e5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09058041-a3b9-4649-94c5-2f2bbc9e451c-error-404-isvc-e5e5a-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7\" (UID: \"09058041-a3b9-4649-94c5-2f2bbc9e451c\") " pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:52.615441 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.615283 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09058041-a3b9-4649-94c5-2f2bbc9e451c-proxy-tls\") pod \"error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7\" (UID: \"09058041-a3b9-4649-94c5-2f2bbc9e451c\") " pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:52.615441 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.615360 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzqlk\" (UniqueName: \"kubernetes.io/projected/09058041-a3b9-4649-94c5-2f2bbc9e451c-kube-api-access-hzqlk\") pod \"error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7\" (UID: \"09058041-a3b9-4649-94c5-2f2bbc9e451c\") " pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:52.631392 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.631361 2570 generic.go:358] "Generic (PLEG): container finished" podID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" 
containerID="0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb" exitCode=2 Apr 28 19:50:52.631526 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.631423 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" event={"ID":"bf672cba-15ab-4de9-a2ce-8605fce8f1ec","Type":"ContainerDied","Data":"0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb"} Apr 28 19:50:52.716352 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.716321 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-e5e5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09058041-a3b9-4649-94c5-2f2bbc9e451c-error-404-isvc-e5e5a-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7\" (UID: \"09058041-a3b9-4649-94c5-2f2bbc9e451c\") " pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:52.716525 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.716359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09058041-a3b9-4649-94c5-2f2bbc9e451c-proxy-tls\") pod \"error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7\" (UID: \"09058041-a3b9-4649-94c5-2f2bbc9e451c\") " pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:52.716525 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.716392 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzqlk\" (UniqueName: \"kubernetes.io/projected/09058041-a3b9-4649-94c5-2f2bbc9e451c-kube-api-access-hzqlk\") pod \"error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7\" (UID: \"09058041-a3b9-4649-94c5-2f2bbc9e451c\") " pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:52.717091 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.717061 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-e5e5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09058041-a3b9-4649-94c5-2f2bbc9e451c-error-404-isvc-e5e5a-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7\" (UID: \"09058041-a3b9-4649-94c5-2f2bbc9e451c\") " pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:52.719028 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.719003 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09058041-a3b9-4649-94c5-2f2bbc9e451c-proxy-tls\") pod \"error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7\" (UID: \"09058041-a3b9-4649-94c5-2f2bbc9e451c\") " pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:52.724287 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.724241 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzqlk\" (UniqueName: \"kubernetes.io/projected/09058041-a3b9-4649-94c5-2f2bbc9e451c-kube-api-access-hzqlk\") pod \"error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7\" (UID: \"09058041-a3b9-4649-94c5-2f2bbc9e451c\") " pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:52.839339 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.839301 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:52.961711 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:52.961683 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7"] Apr 28 19:50:52.963242 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:50:52.963215 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09058041_a3b9_4649_94c5_2f2bbc9e451c.slice/crio-ea9e6fc1d8ae93347cbdb9206d64ec497ad02b441cce85a5a09de444b0506514 WatchSource:0}: Error finding container ea9e6fc1d8ae93347cbdb9206d64ec497ad02b441cce85a5a09de444b0506514: Status 404 returned error can't find the container with id ea9e6fc1d8ae93347cbdb9206d64ec497ad02b441cce85a5a09de444b0506514 Apr 28 19:50:53.637974 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:53.637938 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" event={"ID":"09058041-a3b9-4649-94c5-2f2bbc9e451c","Type":"ContainerStarted","Data":"de117244b34d729696f5186157f7a21f95177be1fd8070b9510513387b5d5517"} Apr 28 19:50:53.637974 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:53.637975 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" event={"ID":"09058041-a3b9-4649-94c5-2f2bbc9e451c","Type":"ContainerStarted","Data":"005d6b1921492f5b5ad78674ef49064e77a8a54031f9e9572846c4b1a5685800"} Apr 28 19:50:53.638416 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:53.637987 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" event={"ID":"09058041-a3b9-4649-94c5-2f2bbc9e451c","Type":"ContainerStarted","Data":"ea9e6fc1d8ae93347cbdb9206d64ec497ad02b441cce85a5a09de444b0506514"} Apr 28 19:50:53.638416 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:53.638074 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:53.656714 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:53.656657 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" podStartSLOduration=1.656641768 podStartE2EDuration="1.656641768s" podCreationTimestamp="2026-04-28 19:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:50:53.654281603 +0000 UTC m=+2080.405831131" watchObservedRunningTime="2026-04-28 19:50:53.656641768 +0000 UTC m=+2080.408191282" Apr 28 19:50:54.641199 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:54.641163 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:50:54.642422 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:54.642397 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 28 19:50:55.644862 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:55.644816 2570 prober.go:120] "Probe failed" probeType="Readiness" 
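The "Observed pod startup duration" line above is simple arithmetic over the timestamps it prints: podStartE2EDuration=1.656641768s is exactly watchObservedRunningTime (19:50:53.656641768) minus podCreationTimestamp (19:50:52), and the m=+2080.4... suffixes are the monotonic-clock readings Go's time.Time String output appends. The subtraction, reproduced with the values copied from that log line:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, _ := time.Parse(layout, "2026-04-28 19:50:52 +0000 UTC")
        running, _ := time.Parse(layout, "2026-04-28 19:50:53.656641768 +0000 UTC")

        // Sub reproduces the reported podStartE2EDuration.
        fmt.Println(running.Sub(created)) // 1.656641768s
    }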
pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 28 19:50:55.756479 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:55.756438 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.36:8643/healthz\": dial tcp 10.132.0.36:8643: connect: connection refused" Apr 28 19:50:55.760890 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:55.760858 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 28 19:50:56.017280 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.017253 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:50:56.145842 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.145804 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-7809f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-error-404-isvc-7809f-kube-rbac-proxy-sar-config\") pod \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\" (UID: \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\") " Apr 28 19:50:56.146048 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.145942 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24dd9\" (UniqueName: \"kubernetes.io/projected/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-kube-api-access-24dd9\") pod \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\" (UID: \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\") " Apr 28 19:50:56.146048 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.145981 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-proxy-tls\") pod \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\" (UID: \"bf672cba-15ab-4de9-a2ce-8605fce8f1ec\") " Apr 28 19:50:56.146242 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.146220 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-error-404-isvc-7809f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-7809f-kube-rbac-proxy-sar-config") pod "bf672cba-15ab-4de9-a2ce-8605fce8f1ec" (UID: "bf672cba-15ab-4de9-a2ce-8605fce8f1ec"). InnerVolumeSpecName "error-404-isvc-7809f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:50:56.148065 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.148039 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-kube-api-access-24dd9" (OuterVolumeSpecName: "kube-api-access-24dd9") pod "bf672cba-15ab-4de9-a2ce-8605fce8f1ec" (UID: "bf672cba-15ab-4de9-a2ce-8605fce8f1ec"). InnerVolumeSpecName "kube-api-access-24dd9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:50:56.148151 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.148083 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bf672cba-15ab-4de9-a2ce-8605fce8f1ec" (UID: "bf672cba-15ab-4de9-a2ce-8605fce8f1ec"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:50:56.246854 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.246764 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-24dd9\" (UniqueName: \"kubernetes.io/projected/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-kube-api-access-24dd9\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:50:56.246854 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.246798 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-proxy-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:50:56.246854 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.246813 2570 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-7809f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf672cba-15ab-4de9-a2ce-8605fce8f1ec-error-404-isvc-7809f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:50:56.649504 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.649468 2570 generic.go:358] "Generic (PLEG): container finished" podID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerID="c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab" exitCode=0 Apr 28 19:50:56.649949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.649549 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" Apr 28 19:50:56.649949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.649547 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" event={"ID":"bf672cba-15ab-4de9-a2ce-8605fce8f1ec","Type":"ContainerDied","Data":"c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab"} Apr 28 19:50:56.649949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.649662 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj" event={"ID":"bf672cba-15ab-4de9-a2ce-8605fce8f1ec","Type":"ContainerDied","Data":"06e26d2745ecb623418f48a2963b86852b861df715fcca40872203690437451c"} Apr 28 19:50:56.649949 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.649679 2570 scope.go:117] "RemoveContainer" containerID="0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb" Apr 28 19:50:56.658270 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.658250 2570 scope.go:117] "RemoveContainer" containerID="c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab" Apr 28 19:50:56.665207 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.665184 2570 scope.go:117] "RemoveContainer" containerID="0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb" Apr 28 19:50:56.665455 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:50:56.665435 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb\": container with ID starting with 0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb not found: ID does not exist" containerID="0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb" Apr 28 19:50:56.665513 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.665464 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb"} err="failed to get container status \"0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb\": rpc error: code = NotFound desc = could not find container \"0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb\": container with ID starting with 0692da9ae1086915455786a99ad489b6783be3f227c76c3fd84ace5e05fb36cb not found: ID does not exist" Apr 28 19:50:56.665513 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.665482 2570 scope.go:117] "RemoveContainer" containerID="c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab" Apr 28 19:50:56.665717 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:50:56.665697 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab\": container with ID starting with c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab not found: ID does not exist" containerID="c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab" Apr 28 19:50:56.665767 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.665724 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab"} err="failed to get container status \"c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab\": rpc error: code = 
NotFound desc = could not find container \"c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab\": container with ID starting with c2737af5048a41f989f7876e552c41c63dda4bc52c8160b8a091e79e6c0947ab not found: ID does not exist" Apr 28 19:50:56.669986 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.669965 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj"] Apr 28 19:50:56.674141 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:56.674121 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7809f-predictor-858cb9b479-84qdj"] Apr 28 19:50:57.887468 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:50:57.887437 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" path="/var/lib/kubelet/pods/bf672cba-15ab-4de9-a2ce-8605fce8f1ec/volumes" Apr 28 19:51:00.507214 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:00.507183 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:51:00.649084 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:00.649056 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:51:00.649524 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:00.649501 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 28 19:51:10.650095 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:10.650054 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 28 19:51:13.951010 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:13.950987 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:51:13.956789 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:13.956769 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:51:20.649620 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:20.649576 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 28 19:51:23.011709 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.011671 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7"] Apr 28 19:51:23.012104 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.011974 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kserve-container" 
containerID="cri-o://6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911" gracePeriod=30 Apr 28 19:51:23.012104 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.012046 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kube-rbac-proxy" containerID="cri-o://64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a" gracePeriod=30 Apr 28 19:51:23.072079 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.072042 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"] Apr 28 19:51:23.072419 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.072404 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kube-rbac-proxy" Apr 28 19:51:23.072419 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.072418 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kube-rbac-proxy" Apr 28 19:51:23.072592 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.072437 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kserve-container" Apr 28 19:51:23.072592 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.072443 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kserve-container" Apr 28 19:51:23.072592 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.072522 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kube-rbac-proxy" Apr 28 19:51:23.072592 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.072533 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf672cba-15ab-4de9-a2ce-8605fce8f1ec" containerName="kserve-container" Apr 28 19:51:23.075518 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.075499 2570 util.go:30] "No sandbox for pod can be found. 
Apr 28 19:51:23.075518 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.075499 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"
Apr 28 19:51:23.077741 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.077714 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-93cee-kube-rbac-proxy-sar-config\""
Apr 28 19:51:23.077855 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.077810 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-93cee-predictor-serving-cert\""
Apr 28 19:51:23.083635 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.083595 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"]
Apr 28 19:51:23.186904 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.186867 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-93cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f19c733-98a3-4e90-8c29-4bd9394f8d68-error-404-isvc-93cee-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-93cee-predictor-787774f689-d62cr\" (UID: \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\") " pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"
Apr 28 19:51:23.187068 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.187008 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j54nf\" (UniqueName: \"kubernetes.io/projected/7f19c733-98a3-4e90-8c29-4bd9394f8d68-kube-api-access-j54nf\") pod \"error-404-isvc-93cee-predictor-787774f689-d62cr\" (UID: \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\") " pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"
Apr 28 19:51:23.187068 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.187045 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f19c733-98a3-4e90-8c29-4bd9394f8d68-proxy-tls\") pod \"error-404-isvc-93cee-predictor-787774f689-d62cr\" (UID: \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\") " pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"
Apr 28 19:51:23.288419 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.288323 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j54nf\" (UniqueName: \"kubernetes.io/projected/7f19c733-98a3-4e90-8c29-4bd9394f8d68-kube-api-access-j54nf\") pod \"error-404-isvc-93cee-predictor-787774f689-d62cr\" (UID: \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\") " pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"
Apr 28 19:51:23.288419 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.288369 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f19c733-98a3-4e90-8c29-4bd9394f8d68-proxy-tls\") pod \"error-404-isvc-93cee-predictor-787774f689-d62cr\" (UID: \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\") " pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"
Apr 28 19:51:23.288705 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:51:23.288538 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-93cee-predictor-serving-cert: secret "error-404-isvc-93cee-predictor-serving-cert" not found
Apr 28 19:51:23.288705 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:51:23.288656 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f19c733-98a3-4e90-8c29-4bd9394f8d68-proxy-tls podName:7f19c733-98a3-4e90-8c29-4bd9394f8d68 nodeName:}" failed. No retries permitted until 2026-04-28 19:51:23.788598533 +0000 UTC m=+2110.540148026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7f19c733-98a3-4e90-8c29-4bd9394f8d68-proxy-tls") pod "error-404-isvc-93cee-predictor-787774f689-d62cr" (UID: "7f19c733-98a3-4e90-8c29-4bd9394f8d68") : secret "error-404-isvc-93cee-predictor-serving-cert" not found
Apr 28 19:51:23.288705 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.288685 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-93cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f19c733-98a3-4e90-8c29-4bd9394f8d68-error-404-isvc-93cee-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-93cee-predictor-787774f689-d62cr\" (UID: \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\") " pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"
Apr 28 19:51:23.289306 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.289283 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-93cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f19c733-98a3-4e90-8c29-4bd9394f8d68-error-404-isvc-93cee-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-93cee-predictor-787774f689-d62cr\" (UID: \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\") " pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"
Apr 28 19:51:23.298672 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.298637 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j54nf\" (UniqueName: \"kubernetes.io/projected/7f19c733-98a3-4e90-8c29-4bd9394f8d68-kube-api-access-j54nf\") pod \"error-404-isvc-93cee-predictor-787774f689-d62cr\" (UID: \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\") " pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"
Apr 28 19:51:23.742350 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.742265 2570 generic.go:358] "Generic (PLEG): container finished" podID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerID="64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a" exitCode=2
Apr 28 19:51:23.742350 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.742336 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" event={"ID":"f62e4801-d6f5-4881-b82f-c90cae86b77c","Type":"ContainerDied","Data":"64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a"}
Apr 28 19:51:23.794105 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.794066 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f19c733-98a3-4e90-8c29-4bd9394f8d68-proxy-tls\") pod \"error-404-isvc-93cee-predictor-787774f689-d62cr\" (UID: \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\") " pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"
pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" Apr 28 19:51:23.987117 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:23.987077 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" Apr 28 19:51:24.115313 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:24.115284 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"] Apr 28 19:51:24.117860 ip-10-0-138-34 kubenswrapper[2570]: W0428 19:51:24.117827 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f19c733_98a3_4e90_8c29_4bd9394f8d68.slice/crio-9a73e7464ed320ec4d1c8ae008524eed3e33c7ce4fde9dd252b47cfaad2f57c1 WatchSource:0}: Error finding container 9a73e7464ed320ec4d1c8ae008524eed3e33c7ce4fde9dd252b47cfaad2f57c1: Status 404 returned error can't find the container with id 9a73e7464ed320ec4d1c8ae008524eed3e33c7ce4fde9dd252b47cfaad2f57c1 Apr 28 19:51:24.747590 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:24.747488 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" event={"ID":"7f19c733-98a3-4e90-8c29-4bd9394f8d68","Type":"ContainerStarted","Data":"2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8"} Apr 28 19:51:24.747590 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:24.747533 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" event={"ID":"7f19c733-98a3-4e90-8c29-4bd9394f8d68","Type":"ContainerStarted","Data":"c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a"} Apr 28 19:51:24.747590 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:24.747546 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" event={"ID":"7f19c733-98a3-4e90-8c29-4bd9394f8d68","Type":"ContainerStarted","Data":"9a73e7464ed320ec4d1c8ae008524eed3e33c7ce4fde9dd252b47cfaad2f57c1"} Apr 28 19:51:24.747891 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:24.747641 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" Apr 28 19:51:24.768275 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:24.768222 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" podStartSLOduration=1.768206822 podStartE2EDuration="1.768206822s" podCreationTimestamp="2026-04-28 19:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:51:24.767153426 +0000 UTC m=+2111.518702935" watchObservedRunningTime="2026-04-28 19:51:24.768206822 +0000 UTC m=+2111.519756337" Apr 28 19:51:25.501727 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:25.501683 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused" Apr 28 19:51:25.751344 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:25.751309 2570 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" Apr 28 19:51:25.752727 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:25.752660 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 28 19:51:26.254047 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.254022 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:51:26.314099 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.314066 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqdb8\" (UniqueName: \"kubernetes.io/projected/f62e4801-d6f5-4881-b82f-c90cae86b77c-kube-api-access-jqdb8\") pod \"f62e4801-d6f5-4881-b82f-c90cae86b77c\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " Apr 28 19:51:26.314275 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.314156 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-6bad9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f62e4801-d6f5-4881-b82f-c90cae86b77c-error-404-isvc-6bad9-kube-rbac-proxy-sar-config\") pod \"f62e4801-d6f5-4881-b82f-c90cae86b77c\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " Apr 28 19:51:26.314275 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.314190 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f62e4801-d6f5-4881-b82f-c90cae86b77c-proxy-tls\") pod \"f62e4801-d6f5-4881-b82f-c90cae86b77c\" (UID: \"f62e4801-d6f5-4881-b82f-c90cae86b77c\") " Apr 28 19:51:26.314590 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.314555 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f62e4801-d6f5-4881-b82f-c90cae86b77c-error-404-isvc-6bad9-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-6bad9-kube-rbac-proxy-sar-config") pod "f62e4801-d6f5-4881-b82f-c90cae86b77c" (UID: "f62e4801-d6f5-4881-b82f-c90cae86b77c"). InnerVolumeSpecName "error-404-isvc-6bad9-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:51:26.316177 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.316151 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62e4801-d6f5-4881-b82f-c90cae86b77c-kube-api-access-jqdb8" (OuterVolumeSpecName: "kube-api-access-jqdb8") pod "f62e4801-d6f5-4881-b82f-c90cae86b77c" (UID: "f62e4801-d6f5-4881-b82f-c90cae86b77c"). InnerVolumeSpecName "kube-api-access-jqdb8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:51:26.316177 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.316157 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62e4801-d6f5-4881-b82f-c90cae86b77c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f62e4801-d6f5-4881-b82f-c90cae86b77c" (UID: "f62e4801-d6f5-4881-b82f-c90cae86b77c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:51:26.415663 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.415551 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jqdb8\" (UniqueName: \"kubernetes.io/projected/f62e4801-d6f5-4881-b82f-c90cae86b77c-kube-api-access-jqdb8\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:51:26.415663 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.415581 2570 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-6bad9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f62e4801-d6f5-4881-b82f-c90cae86b77c-error-404-isvc-6bad9-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:51:26.415663 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.415593 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f62e4801-d6f5-4881-b82f-c90cae86b77c-proxy-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 19:51:26.755452 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.755420 2570 generic.go:358] "Generic (PLEG): container finished" podID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerID="6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911" exitCode=0 Apr 28 19:51:26.755913 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.755485 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" Apr 28 19:51:26.755913 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.755504 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" event={"ID":"f62e4801-d6f5-4881-b82f-c90cae86b77c","Type":"ContainerDied","Data":"6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911"} Apr 28 19:51:26.755913 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.755540 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7" event={"ID":"f62e4801-d6f5-4881-b82f-c90cae86b77c","Type":"ContainerDied","Data":"a80330ce4c86c6dcc8877f51d324f72b31358dce87124f0a8ee65707bcd9fdb2"} Apr 28 19:51:26.755913 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.755555 2570 scope.go:117] "RemoveContainer" containerID="64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a" Apr 28 19:51:26.756108 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.755989 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 28 19:51:26.763771 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.763725 2570 scope.go:117] "RemoveContainer" containerID="6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911" Apr 28 19:51:26.770568 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.770548 2570 scope.go:117] "RemoveContainer" containerID="64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a" Apr 28 19:51:26.770841 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:51:26.770823 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a\": container with ID starting with 
64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a not found: ID does not exist" containerID="64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a" Apr 28 19:51:26.770924 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.770856 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a"} err="failed to get container status \"64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a\": rpc error: code = NotFound desc = could not find container \"64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a\": container with ID starting with 64a344bde415237abe3697647f7de515622426737058018b27d3aa0a3a3e662a not found: ID does not exist" Apr 28 19:51:26.770924 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.770879 2570 scope.go:117] "RemoveContainer" containerID="6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911" Apr 28 19:51:26.771137 ip-10-0-138-34 kubenswrapper[2570]: E0428 19:51:26.771119 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911\": container with ID starting with 6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911 not found: ID does not exist" containerID="6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911" Apr 28 19:51:26.771175 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.771143 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911"} err="failed to get container status \"6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911\": rpc error: code = NotFound desc = could not find container \"6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911\": container with ID starting with 6a44b9cdc43069e1e564df9c7be2cf365816405901add51cd97b803e3fa88911 not found: ID does not exist" Apr 28 19:51:26.776495 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.776471 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7"] Apr 28 19:51:26.779359 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:26.779338 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6bad9-predictor-6c4b946b59-klxc7"] Apr 28 19:51:27.888071 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:27.888041 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" path="/var/lib/kubelet/pods/f62e4801-d6f5-4881-b82f-c90cae86b77c/volumes" Apr 28 19:51:30.649899 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:30.649857 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 28 19:51:31.760369 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:31.760332 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" Apr 28 19:51:31.760921 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:31.760894 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 28 19:51:40.650801 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:40.650768 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" Apr 28 19:51:41.760852 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:41.760812 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 28 19:51:51.760848 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:51:51.760807 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 28 19:52:01.761763 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:52:01.761720 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 28 19:52:11.761762 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:52:11.761730 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" Apr 28 19:56:13.974102 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:56:13.974075 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 19:56:13.979971 ip-10-0-138-34 kubenswrapper[2570]: I0428 19:56:13.979950 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log" Apr 28 20:00:37.962087 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:37.962050 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"] Apr 28 20:00:37.962658 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:37.962338 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kserve-container" containerID="cri-o://c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a" gracePeriod=30 Apr 28 20:00:37.962658 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:37.962389 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kube-rbac-proxy" containerID="cri-o://2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8" gracePeriod=30 Apr 28 20:00:38.574369 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:38.574330 2570 generic.go:358] "Generic (PLEG): container finished" podID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" 
containerID="2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8" exitCode=2 Apr 28 20:00:38.574543 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:38.574405 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" event={"ID":"7f19c733-98a3-4e90-8c29-4bd9394f8d68","Type":"ContainerDied","Data":"2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8"} Apr 28 20:00:41.310076 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.310052 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" Apr 28 20:00:41.416962 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.416873 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f19c733-98a3-4e90-8c29-4bd9394f8d68-proxy-tls\") pod \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\" (UID: \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\") " Apr 28 20:00:41.417117 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.416975 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-93cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f19c733-98a3-4e90-8c29-4bd9394f8d68-error-404-isvc-93cee-kube-rbac-proxy-sar-config\") pod \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\" (UID: \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\") " Apr 28 20:00:41.417117 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.417003 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j54nf\" (UniqueName: \"kubernetes.io/projected/7f19c733-98a3-4e90-8c29-4bd9394f8d68-kube-api-access-j54nf\") pod \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\" (UID: \"7f19c733-98a3-4e90-8c29-4bd9394f8d68\") " Apr 28 20:00:41.417326 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.417301 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f19c733-98a3-4e90-8c29-4bd9394f8d68-error-404-isvc-93cee-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-93cee-kube-rbac-proxy-sar-config") pod "7f19c733-98a3-4e90-8c29-4bd9394f8d68" (UID: "7f19c733-98a3-4e90-8c29-4bd9394f8d68"). InnerVolumeSpecName "error-404-isvc-93cee-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:00:41.419012 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.418980 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f19c733-98a3-4e90-8c29-4bd9394f8d68-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7f19c733-98a3-4e90-8c29-4bd9394f8d68" (UID: "7f19c733-98a3-4e90-8c29-4bd9394f8d68"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:00:41.419116 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.419078 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f19c733-98a3-4e90-8c29-4bd9394f8d68-kube-api-access-j54nf" (OuterVolumeSpecName: "kube-api-access-j54nf") pod "7f19c733-98a3-4e90-8c29-4bd9394f8d68" (UID: "7f19c733-98a3-4e90-8c29-4bd9394f8d68"). InnerVolumeSpecName "kube-api-access-j54nf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:00:41.517503 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.517468 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j54nf\" (UniqueName: \"kubernetes.io/projected/7f19c733-98a3-4e90-8c29-4bd9394f8d68-kube-api-access-j54nf\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 20:00:41.517503 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.517497 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f19c733-98a3-4e90-8c29-4bd9394f8d68-proxy-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 20:00:41.517786 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.517515 2570 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-93cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f19c733-98a3-4e90-8c29-4bd9394f8d68-error-404-isvc-93cee-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\"" Apr 28 20:00:41.586153 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.586122 2570 generic.go:358] "Generic (PLEG): container finished" podID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerID="c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a" exitCode=0 Apr 28 20:00:41.586315 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.586175 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" event={"ID":"7f19c733-98a3-4e90-8c29-4bd9394f8d68","Type":"ContainerDied","Data":"c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a"} Apr 28 20:00:41.586315 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.586197 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" event={"ID":"7f19c733-98a3-4e90-8c29-4bd9394f8d68","Type":"ContainerDied","Data":"9a73e7464ed320ec4d1c8ae008524eed3e33c7ce4fde9dd252b47cfaad2f57c1"} Apr 28 20:00:41.586315 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.586196 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr" Apr 28 20:00:41.586315 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.586209 2570 scope.go:117] "RemoveContainer" containerID="2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8" Apr 28 20:00:41.594420 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.594401 2570 scope.go:117] "RemoveContainer" containerID="c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a" Apr 28 20:00:41.601331 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.601309 2570 scope.go:117] "RemoveContainer" containerID="2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8" Apr 28 20:00:41.601563 ip-10-0-138-34 kubenswrapper[2570]: E0428 20:00:41.601544 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8\": container with ID starting with 2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8 not found: ID does not exist" containerID="2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8" Apr 28 20:00:41.601634 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.601573 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8"} err="failed to get container status \"2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8\": rpc error: code = NotFound desc = could not find container \"2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8\": container with ID starting with 2f3b1cc6e2f04b8096e360a7cde63576381389ff15833b3bfdc6b694f72289b8 not found: ID does not exist" Apr 28 20:00:41.601634 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.601591 2570 scope.go:117] "RemoveContainer" containerID="c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a" Apr 28 20:00:41.601856 ip-10-0-138-34 kubenswrapper[2570]: E0428 20:00:41.601840 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a\": container with ID starting with c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a not found: ID does not exist" containerID="c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a" Apr 28 20:00:41.601906 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.601861 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a"} err="failed to get container status \"c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a\": rpc error: code = NotFound desc = could not find container \"c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a\": container with ID starting with c3ffb91dfa1c9eca78371c9bdbf94ef49dc30c7869e5f38f6f8901fed8ed731a not found: ID does not exist" Apr 28 20:00:41.606446 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.606424 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"] Apr 28 20:00:41.611984 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.611962 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-93cee-predictor-787774f689-d62cr"] Apr 28 20:00:41.887447 ip-10-0-138-34 kubenswrapper[2570]: 
Apr 28 20:00:41.887447 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:00:41.887415 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" path="/var/lib/kubelet/pods/7f19c733-98a3-4e90-8c29-4bd9394f8d68/volumes"
Apr 28 20:01:13.996188 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:01:13.996077 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log"
Apr 28 20:01:14.004163 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:01:14.004134 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log"
Apr 28 20:06:14.018787 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:06:14.018756 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log"
Apr 28 20:06:14.026902 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:06:14.026880 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log"
Apr 28 20:08:11.973630 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:11.973579 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7"]
Apr 28 20:08:11.974208 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:11.973985 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kserve-container" containerID="cri-o://005d6b1921492f5b5ad78674ef49064e77a8a54031f9e9572846c4b1a5685800" gracePeriod=30
Apr 28 20:08:11.974208 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:11.974050 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kube-rbac-proxy" containerID="cri-o://de117244b34d729696f5186157f7a21f95177be1fd8070b9510513387b5d5517" gracePeriod=30
Apr 28 20:08:12.949007 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:12.948968 2570 generic.go:358] "Generic (PLEG): container finished" podID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerID="de117244b34d729696f5186157f7a21f95177be1fd8070b9510513387b5d5517" exitCode=2
Apr 28 20:08:12.949179 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:12.949021 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" event={"ID":"09058041-a3b9-4649-94c5-2f2bbc9e451c","Type":"ContainerDied","Data":"de117244b34d729696f5186157f7a21f95177be1fd8070b9510513387b5d5517"}
Apr 28 20:08:14.959686 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:14.959653 2570 generic.go:358] "Generic (PLEG): container finished" podID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerID="005d6b1921492f5b5ad78674ef49064e77a8a54031f9e9572846c4b1a5685800" exitCode=0
Apr 28 20:08:14.960027 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:14.959710 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" event={"ID":"09058041-a3b9-4649-94c5-2f2bbc9e451c","Type":"ContainerDied","Data":"005d6b1921492f5b5ad78674ef49064e77a8a54031f9e9572846c4b1a5685800"}
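The exitCode fields in the PLEG "container finished" events above record how each container came down during the grace-period kill: kserve-container exited 0 (clean shutdown on SIGTERM) while kube-rbac-proxy exited 2, both well inside gracePeriod=30. A hedged sketch over the same hypothetical kubelet.log capture for tallying these codes:

```python
# Sketch: tally the exitCode values that PLEG reports in "container finished"
# events, flagging non-zero exits such as kube-rbac-proxy's exitCode=2 above.
import re
from collections import Counter

finished_re = re.compile(
    r'"Generic \(PLEG\): container finished" '
    r'podID="([0-9a-f-]+)" containerID="([0-9a-f]+)" exitCode=(-?\d+)'
)

codes = Counter()
with open("kubelet.log") as f:  # assumed capture, e.g. journalctl -u kubelet
    for line in f:
        m = finished_re.search(line)
        if not m:
            continue
        pod_id, container_id, code = m.group(1), m.group(2), int(m.group(3))
        codes[code] += 1
        if code != 0:
            print(f"pod {pod_id} container {container_id[:12]} exited {code}")
print(dict(codes))
```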
Apr 28 20:08:15.015161 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.015136 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7"
Apr 28 20:08:15.145018 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.144915 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzqlk\" (UniqueName: \"kubernetes.io/projected/09058041-a3b9-4649-94c5-2f2bbc9e451c-kube-api-access-hzqlk\") pod \"09058041-a3b9-4649-94c5-2f2bbc9e451c\" (UID: \"09058041-a3b9-4649-94c5-2f2bbc9e451c\") "
Apr 28 20:08:15.145018 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.144990 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09058041-a3b9-4649-94c5-2f2bbc9e451c-proxy-tls\") pod \"09058041-a3b9-4649-94c5-2f2bbc9e451c\" (UID: \"09058041-a3b9-4649-94c5-2f2bbc9e451c\") "
Apr 28 20:08:15.145252 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.145059 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-e5e5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09058041-a3b9-4649-94c5-2f2bbc9e451c-error-404-isvc-e5e5a-kube-rbac-proxy-sar-config\") pod \"09058041-a3b9-4649-94c5-2f2bbc9e451c\" (UID: \"09058041-a3b9-4649-94c5-2f2bbc9e451c\") "
Apr 28 20:08:15.145440 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.145407 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09058041-a3b9-4649-94c5-2f2bbc9e451c-error-404-isvc-e5e5a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-e5e5a-kube-rbac-proxy-sar-config") pod "09058041-a3b9-4649-94c5-2f2bbc9e451c" (UID: "09058041-a3b9-4649-94c5-2f2bbc9e451c"). InnerVolumeSpecName "error-404-isvc-e5e5a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 20:08:15.147059 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.147038 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09058041-a3b9-4649-94c5-2f2bbc9e451c-kube-api-access-hzqlk" (OuterVolumeSpecName: "kube-api-access-hzqlk") pod "09058041-a3b9-4649-94c5-2f2bbc9e451c" (UID: "09058041-a3b9-4649-94c5-2f2bbc9e451c"). InnerVolumeSpecName "kube-api-access-hzqlk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 20:08:15.147152 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.147134 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09058041-a3b9-4649-94c5-2f2bbc9e451c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "09058041-a3b9-4649-94c5-2f2bbc9e451c" (UID: "09058041-a3b9-4649-94c5-2f2bbc9e451c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 20:08:15.245866 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.245828 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hzqlk\" (UniqueName: \"kubernetes.io/projected/09058041-a3b9-4649-94c5-2f2bbc9e451c-kube-api-access-hzqlk\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\""
Apr 28 20:08:15.245866 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.245857 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09058041-a3b9-4649-94c5-2f2bbc9e451c-proxy-tls\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\""
Apr 28 20:08:15.245866 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.245869 2570 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-e5e5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09058041-a3b9-4649-94c5-2f2bbc9e451c-error-404-isvc-e5e5a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-34.ec2.internal\" DevicePath \"\""
Apr 28 20:08:15.963651 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.963601 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7" event={"ID":"09058041-a3b9-4649-94c5-2f2bbc9e451c","Type":"ContainerDied","Data":"ea9e6fc1d8ae93347cbdb9206d64ec497ad02b441cce85a5a09de444b0506514"}
Apr 28 20:08:15.964037 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.963661 2570 scope.go:117] "RemoveContainer" containerID="de117244b34d729696f5186157f7a21f95177be1fd8070b9510513387b5d5517"
Apr 28 20:08:15.964037 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.963629 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7"
Apr 28 20:08:15.971524 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.971502 2570 scope.go:117] "RemoveContainer" containerID="005d6b1921492f5b5ad78674ef49064e77a8a54031f9e9572846c4b1a5685800"
Apr 28 20:08:15.982762 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.982733 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7"]
Apr 28 20:08:15.986901 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:15.986879 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e5e5a-predictor-55d7d95565-l7vg7"]
Apr 28 20:08:17.887401 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:17.887368 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" path="/var/lib/kubelet/pods/09058041-a3b9-4649-94c5-2f2bbc9e451c/volumes"
Apr 28 20:08:40.561817 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.561737 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6fg8/must-gather-pfzqj"]
Apr 28 20:08:40.562163 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562068 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kserve-container"
Apr 28 20:08:40.562163 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562079 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kserve-container"
podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kube-rbac-proxy" Apr 28 20:08:40.562163 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562093 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kube-rbac-proxy" Apr 28 20:08:40.562163 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562108 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kserve-container" Apr 28 20:08:40.562163 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562114 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kserve-container" Apr 28 20:08:40.562163 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562125 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kube-rbac-proxy" Apr 28 20:08:40.562163 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562131 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kube-rbac-proxy" Apr 28 20:08:40.562163 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562138 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kserve-container" Apr 28 20:08:40.562163 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562143 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kserve-container" Apr 28 20:08:40.562163 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562153 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kube-rbac-proxy" Apr 28 20:08:40.562163 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562158 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kube-rbac-proxy" Apr 28 20:08:40.562527 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562200 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kube-rbac-proxy" Apr 28 20:08:40.562527 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562208 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kube-rbac-proxy" Apr 28 20:08:40.562527 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562215 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f62e4801-d6f5-4881-b82f-c90cae86b77c" containerName="kserve-container" Apr 28 20:08:40.562527 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562223 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kube-rbac-proxy" Apr 28 20:08:40.562527 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562229 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="09058041-a3b9-4649-94c5-2f2bbc9e451c" containerName="kserve-container" Apr 28 20:08:40.562527 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.562236 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f19c733-98a3-4e90-8c29-4bd9394f8d68" containerName="kserve-container" Apr 28 20:08:40.565225 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.565210 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6fg8/must-gather-pfzqj" Apr 28 20:08:40.567758 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.567735 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t6fg8\"/\"openshift-service-ca.crt\"" Apr 28 20:08:40.567874 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.567767 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t6fg8\"/\"kube-root-ca.crt\"" Apr 28 20:08:40.568029 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.568009 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-t6fg8\"/\"default-dockercfg-494nn\"" Apr 28 20:08:40.576649 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.576583 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6fg8/must-gather-pfzqj"] Apr 28 20:08:40.656403 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.656369 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ed1922c9-4669-45c3-8606-15dd546b5dca-must-gather-output\") pod \"must-gather-pfzqj\" (UID: \"ed1922c9-4669-45c3-8606-15dd546b5dca\") " pod="openshift-must-gather-t6fg8/must-gather-pfzqj" Apr 28 20:08:40.656577 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.656435 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kft9\" (UniqueName: \"kubernetes.io/projected/ed1922c9-4669-45c3-8606-15dd546b5dca-kube-api-access-5kft9\") pod \"must-gather-pfzqj\" (UID: \"ed1922c9-4669-45c3-8606-15dd546b5dca\") " pod="openshift-must-gather-t6fg8/must-gather-pfzqj" Apr 28 20:08:40.757666 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.757597 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kft9\" (UniqueName: \"kubernetes.io/projected/ed1922c9-4669-45c3-8606-15dd546b5dca-kube-api-access-5kft9\") pod \"must-gather-pfzqj\" (UID: \"ed1922c9-4669-45c3-8606-15dd546b5dca\") " pod="openshift-must-gather-t6fg8/must-gather-pfzqj" Apr 28 20:08:40.757822 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.757694 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ed1922c9-4669-45c3-8606-15dd546b5dca-must-gather-output\") pod \"must-gather-pfzqj\" (UID: \"ed1922c9-4669-45c3-8606-15dd546b5dca\") " pod="openshift-must-gather-t6fg8/must-gather-pfzqj" Apr 28 20:08:40.758026 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.758006 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ed1922c9-4669-45c3-8606-15dd546b5dca-must-gather-output\") pod \"must-gather-pfzqj\" (UID: \"ed1922c9-4669-45c3-8606-15dd546b5dca\") " pod="openshift-must-gather-t6fg8/must-gather-pfzqj" Apr 28 20:08:40.767299 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.767275 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kft9\" (UniqueName: \"kubernetes.io/projected/ed1922c9-4669-45c3-8606-15dd546b5dca-kube-api-access-5kft9\") pod \"must-gather-pfzqj\" (UID: \"ed1922c9-4669-45c3-8606-15dd546b5dca\") " pod="openshift-must-gather-t6fg8/must-gather-pfzqj" Apr 28 20:08:40.873985 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.873899 2570 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-t6fg8/must-gather-pfzqj" Apr 28 20:08:40.994673 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.994649 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6fg8/must-gather-pfzqj"] Apr 28 20:08:40.996754 ip-10-0-138-34 kubenswrapper[2570]: W0428 20:08:40.996729 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded1922c9_4669_45c3_8606_15dd546b5dca.slice/crio-db166e35d7dcc6b67fb2710eb522b265a8661f1e3ee0055edf28a5c913be306a WatchSource:0}: Error finding container db166e35d7dcc6b67fb2710eb522b265a8661f1e3ee0055edf28a5c913be306a: Status 404 returned error can't find the container with id db166e35d7dcc6b67fb2710eb522b265a8661f1e3ee0055edf28a5c913be306a Apr 28 20:08:40.998455 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:40.998438 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 20:08:41.039558 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:41.039523 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6fg8/must-gather-pfzqj" event={"ID":"ed1922c9-4669-45c3-8606-15dd546b5dca","Type":"ContainerStarted","Data":"db166e35d7dcc6b67fb2710eb522b265a8661f1e3ee0055edf28a5c913be306a"} Apr 28 20:08:42.044515 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:42.044485 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6fg8/must-gather-pfzqj" event={"ID":"ed1922c9-4669-45c3-8606-15dd546b5dca","Type":"ContainerStarted","Data":"85372ee042f9b6f267ef64fdd3b47aeb5244589e2bc2859fb70d1e3dbc0953fb"} Apr 28 20:08:43.049799 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:43.049756 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6fg8/must-gather-pfzqj" event={"ID":"ed1922c9-4669-45c3-8606-15dd546b5dca","Type":"ContainerStarted","Data":"8edd2438ba556ef735cec2800158db792555a70904dfc48c1e8e3c1dd77fae31"} Apr 28 20:08:43.067900 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:43.067842 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6fg8/must-gather-pfzqj" podStartSLOduration=2.150935616 podStartE2EDuration="3.067825816s" podCreationTimestamp="2026-04-28 20:08:40 +0000 UTC" firstStartedPulling="2026-04-28 20:08:40.998565576 +0000 UTC m=+3147.750115069" lastFinishedPulling="2026-04-28 20:08:41.915455775 +0000 UTC m=+3148.667005269" observedRunningTime="2026-04-28 20:08:43.065792879 +0000 UTC m=+3149.817342393" watchObservedRunningTime="2026-04-28 20:08:43.067825816 +0000 UTC m=+3149.819375358" Apr 28 20:08:43.478932 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:43.478865 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-z27h7_d156b1b1-0d6c-49bf-b188-773ff892fbd2/global-pull-secret-syncer/0.log" Apr 28 20:08:43.638790 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:43.638759 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-prm99_04deb9df-0bdd-4917-a722-41b25477b851/konnectivity-agent/0.log" Apr 28 20:08:43.664578 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:43.664549 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-34.ec2.internal_b985e5bf011f3a2cbd34f60280660243/haproxy/0.log" Apr 28 20:08:46.780559 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:46.780530 2570 log.go:25] "Finished 
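The pod_startup_latency_tracker entry for must-gather-pfzqj above is internally consistent: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that E2E figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). Re-deriving the logged numbers, truncated to microseconds:

```python
# Re-derive the latency-tracker figures from the logged timestamps above.
# The relationships are inferred from the values themselves, not from the
# kubelet source; nanosecond digits are truncated to fit strptime's %f.
from datetime import datetime, timezone

def ts(s: str) -> datetime:
    return datetime.strptime(s[:26], "%Y-%m-%d %H:%M:%S.%f").replace(tzinfo=timezone.utc)

created  = datetime(2026, 4, 28, 20, 8, 40, tzinfo=timezone.utc)
watched  = ts("2026-04-28 20:08:43.067825816")
pull_beg = ts("2026-04-28 20:08:40.998565576")
pull_end = ts("2026-04-28 20:08:41.915455775")

e2e = (watched - created).total_seconds()
slo = e2e - (pull_end - pull_beg).total_seconds()
print(f"E2E ~= {e2e:.6f}s (logged 3.067825816s)")
print(f"SLO ~= {slo:.6f}s (logged 2.150935616)")
```

Both derived values agree with the logged 3.067825816s and 2.150935616 to within the truncation.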
parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0d2bf842-725c-468d-abc9-ea477e7ed9e8/alertmanager/0.log" Apr 28 20:08:46.819307 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:46.819276 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0d2bf842-725c-468d-abc9-ea477e7ed9e8/config-reloader/0.log" Apr 28 20:08:46.866801 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:46.866771 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0d2bf842-725c-468d-abc9-ea477e7ed9e8/kube-rbac-proxy-web/0.log" Apr 28 20:08:46.917779 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:46.917747 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0d2bf842-725c-468d-abc9-ea477e7ed9e8/kube-rbac-proxy/0.log" Apr 28 20:08:46.967338 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:46.967284 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0d2bf842-725c-468d-abc9-ea477e7ed9e8/kube-rbac-proxy-metric/0.log" Apr 28 20:08:47.017949 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:47.017916 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0d2bf842-725c-468d-abc9-ea477e7ed9e8/prom-label-proxy/0.log" Apr 28 20:08:47.066882 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:47.066784 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0d2bf842-725c-468d-abc9-ea477e7ed9e8/init-config-reloader/0.log" Apr 28 20:08:47.174453 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:47.174425 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-x7nz2_58c6830d-7e4f-4ef7-98a1-4f0f5d45500d/kube-state-metrics/0.log" Apr 28 20:08:47.217484 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:47.217453 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-x7nz2_58c6830d-7e4f-4ef7-98a1-4f0f5d45500d/kube-rbac-proxy-main/0.log" Apr 28 20:08:47.269679 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:47.269644 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-x7nz2_58c6830d-7e4f-4ef7-98a1-4f0f5d45500d/kube-rbac-proxy-self/0.log" Apr 28 20:08:47.323584 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:47.323500 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-64d997c947-qw75t_0b855502-4514-43b3-83c4-664b35b6bddc/metrics-server/0.log" Apr 28 20:08:47.372846 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:47.372817 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-p9df6_cc240119-b9e8-4033-8e7d-35cf4ddb7f18/monitoring-plugin/0.log" Apr 28 20:08:47.543307 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:47.543271 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9zbws_4a04ffd3-70ad-4ce7-ab98-6e69013d5119/node-exporter/0.log" Apr 28 20:08:47.589096 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:47.589017 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9zbws_4a04ffd3-70ad-4ce7-ab98-6e69013d5119/kube-rbac-proxy/0.log" Apr 28 20:08:47.636795 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:47.636764 2570 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-monitoring_node-exporter-9zbws_4a04ffd3-70ad-4ce7-ab98-6e69013d5119/init-textfile/0.log" Apr 28 20:08:48.202847 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:48.202814 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-655b965ffc-gvwdf_6771cfc5-48ff-4141-9251-f539854216fc/telemeter-client/0.log" Apr 28 20:08:48.238891 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:48.238830 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-655b965ffc-gvwdf_6771cfc5-48ff-4141-9251-f539854216fc/reload/0.log" Apr 28 20:08:48.268985 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:48.268961 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-655b965ffc-gvwdf_6771cfc5-48ff-4141-9251-f539854216fc/kube-rbac-proxy/0.log" Apr 28 20:08:49.465140 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:49.465115 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-x865s_7e4b5b64-ebf0-4ee2-a43b-35098459ff73/networking-console-plugin/0.log" Apr 28 20:08:50.342740 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.342705 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p"] Apr 28 20:08:50.347680 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.347653 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.355297 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.355269 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p"] Apr 28 20:08:50.441147 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.441113 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kdqj\" (UniqueName: \"kubernetes.io/projected/9735168b-1dbe-4fcf-9227-16f2339844a1-kube-api-access-8kdqj\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.441317 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.441161 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9735168b-1dbe-4fcf-9227-16f2339844a1-lib-modules\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.441317 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.441222 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9735168b-1dbe-4fcf-9227-16f2339844a1-sys\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.441317 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.441279 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9735168b-1dbe-4fcf-9227-16f2339844a1-proc\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " 
pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.441317 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.441308 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9735168b-1dbe-4fcf-9227-16f2339844a1-podres\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.542180 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.542142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9735168b-1dbe-4fcf-9227-16f2339844a1-proc\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.542728 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.542359 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9735168b-1dbe-4fcf-9227-16f2339844a1-proc\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.542829 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.542745 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9735168b-1dbe-4fcf-9227-16f2339844a1-podres\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.542829 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.542817 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kdqj\" (UniqueName: \"kubernetes.io/projected/9735168b-1dbe-4fcf-9227-16f2339844a1-kube-api-access-8kdqj\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.542926 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.542845 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9735168b-1dbe-4fcf-9227-16f2339844a1-lib-modules\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.542926 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.542866 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9735168b-1dbe-4fcf-9227-16f2339844a1-sys\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.542926 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.542891 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9735168b-1dbe-4fcf-9227-16f2339844a1-podres\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.543073 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.542973 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9735168b-1dbe-4fcf-9227-16f2339844a1-sys\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.543073 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.542995 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9735168b-1dbe-4fcf-9227-16f2339844a1-lib-modules\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.551591 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.551554 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kdqj\" (UniqueName: \"kubernetes.io/projected/9735168b-1dbe-4fcf-9227-16f2339844a1-kube-api-access-8kdqj\") pod \"perf-node-gather-daemonset-zc55p\" (UID: \"9735168b-1dbe-4fcf-9227-16f2339844a1\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.661676 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.661572 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:50.713412 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.713387 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-cqswc_c3b8f63d-3fda-4251-a306-6730ed6ac6d6/volume-data-source-validator/0.log" Apr 28 20:08:50.805892 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:50.805856 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p"] Apr 28 20:08:50.809714 ip-10-0-138-34 kubenswrapper[2570]: W0428 20:08:50.809687 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9735168b_1dbe_4fcf_9227_16f2339844a1.slice/crio-c3c5e11b17eb59aa48ee923f1d78d92df377bb876b88339fc0beca6633075509 WatchSource:0}: Error finding container c3c5e11b17eb59aa48ee923f1d78d92df377bb876b88339fc0beca6633075509: Status 404 returned error can't find the container with id c3c5e11b17eb59aa48ee923f1d78d92df377bb876b88339fc0beca6633075509 Apr 28 20:08:51.086683 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:51.086641 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" event={"ID":"9735168b-1dbe-4fcf-9227-16f2339844a1","Type":"ContainerStarted","Data":"485836a66fd4fddf39b9c3349421e7021af7abd419b5679a502c5c19c786201a"} Apr 28 20:08:51.086865 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:51.086692 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" event={"ID":"9735168b-1dbe-4fcf-9227-16f2339844a1","Type":"ContainerStarted","Data":"c3c5e11b17eb59aa48ee923f1d78d92df377bb876b88339fc0beca6633075509"} Apr 28 20:08:51.086865 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:51.086788 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" Apr 28 20:08:51.110719 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:51.110657 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" 
Apr 28 20:08:51.110719 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:51.110657 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p" podStartSLOduration=1.110635041 podStartE2EDuration="1.110635041s" podCreationTimestamp="2026-04-28 20:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:08:51.109762672 +0000 UTC m=+3157.861312188" watchObservedRunningTime="2026-04-28 20:08:51.110635041 +0000 UTC m=+3157.862184556"
Apr 28 20:08:51.442680 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:51.442586 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-98b4d_801789ef-975e-451d-9e18-0cb9acd739d6/dns/0.log"
Apr 28 20:08:51.490666 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:51.490637 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-98b4d_801789ef-975e-451d-9e18-0cb9acd739d6/kube-rbac-proxy/0.log"
Apr 28 20:08:51.675526 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:51.675494 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pmpq5_68ecdbce-5bdd-408b-bef6-91e797899886/dns-node-resolver/0.log"
Apr 28 20:08:52.194063 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:52.194037 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nt8t9_becc1ca0-0b4e-43d1-95d2-8979c6dd35fd/node-ca/0.log"
Apr 28 20:08:52.994340 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:52.994309 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-d599bf786-52gkb_78cae5ab-43f1-4ebe-bc59-bac6aeca7b9a/router/0.log"
Apr 28 20:08:53.364743 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:53.364713 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-l8dqg_0d073d08-7217-4136-8485-03d574acfc52/serve-healthcheck-canary/0.log"
Apr 28 20:08:53.774447 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:53.774413 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6t8k2_e6a46589-6d43-4418-acea-0a5d676870f3/kube-rbac-proxy/0.log"
Apr 28 20:08:53.799783 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:53.799758 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6t8k2_e6a46589-6d43-4418-acea-0a5d676870f3/exporter/0.log"
Apr 28 20:08:53.826533 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:53.826508 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6t8k2_e6a46589-6d43-4418-acea-0a5d676870f3/extractor/0.log"
Apr 28 20:08:55.966806 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:55.966772 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-b85c69797-hj7b5_3f426977-d622-4e4f-af1c-2db016314ae7/manager/0.log"
Apr 28 20:08:55.990214 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:55.990184 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-vrckr_040ed491-9a87-4fdb-b568-1cc93ef448d9/manager/0.log"
Apr 28 20:08:56.506466 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:56.506431 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-f792c_3bd0ebf6-fcf4-4f72-825f-be5e630230a3/s3-init/0.log"
Apr 28 20:08:57.101164 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:08:57.101135 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-zc55p"
Apr 28 20:09:02.090773 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:02.090744 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g9c9n_1d508319-bdc2-4029-88e0-8b8406c4ac0b/kube-multus-additional-cni-plugins/0.log"
Apr 28 20:09:02.122882 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:02.122854 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g9c9n_1d508319-bdc2-4029-88e0-8b8406c4ac0b/egress-router-binary-copy/0.log"
Apr 28 20:09:02.148624 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:02.148582 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g9c9n_1d508319-bdc2-4029-88e0-8b8406c4ac0b/cni-plugins/0.log"
Apr 28 20:09:02.172869 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:02.172843 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g9c9n_1d508319-bdc2-4029-88e0-8b8406c4ac0b/bond-cni-plugin/0.log"
Apr 28 20:09:02.197974 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:02.197934 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g9c9n_1d508319-bdc2-4029-88e0-8b8406c4ac0b/routeoverride-cni/0.log"
Apr 28 20:09:02.222522 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:02.222488 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g9c9n_1d508319-bdc2-4029-88e0-8b8406c4ac0b/whereabouts-cni-bincopy/0.log"
Apr 28 20:09:02.246467 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:02.246436 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g9c9n_1d508319-bdc2-4029-88e0-8b8406c4ac0b/whereabouts-cni/0.log"
Apr 28 20:09:02.681009 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:02.680974 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pppcc_106d173d-f66c-4772-a2c9-07f5c1dc8219/kube-multus/0.log"
Apr 28 20:09:02.864402 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:02.864371 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qgcjb_0b961ce3-ed85-40f4-840c-df0e74d830dd/network-metrics-daemon/0.log"
Apr 28 20:09:02.889856 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:02.889828 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qgcjb_0b961ce3-ed85-40f4-840c-df0e74d830dd/kube-rbac-proxy/0.log"
Apr 28 20:09:04.046932 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:04.046894 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-controller/0.log"
Apr 28 20:09:04.070482 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:04.070448 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/0.log"
Apr 28 20:09:04.084646 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:04.084598 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovn-acl-logging/1.log"
Apr 28 20:09:04.109779 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:04.109734 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/kube-rbac-proxy-node/0.log"
Apr 28 20:09:04.136554 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:04.136526 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/kube-rbac-proxy-ovn-metrics/0.log"
Apr 28 20:09:04.162360 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:04.162327 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/northd/0.log"
Apr 28 20:09:04.187079 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:04.187046 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/nbdb/0.log"
Apr 28 20:09:04.212679 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:04.212651 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/sbdb/0.log"
Apr 28 20:09:04.333666 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:04.333573 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xms2_21172191-de03-4932-85fe-40437ea0c56a/ovnkube-controller/0.log"
Apr 28 20:09:05.650968 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:05.650934 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-8fxnw_25df0748-bd29-48db-925e-d566aa27fa14/check-endpoints/0.log"
Apr 28 20:09:05.702932 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:05.702903 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-nc646_633bd943-3978-4baf-be3b-c82a70d85512/network-check-target-container/0.log"
Apr 28 20:09:06.665588 ip-10-0-138-34 kubenswrapper[2570]: I0428 20:09:06.665559 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-c8f8g_e61cde97-43d8-438a-abf3-9be15c9fb8d0/iptables-alerter/0.log"