Apr 16 14:49:53.449672 ip-10-0-130-140 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 14:49:53.449687 ip-10-0-130-140 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 14:49:53.449696 ip-10-0-130-140 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 14:49:53.450001 ip-10-0-130-140 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 14:50:03.627212 ip-10-0-130-140 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 14:50:03.627229 ip-10-0-130-140 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot d727c1ff82814860ba7d77e28fc72a9e --
Apr 16 14:52:28.937076 ip-10-0-130-140 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:52:29.486786 ip-10-0-130-140 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:29.486786 ip-10-0-130-140 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:52:29.486786 ip-10-0-130-140 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:29.486786 ip-10-0-130-140 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:52:29.486786 ip-10-0-130-140 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:29.488554 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.488462 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:52:29.490907 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490892 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:29.490907 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490906 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490910 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490913 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490916 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490920 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490923 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490926 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490928 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490931 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490934 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490937 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490939 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490942 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490944 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490947 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490950 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490952 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490955 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490957 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490967 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:29.490970 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490970 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490973 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490976 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490979 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490982 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490985 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490988 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490990 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490993 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490996 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.490999 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491001 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491004 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491006 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491009 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491011 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491014 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491016 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491019 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491021 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:29.491472 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491023 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491026 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491029 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491031 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491035 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491037 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491040 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491042 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491045 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491047 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491050 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491052 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491055 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491058 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491062 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491066 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491069 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491072 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491075 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491078 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:29.491982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491081 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491084 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491086 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491089 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491091 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491094 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491097 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491100 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491103 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491105 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491108 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491110 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491113 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491115 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491119 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491121 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491124 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491128 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491133 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:29.492469 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491136 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491139 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491142 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491145 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491147 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491150 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491573 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491579 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491582 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491585 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491588 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491590 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491593 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491596 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491599 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491601 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491604 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491608 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491611 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:29.492954 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491614 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491617 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491620 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491623 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491626 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491629 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491631 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491634 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491637 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491639 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491642 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491645 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491648 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491650 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491653 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491655 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491658 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491662 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491665 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:29.493400 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491669 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491672 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491675 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491677 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491680 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491682 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491685 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491687 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491690 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491692 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491695 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491697 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491700 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491702 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491705 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491708 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491710 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491712 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491715 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491718 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:29.493915 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491721 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491724 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491727 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491729 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491732 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491735 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491737 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491740 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491743 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491745 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491747 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491750 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491753 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491756 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491759 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491761 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491779 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491783 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491787 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491791 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:29.494405 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491794 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491798 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491800 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491803 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491805 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491809 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491812 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491814 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491817 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491819 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491822 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491824 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491827 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.491830 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491896 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491903 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491911 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491917 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491922 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491926 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491930 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:52:29.494905 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491935 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491939 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491942 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491945 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491949 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491952 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491955 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491958 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491961 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491964 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491967 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491970 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491975 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491977 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491981 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491984 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491987 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491992 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491995 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.491999 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492002 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492005 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492008 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492011 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492015 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:52:29.495413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492018 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492022 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492026 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492030 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492033 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492037 2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492040 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492045 2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492048 2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492051 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492054 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492057 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492061 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492064 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492067 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492070 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492073 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492076 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492079 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492082 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416
14:52:29.492085 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492088 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492090 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492095 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492098 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 14:52:29.496021 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492105 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492108 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492111 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492115 2576 flags.go:64] FLAG: --help="false" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492118 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-130-140.ec2.internal" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492121 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492124 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492127 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492131 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 14:52:29.496650 
ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492135 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492142 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492146 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492149 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492152 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492155 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492158 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492161 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492164 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492167 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492170 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492173 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492176 2576 flags.go:64] FLAG: --lock-file="" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492179 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492182 2576 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 16 14:52:29.496650 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492185 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492190 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492193 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492196 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492198 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492201 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492205 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492208 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492212 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492216 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492219 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492224 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492227 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492230 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:52:29.497257 ip-10-0-130-140 
kubenswrapper[2576]: I0416 14:52:29.492232 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492235 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492238 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492242 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492244 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492253 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492256 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492259 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492262 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:52:29.497257 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492265 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492271 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492275 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492278 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492281 2576 flags.go:64] FLAG: --port="10250" Apr 16 
14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492284 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492287 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a1afc2a469bc29a2" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492290 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492293 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492296 2576 flags.go:64] FLAG: --register-node="true" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492299 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492302 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492306 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492309 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492312 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492315 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492319 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492322 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492326 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492330 2576 flags.go:64] FLAG: 
--rotate-server-certificates="false" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492332 2576 flags.go:64] FLAG: --runonce="false" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492335 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492338 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492341 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492344 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492347 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:52:29.497825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492352 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492355 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492358 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492362 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492365 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492368 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492371 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492374 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 
14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492377 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492380 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492386 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492389 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492392 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492396 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492398 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492401 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492404 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492407 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492410 2576 flags.go:64] FLAG: --v="2" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492414 2576 flags.go:64] FLAG: --version="false" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492418 2576 flags.go:64] FLAG: --vmodule="" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492422 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.492426 2576 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492515 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:29.498456 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492520 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492523 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492526 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492529 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492532 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492535 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492537 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492540 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492544 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492546 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492549 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492552 2576 
feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492555 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492558 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492560 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492563 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492566 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492568 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492571 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492574 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:29.499039 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492577 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492580 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492584 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492587 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492590 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492593 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492596 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492599 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492601 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492604 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492607 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492610 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492615 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492618 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492621 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492624 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492626 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492629 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492632 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:29.499578 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492634 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492641 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492644 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492647 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492649 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492652 
2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492655 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492658 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492660 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492663 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492666 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492668 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492671 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492694 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492698 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492701 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492704 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492707 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:29.500049 ip-10-0-130-140 
kubenswrapper[2576]: W0416 14:52:29.492710 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492713 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:29.500049 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492716 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492718 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492721 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492723 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492726 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492731 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492734 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492736 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492739 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492742 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492744 2576 feature_gate.go:328] unrecognized feature gate: 
GatewayAPIController Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492747 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492749 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492753 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492756 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492759 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492761 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492778 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492780 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:29.500544 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492784 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:29.501048 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492786 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:29.501048 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492789 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:29.501048 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492792 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:29.501048 ip-10-0-130-140 
kubenswrapper[2576]: W0416 14:52:29.492794 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:29.501048 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492797 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:29.501048 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.492800 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:29.501048 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.493528 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:29.502693 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.502675 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 14:52:29.502731 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.502695 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 14:52:29.502761 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502743 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:29.502761 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502748 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:29.502761 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502751 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:29.502761 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502755 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 
14:52:29.502761 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502758 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:29.502761 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502761 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:29.502761 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502775 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502779 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502781 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502784 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502787 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502790 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502793 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502795 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502798 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502801 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502804 2576 
feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502807 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502809 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502812 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502815 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502817 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502820 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502823 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502826 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502828 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:29.502956 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502831 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502834 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502836 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 
16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502839 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502842 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502846 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502848 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502851 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502854 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502857 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502859 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502862 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502864 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502867 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502870 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502873 2576 feature_gate.go:328] unrecognized 
feature gate: GCPCustomAPIEndpoints Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502876 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502878 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502881 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502883 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:29.503437 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502886 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502889 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502891 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502894 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502897 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502899 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502902 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502904 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:29.503937 
ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502907 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502910 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502912 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502915 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502917 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502920 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502922 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502925 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502928 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502931 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502934 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502936 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:29.503937 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502939 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 
14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502941 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502944 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502947 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502949 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502952 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502955 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502958 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502961 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502964 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502967 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502969 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502972 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502975 2576 feature_gate.go:328] unrecognized feature gate: 
NewOLMOwnSingleNamespace Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502979 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502983 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502986 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502991 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.502998 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:29.504446 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503001 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.503007 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503109 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503114 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 
14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503117 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503120 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503122 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503125 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503129 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503131 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503134 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503137 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503140 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503143 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503145 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:29.504931 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503148 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503150 2576 feature_gate.go:328] 
unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503153 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503156 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503159 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503163 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503166 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503170 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503172 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503175 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503178 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503181 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503183 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503186 2576 feature_gate.go:328] unrecognized feature gate: 
NewOLMOwnSingleNamespace Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503188 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503191 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503194 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503197 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503199 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:29.505302 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503202 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503204 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503207 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503210 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503212 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503215 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503218 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503220 2576 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503223 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503226 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503228 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503231 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503234 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503236 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503239 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503242 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503244 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503247 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503250 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503254 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 16 14:52:29.505791 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503258 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503261 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503264 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503267 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503270 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503273 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503276 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503278 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503282 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503285 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503287 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503290 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 
14:52:29.503292 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503295 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503297 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503300 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503303 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503305 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503308 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503310 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:29.506312 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503313 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503315 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503318 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503321 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503324 2576 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503327 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503329 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503332 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503335 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503337 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503341 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503343 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503346 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:29.503349 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.503353 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]} Apr 16 14:52:29.506816 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.504697 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 14:52:29.507912 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.507898 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 14:52:29.508974 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.508963 2576 server.go:1019] "Starting client certificate rotation" Apr 16 14:52:29.509075 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.509057 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 14:52:29.509111 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.509101 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 14:52:29.539452 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.539431 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 14:52:29.545701 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.545676 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 14:52:29.565233 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.565212 2576 log.go:25] "Validated CRI v1 runtime API" Apr 16 14:52:29.567164 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.567142 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:29.571648 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.571633 2576 log.go:25] "Validated CRI v1 image API" Apr 16 14:52:29.573029 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.573015 2576 server.go:1452] "Using cgroup driver 
setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 14:52:29.577599 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.577572 2576 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c6088661-dc52-4c91-b6a4-dba622739c10:/dev/nvme0n1p4 c80e8f8b-4c75-47d5-ae09-be25094cde24:/dev/nvme0n1p3] Apr 16 14:52:29.577651 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.577599 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 14:52:29.583361 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.583254 2576 manager.go:217] Machine: {Timestamp:2026-04-16 14:52:29.581656487 +0000 UTC m=+0.500017684 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3085435 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fb6f8a434ca8d0dcfdf1454604570 SystemUUID:ec2fb6f8-a434-ca8d-0dcf-df1454604570 BootID:d727c1ff-8281-4860-ba7d-77e28fc72a9e Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs 
Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:85:5c:b9:b2:fd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:85:5c:b9:b2:fd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:d8:9f:eb:f7:38 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 14:52:29.583361 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.583356 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 14:52:29.583485 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.583435 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 14:52:29.585098 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.585061 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 14:52:29.585233 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.585100 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-140.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 14:52:29.585275 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.585244 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 14:52:29.585275 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.585252 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 14:52:29.585275 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.585265 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:29.585354 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.585279 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:29.586488 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.586478 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:29.586593 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.586585 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 14:52:29.590008 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.589999 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 14:52:29.590047 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.590012 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 14:52:29.590867 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.590857 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 14:52:29.590910 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.590870 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 14:52:29.590910 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.590880 2576 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 16 14:52:29.591892 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.591878 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:29.591930 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.591911 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:29.598871 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.598851 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 14:52:29.600980 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.600963 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 14:52:29.602655 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.602639 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 14:52:29.602747 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.602659 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 14:52:29.602747 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.602668 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 14:52:29.602747 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.602675 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 14:52:29.602747 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.602683 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 14:52:29.602747 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.602692 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 14:52:29.602747 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.602700 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 14:52:29.602747 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.602708 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 14:52:29.602747 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.602718 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 14:52:29.602747 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.602728 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 14:52:29.602747 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.602740 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 14:52:29.603065 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.602832 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 14:52:29.603823 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.603813 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 14:52:29.603872 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.603826 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 14:52:29.607274 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.607254 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-140.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 14:52:29.607361 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.607275 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 14:52:29.607361 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.607328 2576 reflector.go:200] "Failed 
to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-140.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 14:52:29.607361 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.607337 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 14:52:29.607498 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.607376 2576 server.go:1295] "Started kubelet" Apr 16 14:52:29.607543 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.607465 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 14:52:29.607587 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.607512 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 14:52:29.607587 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.607569 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 14:52:29.608280 ip-10-0-130-140 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 14:52:29.608646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.608629 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 14:52:29.608726 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.608693 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 14:52:29.615464 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.614137 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-140.ec2.internal.18a6ddf584b3e9fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-140.ec2.internal,UID:ip-10-0-130-140.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-140.ec2.internal,},FirstTimestamp:2026-04-16 14:52:29.607348732 +0000 UTC m=+0.525709929,LastTimestamp:2026-04-16 14:52:29.607348732 +0000 UTC m=+0.525709929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-140.ec2.internal,}" Apr 16 14:52:29.617721 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.617700 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:29.618537 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.618523 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 14:52:29.619147 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619130 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 14:52:29.619147 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619148 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 14:52:29.619250 
ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619158 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 14:52:29.619250 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619173 2576 factory.go:55] Registering systemd factory Apr 16 14:52:29.619250 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619184 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 14:52:29.619350 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619263 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 14:52:29.619350 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.619272 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 14:52:29.619350 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619322 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 14:52:29.619350 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619329 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 14:52:29.619513 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619448 2576 factory.go:153] Registering CRI-O factory Apr 16 14:52:29.619513 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619463 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 14:52:29.619513 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619496 2576 factory.go:103] Registering Raw factory Apr 16 14:52:29.619513 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619511 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 14:52:29.619689 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.619628 2576 kubelet_node_status.go:515] "Error getting the 
current node from lister" err="node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:29.621474 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.619910 2576 manager.go:319] Starting recovery of all containers Apr 16 14:52:29.621877 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.621713 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 14:52:29.621877 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.621737 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-140.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 14:52:29.632466 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.632449 2576 manager.go:324] Recovery completed Apr 16 14:52:29.636449 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.636437 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:29.639835 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.639813 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:29.639905 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.639847 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:29.639905 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.639858 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:29.640355 
ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.640342 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 14:52:29.640421 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.640355 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 14:52:29.640421 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.640374 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:29.641249 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.641187 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-140.ec2.internal.18a6ddf586a39a97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-140.ec2.internal,UID:ip-10-0-130-140.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-140.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-140.ec2.internal,},FirstTimestamp:2026-04-16 14:52:29.639834263 +0000 UTC m=+0.558195459,LastTimestamp:2026-04-16 14:52:29.639834263 +0000 UTC m=+0.558195459,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-140.ec2.internal,}" Apr 16 14:52:29.643548 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.643533 2576 policy_none.go:49] "None policy: Start" Apr 16 14:52:29.643617 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.643553 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 14:52:29.643617 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.643567 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 14:52:29.644706 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.644684 2576 csr.go:274] 
"Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ffszq" Apr 16 14:52:29.651487 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.651470 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ffszq" Apr 16 14:52:29.651555 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.651476 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-140.ec2.internal.18a6ddf586a3def8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-140.ec2.internal,UID:ip-10-0-130-140.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-130-140.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-130-140.ec2.internal,},FirstTimestamp:2026-04-16 14:52:29.639851768 +0000 UTC m=+0.558212964,LastTimestamp:2026-04-16 14:52:29.639851768 +0000 UTC m=+0.558212964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-140.ec2.internal,}" Apr 16 14:52:29.687322 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.687307 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 14:52:29.706374 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.687348 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 14:52:29.706374 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.687361 2576 server.go:85] "Starting device plugin registration server" Apr 16 14:52:29.706374 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.687581 2576 
eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 14:52:29.706374 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.687590 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 14:52:29.706374 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.687689 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 14:52:29.706374 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.687816 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 14:52:29.706374 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.687829 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 14:52:29.706374 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.688289 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 14:52:29.706374 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.688334 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:29.787929 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.787870 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:29.789460 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.789445 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:29.789546 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.789479 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:29.789546 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.789494 2576 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-130-140.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:29.789643 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.789550 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-140.ec2.internal" Apr 16 14:52:29.797808 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.797792 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-140.ec2.internal" Apr 16 14:52:29.797874 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.797835 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-140.ec2.internal\": node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:29.799614 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.799589 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 14:52:29.800900 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.800884 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 14:52:29.800990 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.800916 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 14:52:29.800990 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.800937 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 14:52:29.800990 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.800947 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 14:52:29.800990 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.800980 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 14:52:29.803028 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.803011 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:29.810031 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.810015 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:29.901846 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.901811 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-140.ec2.internal"] Apr 16 14:52:29.901945 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.901896 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:29.904139 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.904122 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:29.904220 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.904152 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:29.904220 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.904165 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:29.906607 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.906594 2576 kubelet_node_status.go:413] "Setting node 
annotation to enable volume controller attach/detach" Apr 16 14:52:29.906744 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.906732 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" Apr 16 14:52:29.906803 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.906758 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:29.907330 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.907306 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:29.907393 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.907335 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:29.907393 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.907312 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:29.907393 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.907370 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:29.907393 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.907382 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:29.907393 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.907347 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:29.910072 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.910054 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-140.ec2.internal\" not found" 
Apr 16 14:52:29.910144 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.910131 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-140.ec2.internal" Apr 16 14:52:29.910189 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.910153 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:29.911470 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.911456 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:29.911523 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.911481 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:29.911523 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.911491 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:29.920879 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.920853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/13d086a02172e63757658cc95fdeac2d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal\" (UID: \"13d086a02172e63757658cc95fdeac2d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" Apr 16 14:52:29.920975 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.920891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/13d086a02172e63757658cc95fdeac2d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal\" (UID: \"13d086a02172e63757658cc95fdeac2d\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" Apr 16 14:52:29.920975 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:29.920917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8e5adc7c4d5729c56cf31736cd00ffbf-config\") pod \"kube-apiserver-proxy-ip-10-0-130-140.ec2.internal\" (UID: \"8e5adc7c4d5729c56cf31736cd00ffbf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-140.ec2.internal" Apr 16 14:52:29.935353 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.935331 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-140.ec2.internal\" not found" node="ip-10-0-130-140.ec2.internal" Apr 16 14:52:29.939649 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:29.939634 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-140.ec2.internal\" not found" node="ip-10-0-130-140.ec2.internal" Apr 16 14:52:30.010848 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:30.010803 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:30.021165 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.021140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/13d086a02172e63757658cc95fdeac2d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal\" (UID: \"13d086a02172e63757658cc95fdeac2d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" Apr 16 14:52:30.021256 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.021173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/13d086a02172e63757658cc95fdeac2d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal\" (UID: \"13d086a02172e63757658cc95fdeac2d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" Apr 16 14:52:30.021256 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.021200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8e5adc7c4d5729c56cf31736cd00ffbf-config\") pod \"kube-apiserver-proxy-ip-10-0-130-140.ec2.internal\" (UID: \"8e5adc7c4d5729c56cf31736cd00ffbf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-140.ec2.internal" Apr 16 14:52:30.021256 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.021244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8e5adc7c4d5729c56cf31736cd00ffbf-config\") pod \"kube-apiserver-proxy-ip-10-0-130-140.ec2.internal\" (UID: \"8e5adc7c4d5729c56cf31736cd00ffbf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-140.ec2.internal" Apr 16 14:52:30.021256 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.021244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/13d086a02172e63757658cc95fdeac2d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal\" (UID: \"13d086a02172e63757658cc95fdeac2d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" Apr 16 14:52:30.021410 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.021252 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/13d086a02172e63757658cc95fdeac2d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal\" (UID: \"13d086a02172e63757658cc95fdeac2d\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" Apr 16 14:52:30.111577 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:30.111516 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:30.211995 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:30.211962 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:30.239504 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.239478 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" Apr 16 14:52:30.242032 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.241996 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-140.ec2.internal" Apr 16 14:52:30.312713 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:30.312677 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:30.413263 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:30.413175 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:30.508674 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.508646 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 14:52:30.509234 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.508802 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 
14:52:30.513831 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:30.513808 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:30.614890 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:30.614866 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:30.618623 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.618600 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:30.628813 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.628787 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:30.646852 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.646834 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-c6jjl" Apr 16 14:52:30.651814 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.651796 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-c6jjl" Apr 16 14:52:30.653565 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.653544 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:47:29 +0000 UTC" deadline="2027-11-17 09:15:56.816119322 +0000 UTC" Apr 16 14:52:30.653625 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.653565 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13914h23m26.162556686s" Apr 16 14:52:30.715884 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:30.715816 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:30.792547 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:30.792517 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e5adc7c4d5729c56cf31736cd00ffbf.slice/crio-9644cabdbc203967f4b2b5d665e02f992a93683860cdaab7493e1603c44eb7ba WatchSource:0}: Error finding container 9644cabdbc203967f4b2b5d665e02f992a93683860cdaab7493e1603c44eb7ba: Status 404 returned error can't find the container with id 9644cabdbc203967f4b2b5d665e02f992a93683860cdaab7493e1603c44eb7ba Apr 16 14:52:30.792886 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:30.792870 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13d086a02172e63757658cc95fdeac2d.slice/crio-eaa989c78d7f99ffed2226cacbe3a0e72b3a229ab1c826f8b89d11f0dd14664e WatchSource:0}: Error finding container eaa989c78d7f99ffed2226cacbe3a0e72b3a229ab1c826f8b89d11f0dd14664e: Status 404 returned error can't find the container with id eaa989c78d7f99ffed2226cacbe3a0e72b3a229ab1c826f8b89d11f0dd14664e Apr 16 14:52:30.800339 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.800322 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:52:30.803526 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.803481 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-140.ec2.internal" event={"ID":"8e5adc7c4d5729c56cf31736cd00ffbf","Type":"ContainerStarted","Data":"9644cabdbc203967f4b2b5d665e02f992a93683860cdaab7493e1603c44eb7ba"} Apr 16 14:52:30.804432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.804414 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" 
event={"ID":"13d086a02172e63757658cc95fdeac2d","Type":"ContainerStarted","Data":"eaa989c78d7f99ffed2226cacbe3a0e72b3a229ab1c826f8b89d11f0dd14664e"} Apr 16 14:52:30.816239 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:30.816215 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:30.916361 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:30.916328 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-140.ec2.internal\" not found" Apr 16 14:52:30.918598 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.918579 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:30.957117 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:30.957094 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:31.019855 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.019790 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-140.ec2.internal" Apr 16 14:52:31.027664 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.027643 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:31.028565 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.028554 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" Apr 16 14:52:31.036473 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.036459 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:31.207696 ip-10-0-130-140 kubenswrapper[2576]: I0416 
14:52:31.207666 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:31.592046 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.591971 2576 apiserver.go:52] "Watching apiserver" Apr 16 14:52:31.598879 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.598857 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 14:52:31.599212 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.599190 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-lb2nq","openshift-network-diagnostics/network-check-target-c78tw","openshift-network-operator/iptables-alerter-c9tx6","openshift-ovn-kubernetes/ovnkube-node-r7dqw","kube-system/konnectivity-agent-5nclj","kube-system/kube-apiserver-proxy-ip-10-0-130-140.ec2.internal","openshift-cluster-node-tuning-operator/tuned-vj7zr","openshift-dns/node-resolver-s9crx","openshift-image-registry/node-ca-tsg7q","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal","openshift-multus/multus-q29wv","openshift-multus/network-metrics-daemon-7wx6z","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"] Apr 16 14:52:31.604850 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.604818 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.606726 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.606700 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.606726 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.606725 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.606919 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.606863 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pnhl7\"" Apr 16 14:52:31.607313 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.607291 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c9tx6" Apr 16 14:52:31.610476 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.608822 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.610476 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.609698 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.610476 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.609817 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 14:52:31.610476 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.609982 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fj4s8\"" Apr 16 14:52:31.611792 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.611571 2576 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:31.611792 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:31.611642 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70" Apr 16 14:52:31.611792 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.611740 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.613709 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.613687 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 14:52:31.613883 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.613858 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-5nclj" Apr 16 14:52:31.614219 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.614007 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.614361 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.614343 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.614526 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.614512 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 14:52:31.614939 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.614659 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 14:52:31.614939 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.614780 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rkf9w\"" Apr 16 14:52:31.614939 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.614857 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 14:52:31.615644 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.615502 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 14:52:31.615644 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.615618 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-26c42\"" Apr 16 14:52:31.615803 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.615701 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 
14:52:31.616162 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.616144 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.618150 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.618007 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.618150 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.618085 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6bv8c\"" Apr 16 14:52:31.618150 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.618133 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 14:52:31.618351 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.618172 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.618351 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.618185 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 14:52:31.618451 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.618432 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-s9crx" Apr 16 14:52:31.618528 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.618511 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 14:52:31.620284 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.619936 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.620284 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.620076 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-b5zzr\"" Apr 16 14:52:31.620440 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.620285 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.620908 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.620879 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tsg7q" Apr 16 14:52:31.622707 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.622689 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.622866 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.622820 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 14:52:31.623155 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.622995 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.623155 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.623059 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xftsh\"" Apr 16 14:52:31.625552 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.625414 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.627246 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.627080 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 14:52:31.627246 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.627097 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jb7vq\"" Apr 16 14:52:31.627896 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.627874 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:31.627989 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.627872 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" Apr 16 14:52:31.628485 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:31.628452 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0" Apr 16 14:52:31.629516 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629498 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.629725 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629706 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7df5be5d-8f4e-489e-95da-488d3220a4f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.629828 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629740 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-modprobe-d\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.629828 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629717 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2jzmn\"" Apr 16 14:52:31.629828 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629784 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e330e52-07ab-4173-a692-bcf1bedd06ff-ovnkube-config\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.629828 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629808 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3323004a-60ab-45da-8ce1-47a7a8622df4-agent-certs\") pod \"konnectivity-agent-5nclj\" (UID: \"3323004a-60ab-45da-8ce1-47a7a8622df4\") " pod="kube-system/konnectivity-agent-5nclj" Apr 16 14:52:31.630035 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629843 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-tuned\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.630035 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629868 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-run-openvswitch\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.630035 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e330e52-07ab-4173-a692-bcf1bedd06ff-env-overrides\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.630035 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629913 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 14:52:31.630035 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7df5be5d-8f4e-489e-95da-488d3220a4f7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.630035 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629944 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.630035 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.629944 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-kubernetes\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.630035 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-sys\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.630543 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630524 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e330e52-07ab-4173-a692-bcf1bedd06ff-ovn-node-metrics-cert\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.630614 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7df5be5d-8f4e-489e-95da-488d3220a4f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.630614 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630583 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx2mz\" (UniqueName: \"kubernetes.io/projected/7df5be5d-8f4e-489e-95da-488d3220a4f7-kube-api-access-dx2mz\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.630710 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-sysconfig\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.630710 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7df5be5d-8f4e-489e-95da-488d3220a4f7-os-release\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " 
pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.630710 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-run-systemd\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.630710 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-etc-openvswitch\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630710 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-cni-netd\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-sysctl-d\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-run-ovn-kubernetes\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630791 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-var-lib-openvswitch\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630818 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkhrz\" (UniqueName: \"kubernetes.io/projected/3cebcaa8-957d-4f1e-b4f8-90637dae2bc0-kube-api-access-kkhrz\") pod \"node-resolver-s9crx\" (UID: \"3cebcaa8-957d-4f1e-b4f8-90637dae2bc0\") " pod="openshift-dns/node-resolver-s9crx" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630832 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7df5be5d-8f4e-489e-95da-488d3220a4f7-system-cni-dir\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-lib-modules\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: 
I0416 14:52:31.630871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-host\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630885 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-systemd-units\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630897 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54g89\" (UniqueName: \"kubernetes.io/projected/8e330e52-07ab-4173-a692-bcf1bedd06ff-kube-api-access-54g89\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3323004a-60ab-45da-8ce1-47a7a8622df4-konnectivity-ca\") pod \"konnectivity-agent-5nclj\" (UID: \"3323004a-60ab-45da-8ce1-47a7a8622df4\") " pod="kube-system/konnectivity-agent-5nclj" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3cebcaa8-957d-4f1e-b4f8-90637dae2bc0-tmp-dir\") pod \"node-resolver-s9crx\" (UID: \"3cebcaa8-957d-4f1e-b4f8-90637dae2bc0\") " 
pod="openshift-dns/node-resolver-s9crx" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7df5be5d-8f4e-489e-95da-488d3220a4f7-cnibin\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d47b69f-123e-49bb-8517-d2e2716ccea1-tmp\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.630972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c119f984-3b17-49d4-8d0b-37669cbcbeb7-iptables-alerter-script\") pod \"iptables-alerter-c9tx6\" (UID: \"c119f984-3b17-49d4-8d0b-37669cbcbeb7\") " pod="openshift-network-operator/iptables-alerter-c9tx6" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.630995 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3cebcaa8-957d-4f1e-b4f8-90637dae2bc0-hosts-file\") pod \"node-resolver-s9crx\" (UID: \"3cebcaa8-957d-4f1e-b4f8-90637dae2bc0\") " pod="openshift-dns/node-resolver-s9crx" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzs4p\" (UniqueName: 
\"kubernetes.io/projected/e27e48da-a6dc-4e84-87f4-01916a11e065-kube-api-access-qzs4p\") pod \"node-ca-tsg7q\" (UID: \"e27e48da-a6dc-4e84-87f4-01916a11e065\") " pod="openshift-image-registry/node-ca-tsg7q" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631026 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-kubelet\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-node-log\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631052 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7df5be5d-8f4e-489e-95da-488d3220a4f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-var-lib-kubelet\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631078 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e27e48da-a6dc-4e84-87f4-01916a11e065-serviceca\") pod \"node-ca-tsg7q\" (UID: \"e27e48da-a6dc-4e84-87f4-01916a11e065\") " pod="openshift-image-registry/node-ca-tsg7q" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c119f984-3b17-49d4-8d0b-37669cbcbeb7-host-slash\") pod \"iptables-alerter-c9tx6\" (UID: \"c119f984-3b17-49d4-8d0b-37669cbcbeb7\") " pod="openshift-network-operator/iptables-alerter-c9tx6" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631103 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldlhb\" (UniqueName: \"kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb\") pod \"network-check-target-c78tw\" (UID: \"c89f7bd4-8433-4357-856e-4886a97cdf70\") " pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631115 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-sysctl-conf\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-run-ovn\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e330e52-07ab-4173-a692-bcf1bedd06ff-ovnkube-script-lib\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631167 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-log-socket\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631190 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-run\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631221 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-run-netns\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.631462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e27e48da-a6dc-4e84-87f4-01916a11e065-host\") pod 
\"node-ca-tsg7q\" (UID: \"e27e48da-a6dc-4e84-87f4-01916a11e065\") " pod="openshift-image-registry/node-ca-tsg7q" Apr 16 14:52:31.632107 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sj6l\" (UniqueName: \"kubernetes.io/projected/0d47b69f-123e-49bb-8517-d2e2716ccea1-kube-api-access-7sj6l\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.632107 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-slash\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.632107 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-cni-bin\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.632107 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631351 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.632107 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631376 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-systemd\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.632107 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.631400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5flhz\" (UniqueName: \"kubernetes.io/projected/c119f984-3b17-49d4-8d0b-37669cbcbeb7-kube-api-access-5flhz\") pod \"iptables-alerter-c9tx6\" (UID: \"c119f984-3b17-49d4-8d0b-37669cbcbeb7\") " pod="openshift-network-operator/iptables-alerter-c9tx6" Apr 16 14:52:31.653100 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.653069 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:30 +0000 UTC" deadline="2027-09-09 14:54:25.687677835 +0000 UTC" Apr 16 14:52:31.653204 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.653101 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12264h1m54.034581454s" Apr 16 14:52:31.720345 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.720321 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 14:52:31.732488 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732457 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-systemd\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.732606 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-5flhz\" (UniqueName: \"kubernetes.io/projected/c119f984-3b17-49d4-8d0b-37669cbcbeb7-kube-api-access-5flhz\") pod \"iptables-alerter-c9tx6\" (UID: \"c119f984-3b17-49d4-8d0b-37669cbcbeb7\") " pod="openshift-network-operator/iptables-alerter-c9tx6" Apr 16 14:52:31.732606 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7df5be5d-8f4e-489e-95da-488d3220a4f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.732606 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-modprobe-d\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.732606 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e330e52-07ab-4173-a692-bcf1bedd06ff-ovnkube-config\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.732606 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-systemd\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.732606 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732599 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3323004a-60ab-45da-8ce1-47a7a8622df4-agent-certs\") pod \"konnectivity-agent-5nclj\" (UID: \"3323004a-60ab-45da-8ce1-47a7a8622df4\") " pod="kube-system/konnectivity-agent-5nclj" Apr 16 14:52:31.732935 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-tuned\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.732935 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-run-openvswitch\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.732935 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e330e52-07ab-4173-a692-bcf1bedd06ff-env-overrides\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.732935 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-modprobe-d\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.732935 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732712 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7df5be5d-8f4e-489e-95da-488d3220a4f7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.732935 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-kubernetes\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.732935 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-sys\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.732935 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e330e52-07ab-4173-a692-bcf1bedd06ff-ovn-node-metrics-cert\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.732935 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-multus-socket-dir-parent\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 
16 14:52:31.732935 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7df5be5d-8f4e-489e-95da-488d3220a4f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.732935 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dx2mz\" (UniqueName: \"kubernetes.io/projected/7df5be5d-8f4e-489e-95da-488d3220a4f7-kube-api-access-dx2mz\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732945 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-sysconfig\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-system-cni-dir\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-run-k8s-cni-cncf-io\") 
pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.732993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7df5be5d-8f4e-489e-95da-488d3220a4f7-os-release\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-run-systemd\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-kubernetes\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-etc-openvswitch\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-cni-netd\") 
pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-sys\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733091 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-sys-fs\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733137 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-multus-conf-dir\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-etc-kubernetes\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-sysctl-d\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-run-ovn-kubernetes\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7df5be5d-8f4e-489e-95da-488d3220a4f7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733254 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" Apr 16 14:52:31.733432 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-socket-dir\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" Apr 16 14:52:31.734227 
ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-registration-dir\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-run-openvswitch\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733352 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-cnibin\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-var-lib-openvswitch\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733430 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-sysconfig\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733524 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-etc-openvswitch\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e330e52-07ab-4173-a692-bcf1bedd06ff-ovnkube-config\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkhrz\" (UniqueName: \"kubernetes.io/projected/3cebcaa8-957d-4f1e-b4f8-90637dae2bc0-kube-api-access-kkhrz\") pod \"node-resolver-s9crx\" (UID: \"3cebcaa8-957d-4f1e-b4f8-90637dae2bc0\") " pod="openshift-dns/node-resolver-s9crx" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-cni-binary-copy\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7df5be5d-8f4e-489e-95da-488d3220a4f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-sysctl-d\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733845 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-run-ovn-kubernetes\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e330e52-07ab-4173-a692-bcf1bedd06ff-env-overrides\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-cni-netd\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-var-lib-cni-bin\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-var-lib-kubelet\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.734227 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.733998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-var-lib-openvswitch\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734013 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8hgn\" (UniqueName: \"kubernetes.io/projected/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-kube-api-access-j8hgn\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-run-systemd\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7df5be5d-8f4e-489e-95da-488d3220a4f7-os-release\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7df5be5d-8f4e-489e-95da-488d3220a4f7-system-cni-dir\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7df5be5d-8f4e-489e-95da-488d3220a4f7-system-cni-dir\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-lib-modules\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734134 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-host\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-host\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-lib-modules\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-systemd-units\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7df5be5d-8f4e-489e-95da-488d3220a4f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734270 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-systemd-units\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54g89\" (UniqueName: \"kubernetes.io/projected/8e330e52-07ab-4173-a692-bcf1bedd06ff-kube-api-access-54g89\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734315 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3323004a-60ab-45da-8ce1-47a7a8622df4-konnectivity-ca\") pod \"konnectivity-agent-5nclj\" (UID: \"3323004a-60ab-45da-8ce1-47a7a8622df4\") " pod="kube-system/konnectivity-agent-5nclj" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3cebcaa8-957d-4f1e-b4f8-90637dae2bc0-tmp-dir\") pod \"node-resolver-s9crx\" (UID: \"3cebcaa8-957d-4f1e-b4f8-90637dae2bc0\") " pod="openshift-dns/node-resolver-s9crx" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-run-multus-certs\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.735646 ip-10-0-130-140 kubenswrapper[2576]: I0416 
14:52:31.734505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7df5be5d-8f4e-489e-95da-488d3220a4f7-cnibin\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d47b69f-123e-49bb-8517-d2e2716ccea1-tmp\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734655 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c119f984-3b17-49d4-8d0b-37669cbcbeb7-iptables-alerter-script\") pod \"iptables-alerter-c9tx6\" (UID: \"c119f984-3b17-49d4-8d0b-37669cbcbeb7\") " pod="openshift-network-operator/iptables-alerter-c9tx6" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3cebcaa8-957d-4f1e-b4f8-90637dae2bc0-hosts-file\") pod \"node-resolver-s9crx\" (UID: \"3cebcaa8-957d-4f1e-b4f8-90637dae2bc0\") " pod="openshift-dns/node-resolver-s9crx" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734745 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzs4p\" (UniqueName: \"kubernetes.io/projected/e27e48da-a6dc-4e84-87f4-01916a11e065-kube-api-access-qzs4p\") pod \"node-ca-tsg7q\" (UID: \"e27e48da-a6dc-4e84-87f4-01916a11e065\") " pod="openshift-image-registry/node-ca-tsg7q" Apr 16 14:52:31.736471 ip-10-0-130-140 
kubenswrapper[2576]: I0416 14:52:31.734791 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-device-dir\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3323004a-60ab-45da-8ce1-47a7a8622df4-konnectivity-ca\") pod \"konnectivity-agent-5nclj\" (UID: \"3323004a-60ab-45da-8ce1-47a7a8622df4\") " pod="kube-system/konnectivity-agent-5nclj" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734802 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3cebcaa8-957d-4f1e-b4f8-90637dae2bc0-tmp-dir\") pod \"node-resolver-s9crx\" (UID: \"3cebcaa8-957d-4f1e-b4f8-90637dae2bc0\") " pod="openshift-dns/node-resolver-s9crx" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-run-netns\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7df5be5d-8f4e-489e-95da-488d3220a4f7-cnibin\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.736471 
ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.734935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3cebcaa8-957d-4f1e-b4f8-90637dae2bc0-hosts-file\") pod \"node-resolver-s9crx\" (UID: \"3cebcaa8-957d-4f1e-b4f8-90637dae2bc0\") " pod="openshift-dns/node-resolver-s9crx" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735127 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-var-lib-cni-multus\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-kubelet\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-node-log\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735239 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-multus-cni-dir\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.736471 
ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735246 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-kubelet\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-multus-daemon-config\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.736471 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7df5be5d-8f4e-489e-95da-488d3220a4f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-var-lib-kubelet\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735306 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-node-log\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e27e48da-a6dc-4e84-87f4-01916a11e065-serviceca\") pod \"node-ca-tsg7q\" (UID: \"e27e48da-a6dc-4e84-87f4-01916a11e065\") " pod="openshift-image-registry/node-ca-tsg7q" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srmqq\" (UniqueName: \"kubernetes.io/projected/15399b8b-5282-4eec-bec5-53678c45226f-kube-api-access-srmqq\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-hostroot\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735629 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8t6r\" (UniqueName: 
\"kubernetes.io/projected/fc2d28ab-f651-462e-ae85-98e9780905b0-kube-api-access-x8t6r\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c119f984-3b17-49d4-8d0b-37669cbcbeb7-host-slash\") pod \"iptables-alerter-c9tx6\" (UID: \"c119f984-3b17-49d4-8d0b-37669cbcbeb7\") " pod="openshift-network-operator/iptables-alerter-c9tx6" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldlhb\" (UniqueName: \"kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb\") pod \"network-check-target-c78tw\" (UID: \"c89f7bd4-8433-4357-856e-4886a97cdf70\") " pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e27e48da-a6dc-4e84-87f4-01916a11e065-serviceca\") pod \"node-ca-tsg7q\" (UID: \"e27e48da-a6dc-4e84-87f4-01916a11e065\") " pod="openshift-image-registry/node-ca-tsg7q" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-var-lib-kubelet\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-sysctl-conf\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7df5be5d-8f4e-489e-95da-488d3220a4f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c119f984-3b17-49d4-8d0b-37669cbcbeb7-iptables-alerter-script\") pod \"iptables-alerter-c9tx6\" (UID: \"c119f984-3b17-49d4-8d0b-37669cbcbeb7\") " pod="openshift-network-operator/iptables-alerter-c9tx6" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-run-ovn\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e330e52-07ab-4173-a692-bcf1bedd06ff-ovnkube-script-lib\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737244 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735873 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-run-ovn\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735906 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-sysctl-conf\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-log-socket\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-log-socket\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-os-release\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735997 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-run\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.736038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-run-netns\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.736069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e27e48da-a6dc-4e84-87f4-01916a11e065-host\") pod \"node-ca-tsg7q\" (UID: \"e27e48da-a6dc-4e84-87f4-01916a11e065\") " pod="openshift-image-registry/node-ca-tsg7q" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.735845 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c119f984-3b17-49d4-8d0b-37669cbcbeb7-host-slash\") pod \"iptables-alerter-c9tx6\" (UID: \"c119f984-3b17-49d4-8d0b-37669cbcbeb7\") " pod="openshift-network-operator/iptables-alerter-c9tx6" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.736117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sj6l\" (UniqueName: \"kubernetes.io/projected/0d47b69f-123e-49bb-8517-d2e2716ccea1-kube-api-access-7sj6l\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.736143 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d47b69f-123e-49bb-8517-d2e2716ccea1-run\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.736149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-slash\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.736185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-cni-bin\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.736213 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.736245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-etc-selinux\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 
14:52:31.736332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-run-netns\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.736366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-slash\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.736394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-cni-bin\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.737811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.736418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e330e52-07ab-4173-a692-bcf1bedd06ff-ovnkube-script-lib\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.738348 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.736421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e330e52-07ab-4173-a692-bcf1bedd06ff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.738348 ip-10-0-130-140 
kubenswrapper[2576]: I0416 14:52:31.736443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e27e48da-a6dc-4e84-87f4-01916a11e065-host\") pod \"node-ca-tsg7q\" (UID: \"e27e48da-a6dc-4e84-87f4-01916a11e065\") " pod="openshift-image-registry/node-ca-tsg7q" Apr 16 14:52:31.738348 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.737001 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0d47b69f-123e-49bb-8517-d2e2716ccea1-etc-tuned\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.738348 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.737255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3323004a-60ab-45da-8ce1-47a7a8622df4-agent-certs\") pod \"konnectivity-agent-5nclj\" (UID: \"3323004a-60ab-45da-8ce1-47a7a8622df4\") " pod="kube-system/konnectivity-agent-5nclj" Apr 16 14:52:31.738348 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.737491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e330e52-07ab-4173-a692-bcf1bedd06ff-ovn-node-metrics-cert\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:52:31.738562 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.738521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d47b69f-123e-49bb-8517-d2e2716ccea1-tmp\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" Apr 16 14:52:31.742678 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:31.742655 2576 projected.go:289] Couldn't get 
configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:31.742794 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:31.742685 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:31.742794 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:31.742698 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ldlhb for pod openshift-network-diagnostics/network-check-target-c78tw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:31.742911 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:31.742800 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb podName:c89f7bd4-8433-4357-856e-4886a97cdf70 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:32.242753097 +0000 UTC m=+3.161114281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ldlhb" (UniqueName: "kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb") pod "network-check-target-c78tw" (UID: "c89f7bd4-8433-4357-856e-4886a97cdf70") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:31.745532 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.745454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkhrz\" (UniqueName: \"kubernetes.io/projected/3cebcaa8-957d-4f1e-b4f8-90637dae2bc0-kube-api-access-kkhrz\") pod \"node-resolver-s9crx\" (UID: \"3cebcaa8-957d-4f1e-b4f8-90637dae2bc0\") " pod="openshift-dns/node-resolver-s9crx"
Apr 16 14:52:31.746387 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.746347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54g89\" (UniqueName: \"kubernetes.io/projected/8e330e52-07ab-4173-a692-bcf1bedd06ff-kube-api-access-54g89\") pod \"ovnkube-node-r7dqw\" (UID: \"8e330e52-07ab-4173-a692-bcf1bedd06ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw"
Apr 16 14:52:31.746524 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.746503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzs4p\" (UniqueName: \"kubernetes.io/projected/e27e48da-a6dc-4e84-87f4-01916a11e065-kube-api-access-qzs4p\") pod \"node-ca-tsg7q\" (UID: \"e27e48da-a6dc-4e84-87f4-01916a11e065\") " pod="openshift-image-registry/node-ca-tsg7q"
Apr 16 14:52:31.747058 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.746971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5flhz\" (UniqueName: \"kubernetes.io/projected/c119f984-3b17-49d4-8d0b-37669cbcbeb7-kube-api-access-5flhz\") pod \"iptables-alerter-c9tx6\" (UID: \"c119f984-3b17-49d4-8d0b-37669cbcbeb7\") " pod="openshift-network-operator/iptables-alerter-c9tx6"
Apr 16 14:52:31.747058 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.747016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx2mz\" (UniqueName: \"kubernetes.io/projected/7df5be5d-8f4e-489e-95da-488d3220a4f7-kube-api-access-dx2mz\") pod \"multus-additional-cni-plugins-lb2nq\" (UID: \"7df5be5d-8f4e-489e-95da-488d3220a4f7\") " pod="openshift-multus/multus-additional-cni-plugins-lb2nq"
Apr 16 14:52:31.748339 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.748298 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sj6l\" (UniqueName: \"kubernetes.io/projected/0d47b69f-123e-49bb-8517-d2e2716ccea1-kube-api-access-7sj6l\") pod \"tuned-vj7zr\" (UID: \"0d47b69f-123e-49bb-8517-d2e2716ccea1\") " pod="openshift-cluster-node-tuning-operator/tuned-vj7zr"
Apr 16 14:52:31.837508 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.837508 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-socket-dir\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.837740 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-registration-dir\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.837740 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.837740 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-cnibin\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.837740 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-registration-dir\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.837740 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-cni-binary-copy\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.837740 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837663 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-socket-dir\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.837740 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-var-lib-cni-bin\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.837740 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-cnibin\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.837740 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-var-lib-kubelet\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.837740 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8hgn\" (UniqueName: \"kubernetes.io/projected/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-kube-api-access-j8hgn\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837761 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-run-multus-certs\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-var-lib-kubelet\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-var-lib-cni-bin\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-device-dir\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-run-multus-certs\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837833 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-run-netns\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-var-lib-cni-multus\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-run-netns\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837911 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-var-lib-cni-multus\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-multus-cni-dir\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-multus-daemon-config\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-device-dir\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srmqq\" (UniqueName: \"kubernetes.io/projected/15399b8b-5282-4eec-bec5-53678c45226f-kube-api-access-srmqq\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.837993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-hostroot\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8t6r\" (UniqueName: \"kubernetes.io/projected/fc2d28ab-f651-462e-ae85-98e9780905b0-kube-api-access-x8t6r\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838088 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-hostroot\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838206 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-os-release\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-cni-binary-copy\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-etc-selinux\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-multus-cni-dir\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-multus-socket-dir-parent\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-system-cni-dir\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838276 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-etc-selinux\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838292 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-run-k8s-cni-cncf-io\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-sys-fs\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-multus-conf-dir\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-os-release\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:31.838351 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838403 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-multus-socket-dir-parent\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838405 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/15399b8b-5282-4eec-bec5-53678c45226f-sys-fs\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:31.838410 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs podName:fc2d28ab-f651-462e-ae85-98e9780905b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:32.338395753 +0000 UTC m=+3.256756956 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs") pod "network-metrics-daemon-7wx6z" (UID: "fc2d28ab-f651-462e-ae85-98e9780905b0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-host-run-k8s-cni-cncf-io\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838455 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-system-cni-dir\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-etc-kubernetes\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.838893 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-etc-kubernetes\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.839468 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838505 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-multus-conf-dir\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.839468 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.838575 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-multus-daemon-config\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.846133 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.846078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8hgn\" (UniqueName: \"kubernetes.io/projected/a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56-kube-api-access-j8hgn\") pod \"multus-q29wv\" (UID: \"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56\") " pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.846350 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.846316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8t6r\" (UniqueName: \"kubernetes.io/projected/fc2d28ab-f651-462e-ae85-98e9780905b0-kube-api-access-x8t6r\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:52:31.846435 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.846419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srmqq\" (UniqueName: \"kubernetes.io/projected/15399b8b-5282-4eec-bec5-53678c45226f-kube-api-access-srmqq\") pod \"aws-ebs-csi-driver-node-2xddz\" (UID: \"15399b8b-5282-4eec-bec5-53678c45226f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:31.918385 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.918344 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vj7zr"
Apr 16 14:52:31.929524 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.929504 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c9tx6"
Apr 16 14:52:31.939960 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.939939 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:31.944458 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.944438 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw"
Apr 16 14:52:31.951042 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.951021 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5nclj"
Apr 16 14:52:31.958569 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.958547 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lb2nq"
Apr 16 14:52:31.966108 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.966091 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s9crx"
Apr 16 14:52:31.973776 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.973743 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tsg7q"
Apr 16 14:52:31.982404 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.982385 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q29wv"
Apr 16 14:52:31.986938 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:31.986919 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz"
Apr 16 14:52:32.342174 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.342087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:52:32.342174 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.342133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldlhb\" (UniqueName: \"kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb\") pod \"network-check-target-c78tw\" (UID: \"c89f7bd4-8433-4357-856e-4886a97cdf70\") " pod="openshift-network-diagnostics/network-check-target-c78tw"
Apr 16 14:52:32.342391 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:32.342245 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:32.342391 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:32.342289 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:32.342391 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:32.342306 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:32.342391 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:32.342312 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs podName:fc2d28ab-f651-462e-ae85-98e9780905b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:33.342290625 +0000 UTC m=+4.260651809 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs") pod "network-metrics-daemon-7wx6z" (UID: "fc2d28ab-f651-462e-ae85-98e9780905b0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:32.342391 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:32.342317 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ldlhb for pod openshift-network-diagnostics/network-check-target-c78tw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:32.342391 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:32.342367 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb podName:c89f7bd4-8433-4357-856e-4886a97cdf70 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:33.342351475 +0000 UTC m=+4.260712678 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldlhb" (UniqueName: "kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb") pod "network-check-target-c78tw" (UID: "c89f7bd4-8433-4357-856e-4886a97cdf70") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:32.467308 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:32.467277 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e330e52_07ab_4173_a692_bcf1bedd06ff.slice/crio-92ef4c20d0db099820389a77aa97df97f77b35f192fb4be3bb948f63e9aeed08 WatchSource:0}: Error finding container 92ef4c20d0db099820389a77aa97df97f77b35f192fb4be3bb948f63e9aeed08: Status 404 returned error can't find the container with id 92ef4c20d0db099820389a77aa97df97f77b35f192fb4be3bb948f63e9aeed08
Apr 16 14:52:32.468460 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:32.468432 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3323004a_60ab_45da_8ce1_47a7a8622df4.slice/crio-88311335bf743bb03f05aafe575edd04cacfac6eb0a9a17ae2762f85c6f4bbde WatchSource:0}: Error finding container 88311335bf743bb03f05aafe575edd04cacfac6eb0a9a17ae2762f85c6f4bbde: Status 404 returned error can't find the container with id 88311335bf743bb03f05aafe575edd04cacfac6eb0a9a17ae2762f85c6f4bbde
Apr 16 14:52:32.469982 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:32.469958 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cebcaa8_957d_4f1e_b4f8_90637dae2bc0.slice/crio-5e1bd1c5a0daa8ae26a01dd44d89f2a349f58857d001eee47f16e2d6869ee747 WatchSource:0}: Error finding container 5e1bd1c5a0daa8ae26a01dd44d89f2a349f58857d001eee47f16e2d6869ee747: Status 404 returned error can't find the container with id 5e1bd1c5a0daa8ae26a01dd44d89f2a349f58857d001eee47f16e2d6869ee747
Apr 16 14:52:32.472165 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:32.472138 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc119f984_3b17_49d4_8d0b_37669cbcbeb7.slice/crio-ab6afb54b5bfe3cffbc67814535842f0b9d8f11d05bebd21ddc555b84fd6ac64 WatchSource:0}: Error finding container ab6afb54b5bfe3cffbc67814535842f0b9d8f11d05bebd21ddc555b84fd6ac64: Status 404 returned error can't find the container with id ab6afb54b5bfe3cffbc67814535842f0b9d8f11d05bebd21ddc555b84fd6ac64
Apr 16 14:52:32.473511 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:32.473487 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode27e48da_a6dc_4e84_87f4_01916a11e065.slice/crio-348fcbbb6bc083bfd57974141d1bac3b6ac8ab753028a4afb53cc4b29ac1036e WatchSource:0}: Error finding container 348fcbbb6bc083bfd57974141d1bac3b6ac8ab753028a4afb53cc4b29ac1036e: Status 404 returned error can't find the container with id 348fcbbb6bc083bfd57974141d1bac3b6ac8ab753028a4afb53cc4b29ac1036e
Apr 16 14:52:32.474193 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:32.474170 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9a03c7e_3f17_4a7e_b126_8b8ba1c33c56.slice/crio-c52c65924e93566f5a17a7b35fdb26525cd18e5f71aeecb4f76016978fce12c2 WatchSource:0}: Error finding container c52c65924e93566f5a17a7b35fdb26525cd18e5f71aeecb4f76016978fce12c2: Status 404 returned error can't find the container with id c52c65924e93566f5a17a7b35fdb26525cd18e5f71aeecb4f76016978fce12c2
Apr 16 14:52:32.475321 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:32.475022 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d47b69f_123e_49bb_8517_d2e2716ccea1.slice/crio-1f02de4cf6eb1b4077e9266babcdb105cf9c9cfc62f71683bfbe954a336c6950 WatchSource:0}: Error finding container 1f02de4cf6eb1b4077e9266babcdb105cf9c9cfc62f71683bfbe954a336c6950: Status 404 returned error can't find the container with id 1f02de4cf6eb1b4077e9266babcdb105cf9c9cfc62f71683bfbe954a336c6950
Apr 16 14:52:32.476220 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:32.476132 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15399b8b_5282_4eec_bec5_53678c45226f.slice/crio-9c181ff528d55ac7f8c3446a522962798f1ff67c67df63a6384d4456efcd9f17 WatchSource:0}: Error finding container 9c181ff528d55ac7f8c3446a522962798f1ff67c67df63a6384d4456efcd9f17: Status 404 returned error can't find the container with id 9c181ff528d55ac7f8c3446a522962798f1ff67c67df63a6384d4456efcd9f17
Apr 16 14:52:32.477404 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:52:32.477365 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7df5be5d_8f4e_489e_95da_488d3220a4f7.slice/crio-788ae0127711a8c636e67350d552efdd6ff4a1843052a48681bbdb02c9af70fb WatchSource:0}: Error finding container 788ae0127711a8c636e67350d552efdd6ff4a1843052a48681bbdb02c9af70fb: Status 404 returned error can't find the container with id 788ae0127711a8c636e67350d552efdd6ff4a1843052a48681bbdb02c9af70fb
Apr 16 14:52:32.654329 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.654132 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:30 +0000 UTC" deadline="2027-12-08 19:53:05.779114589 +0000 UTC"
Apr 16 14:52:32.654329 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.654271 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14429h0m33.124846171s"
Apr 16 14:52:32.812625 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.811366 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-140.ec2.internal" event={"ID":"8e5adc7c4d5729c56cf31736cd00ffbf","Type":"ContainerStarted","Data":"db8c441438c6305127612c53edea327bd1bff777e055f408492a43a5060fa5c9"}
Apr 16 14:52:32.814224 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.814193 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q29wv" event={"ID":"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56","Type":"ContainerStarted","Data":"c52c65924e93566f5a17a7b35fdb26525cd18e5f71aeecb4f76016978fce12c2"}
Apr 16 14:52:32.818004 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.817502 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lb2nq" event={"ID":"7df5be5d-8f4e-489e-95da-488d3220a4f7","Type":"ContainerStarted","Data":"788ae0127711a8c636e67350d552efdd6ff4a1843052a48681bbdb02c9af70fb"}
Apr 16 14:52:32.819030 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.818993 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c9tx6" event={"ID":"c119f984-3b17-49d4-8d0b-37669cbcbeb7","Type":"ContainerStarted","Data":"ab6afb54b5bfe3cffbc67814535842f0b9d8f11d05bebd21ddc555b84fd6ac64"}
Apr 16 14:52:32.820547 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.820511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" event={"ID":"8e330e52-07ab-4173-a692-bcf1bedd06ff","Type":"ContainerStarted","Data":"92ef4c20d0db099820389a77aa97df97f77b35f192fb4be3bb948f63e9aeed08"}
Apr 16 14:52:32.821843 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.821804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tsg7q" event={"ID":"e27e48da-a6dc-4e84-87f4-01916a11e065","Type":"ContainerStarted","Data":"348fcbbb6bc083bfd57974141d1bac3b6ac8ab753028a4afb53cc4b29ac1036e"}
Apr 16 14:52:32.824639 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.824585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" event={"ID":"15399b8b-5282-4eec-bec5-53678c45226f","Type":"ContainerStarted","Data":"9c181ff528d55ac7f8c3446a522962798f1ff67c67df63a6384d4456efcd9f17"}
Apr 16 14:52:32.826414 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.826391 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" event={"ID":"0d47b69f-123e-49bb-8517-d2e2716ccea1","Type":"ContainerStarted","Data":"1f02de4cf6eb1b4077e9266babcdb105cf9c9cfc62f71683bfbe954a336c6950"}
Apr 16 14:52:32.828175 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.828157 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s9crx" event={"ID":"3cebcaa8-957d-4f1e-b4f8-90637dae2bc0","Type":"ContainerStarted","Data":"5e1bd1c5a0daa8ae26a01dd44d89f2a349f58857d001eee47f16e2d6869ee747"}
Apr 16 14:52:32.829311 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:32.829288 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5nclj" event={"ID":"3323004a-60ab-45da-8ce1-47a7a8622df4","Type":"ContainerStarted","Data":"88311335bf743bb03f05aafe575edd04cacfac6eb0a9a17ae2762f85c6f4bbde"}
Apr 16 14:52:33.352051 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:33.352005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:52:33.352309 ip-10-0-130-140 kubenswrapper[2576]: I0416
14:52:33.352067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldlhb\" (UniqueName: \"kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb\") pod \"network-check-target-c78tw\" (UID: \"c89f7bd4-8433-4357-856e-4886a97cdf70\") " pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:33.352309 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:33.352198 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:33.352309 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:33.352209 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:33.352309 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:33.352223 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:33.352309 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:33.352235 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ldlhb for pod openshift-network-diagnostics/network-check-target-c78tw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:33.352309 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:33.352265 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs podName:fc2d28ab-f651-462e-ae85-98e9780905b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:35.352244562 +0000 UTC m=+6.270605759 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs") pod "network-metrics-daemon-7wx6z" (UID: "fc2d28ab-f651-462e-ae85-98e9780905b0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:33.352309 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:33.352280 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb podName:c89f7bd4-8433-4357-856e-4886a97cdf70 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:35.352273485 +0000 UTC m=+6.270634668 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldlhb" (UniqueName: "kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb") pod "network-check-target-c78tw" (UID: "c89f7bd4-8433-4357-856e-4886a97cdf70") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:33.802049 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:33.801972 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:33.802561 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:33.802106 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0" Apr 16 14:52:33.802561 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:33.802520 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:33.802686 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:33.802605 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70" Apr 16 14:52:33.868164 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:33.868098 2576 generic.go:358] "Generic (PLEG): container finished" podID="13d086a02172e63757658cc95fdeac2d" containerID="a12004923aa9187f8f47669e16af384c31ab60ea7201a1b55ad23b440bb261c9" exitCode=0 Apr 16 14:52:33.868319 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:33.868241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" event={"ID":"13d086a02172e63757658cc95fdeac2d","Type":"ContainerDied","Data":"a12004923aa9187f8f47669e16af384c31ab60ea7201a1b55ad23b440bb261c9"} Apr 16 14:52:33.889681 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:33.889427 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-140.ec2.internal" podStartSLOduration=2.889408485 podStartE2EDuration="2.889408485s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:32.823001176 +0000 UTC m=+3.741362384" watchObservedRunningTime="2026-04-16 14:52:33.889408485 +0000 UTC m=+4.807769692" Apr 16 14:52:34.880755 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:34.880720 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" event={"ID":"13d086a02172e63757658cc95fdeac2d","Type":"ContainerStarted","Data":"71a5ebbde95cfe4e45aa050d4761d0a2bac10f18a9c1497216ce7964bb72ce6e"} Apr 16 14:52:35.369848 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:35.369753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:35.369848 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:35.369823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldlhb\" (UniqueName: \"kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb\") pod \"network-check-target-c78tw\" (UID: \"c89f7bd4-8433-4357-856e-4886a97cdf70\") " pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:35.370052 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:35.369995 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:35.370052 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:35.370016 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:35.370052 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:35.370029 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ldlhb for pod openshift-network-diagnostics/network-check-target-c78tw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 
14:52:35.370253 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:35.370088 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb podName:c89f7bd4-8433-4357-856e-4886a97cdf70 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:39.370068518 +0000 UTC m=+10.288429718 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldlhb" (UniqueName: "kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb") pod "network-check-target-c78tw" (UID: "c89f7bd4-8433-4357-856e-4886a97cdf70") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:35.370499 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:35.370480 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:35.370565 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:35.370532 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs podName:fc2d28ab-f651-462e-ae85-98e9780905b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:39.370519041 +0000 UTC m=+10.288880229 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs") pod "network-metrics-daemon-7wx6z" (UID: "fc2d28ab-f651-462e-ae85-98e9780905b0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:35.802085 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:35.801339 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:35.802085 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:35.801468 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70" Apr 16 14:52:35.802085 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:35.801884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:35.802085 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:35.801987 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0" Apr 16 14:52:37.801993 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:37.801455 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:37.801993 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:37.801502 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:37.801993 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:37.801609 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0" Apr 16 14:52:37.801993 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:37.801716 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70" Apr 16 14:52:39.403933 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:39.403893 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:39.404393 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:39.403955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldlhb\" (UniqueName: \"kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb\") pod \"network-check-target-c78tw\" (UID: \"c89f7bd4-8433-4357-856e-4886a97cdf70\") " pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:39.404393 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:39.404030 2576 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:39.404393 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:39.404107 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs podName:fc2d28ab-f651-462e-ae85-98e9780905b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:47.404086874 +0000 UTC m=+18.322448073 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs") pod "network-metrics-daemon-7wx6z" (UID: "fc2d28ab-f651-462e-ae85-98e9780905b0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:39.404393 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:39.404120 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:39.404393 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:39.404143 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:39.404393 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:39.404156 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ldlhb for pod openshift-network-diagnostics/network-check-target-c78tw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:39.404393 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:39.404208 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb 
podName:c89f7bd4-8433-4357-856e-4886a97cdf70 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:47.404191596 +0000 UTC m=+18.322552790 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldlhb" (UniqueName: "kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb") pod "network-check-target-c78tw" (UID: "c89f7bd4-8433-4357-856e-4886a97cdf70") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:39.802647 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:39.802616 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:39.802798 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:39.802753 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0" Apr 16 14:52:39.802860 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:39.802819 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:39.802963 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:39.802938 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70" Apr 16 14:52:41.801908 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:41.801822 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:41.802341 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:41.801959 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0" Apr 16 14:52:41.802449 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:41.802424 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:41.802576 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:41.802528 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70" Apr 16 14:52:43.802198 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:43.802161 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:43.802637 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:43.802161 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:43.802637 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:43.802311 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0" Apr 16 14:52:43.802637 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:43.802387 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70" Apr 16 14:52:45.801132 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:45.801098 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:45.801585 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:45.801150 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:45.801585 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:45.801231 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0" Apr 16 14:52:45.801585 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:45.801371 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70" Apr 16 14:52:47.463116 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:47.463078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:47.463116 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:47.463123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldlhb\" (UniqueName: \"kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb\") pod \"network-check-target-c78tw\" (UID: \"c89f7bd4-8433-4357-856e-4886a97cdf70\") " pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:47.463816 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:47.463243 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:47.463816 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:47.463328 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs podName:fc2d28ab-f651-462e-ae85-98e9780905b0 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:53:03.463307118 +0000 UTC m=+34.381668313 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs") pod "network-metrics-daemon-7wx6z" (UID: "fc2d28ab-f651-462e-ae85-98e9780905b0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:47.463816 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:47.463251 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:47.463816 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:47.463364 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:47.463816 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:47.463377 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ldlhb for pod openshift-network-diagnostics/network-check-target-c78tw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:47.463816 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:47.463430 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb podName:c89f7bd4-8433-4357-856e-4886a97cdf70 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:03.463416031 +0000 UTC m=+34.381777216 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ldlhb" (UniqueName: "kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb") pod "network-check-target-c78tw" (UID: "c89f7bd4-8433-4357-856e-4886a97cdf70") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:47.801278 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:47.801195 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:52:47.801498 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:47.801199 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:52:47.801498 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:47.801300 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70" Apr 16 14:52:47.801498 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:47.801411 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0" Apr 16 14:52:49.803522 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.803376 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:52:49.803971 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.803490 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw"
Apr 16 14:52:49.803971 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:49.803632 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0"
Apr 16 14:52:49.803971 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:49.803695 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70"
Apr 16 14:52:49.908950 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.908921 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q29wv" event={"ID":"a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56","Type":"ContainerStarted","Data":"27f57125acead7544bb1c4eeb1847ec99300ab0ce9c164d08f7cbb1f27310c55"}
Apr 16 14:52:49.910355 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.910326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lb2nq" event={"ID":"7df5be5d-8f4e-489e-95da-488d3220a4f7","Type":"ContainerStarted","Data":"3279abdff472968705eb065660a313f6b2f6e1bbb261d9b4c7ce174d87ebb32a"}
Apr 16 14:52:49.911732 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.911707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" event={"ID":"8e330e52-07ab-4173-a692-bcf1bedd06ff","Type":"ContainerStarted","Data":"8dab6584f0757bbd76811fdbca42b68732a71ddc373e330a8c741d8f2fe6dad0"}
Apr 16 14:52:49.913082 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.913060 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tsg7q" event={"ID":"e27e48da-a6dc-4e84-87f4-01916a11e065","Type":"ContainerStarted","Data":"9dc682d02cc267103b51910deb424194c313d0cc84771ae1d32c3f67ae132d55"}
Apr 16 14:52:49.915000 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.914980 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" event={"ID":"15399b8b-5282-4eec-bec5-53678c45226f","Type":"ContainerStarted","Data":"d7204b22696a0d37e6d3a75c440179e8239ea56d094c6e2a84fea9a3045f09d0"}
Apr 16 14:52:49.916439 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.916365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" event={"ID":"0d47b69f-123e-49bb-8517-d2e2716ccea1","Type":"ContainerStarted","Data":"5dd94ae74a3a35ae4f7c8753904530f17cb65561f199c034dd7fc57efe3d1f3d"}
Apr 16 14:52:49.917871 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.917848 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s9crx" event={"ID":"3cebcaa8-957d-4f1e-b4f8-90637dae2bc0","Type":"ContainerStarted","Data":"d3c7947a211317fc90d1dff7bfa434476f730502ddf4a3f7e1db9fa033407472"}
Apr 16 14:52:49.919399 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.919322 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5nclj" event={"ID":"3323004a-60ab-45da-8ce1-47a7a8622df4","Type":"ContainerStarted","Data":"b4b03d7991c10ff303ee8a29a11c1058a4c494c5ad60ced34a0f70cb8e9ecb18"}
Apr 16 14:52:49.929449 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.929402 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-140.ec2.internal" podStartSLOduration=18.929362235 podStartE2EDuration="18.929362235s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:34.894483888 +0000 UTC m=+5.812845094" watchObservedRunningTime="2026-04-16 14:52:49.929362235 +0000 UTC m=+20.847723420"
Apr 16 14:52:49.929564 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.929535 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-q29wv" podStartSLOduration=3.86176609 podStartE2EDuration="20.929525911s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.4766531 +0000 UTC m=+3.395014296" lastFinishedPulling="2026-04-16 14:52:49.54441292 +0000 UTC m=+20.462774117" observedRunningTime="2026-04-16 14:52:49.929331908 +0000 UTC m=+20.847693113" watchObservedRunningTime="2026-04-16 14:52:49.929525911 +0000 UTC m=+20.847887117"
Apr 16 14:52:49.943523 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.943470 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-5nclj" podStartSLOduration=11.996589495 podStartE2EDuration="20.943451951s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.470602703 +0000 UTC m=+3.388963900" lastFinishedPulling="2026-04-16 14:52:41.417465158 +0000 UTC m=+12.335826356" observedRunningTime="2026-04-16 14:52:49.942563122 +0000 UTC m=+20.860924327" watchObservedRunningTime="2026-04-16 14:52:49.943451951 +0000 UTC m=+20.861813156"
Apr 16 14:52:49.985342 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.985298 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s9crx" podStartSLOduration=4.299390278 podStartE2EDuration="20.985284048s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.472048812 +0000 UTC m=+3.390410011" lastFinishedPulling="2026-04-16 14:52:49.157942588 +0000 UTC m=+20.076303781" observedRunningTime="2026-04-16 14:52:49.971308441 +0000 UTC m=+20.889669647" watchObservedRunningTime="2026-04-16 14:52:49.985284048 +0000 UTC m=+20.903645276"
Apr 16 14:52:49.985469 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:49.985369 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tsg7q" podStartSLOduration=4.302658744 podStartE2EDuration="20.9853652s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.475234618 +0000 UTC m=+3.393595815" lastFinishedPulling="2026-04-16 14:52:49.157941084 +0000 UTC m=+20.076302271" observedRunningTime="2026-04-16 14:52:49.984960106 +0000 UTC m=+20.903321312" watchObservedRunningTime="2026-04-16 14:52:49.9853652 +0000 UTC m=+20.903726405"
Apr 16 14:52:50.000984 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.000932 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vj7zr" podStartSLOduration=3.944958787 podStartE2EDuration="21.00091302s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.477600185 +0000 UTC m=+3.395961578" lastFinishedPulling="2026-04-16 14:52:49.533554614 +0000 UTC m=+20.451915811" observedRunningTime="2026-04-16 14:52:50.000339443 +0000 UTC m=+20.918700674" watchObservedRunningTime="2026-04-16 14:52:50.00091302 +0000 UTC m=+20.919274227"
Apr 16 14:52:50.757009 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.756787 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-5nclj"
Apr 16 14:52:50.757343 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.757328 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-5nclj"
Apr 16 14:52:50.922653 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.922582 2576 generic.go:358] "Generic (PLEG): container finished" podID="7df5be5d-8f4e-489e-95da-488d3220a4f7" containerID="3279abdff472968705eb065660a313f6b2f6e1bbb261d9b4c7ce174d87ebb32a" exitCode=0
Apr 16 14:52:50.922653 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.922640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lb2nq" event={"ID":"7df5be5d-8f4e-489e-95da-488d3220a4f7","Type":"ContainerDied","Data":"3279abdff472968705eb065660a313f6b2f6e1bbb261d9b4c7ce174d87ebb32a"}
Apr 16 14:52:50.923895 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.923875 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c9tx6" event={"ID":"c119f984-3b17-49d4-8d0b-37669cbcbeb7","Type":"ContainerStarted","Data":"24deaf1550f7ec6c3c9e706b92b5b27ba7c95e968e71a871d04c024715c7af09"}
Apr 16 14:52:50.926180 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.926163 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 14:52:50.926444 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.926425 2576 generic.go:358] "Generic (PLEG): container finished" podID="8e330e52-07ab-4173-a692-bcf1bedd06ff" containerID="8b15821e49d25ba9a977b3c4d6be9f76c9b6493d2404dc6c9ba3bbe12c82d071" exitCode=1
Apr 16 14:52:50.926544 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.926514 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" event={"ID":"8e330e52-07ab-4173-a692-bcf1bedd06ff","Type":"ContainerStarted","Data":"c67d5508d181ac6fd549f8ad2ebc97d59eb30b506383aa0ec3ce44a564f1d43e"}
Apr 16 14:52:50.926596 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.926550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" event={"ID":"8e330e52-07ab-4173-a692-bcf1bedd06ff","Type":"ContainerStarted","Data":"aae229178e376203339cae1bb6727e147889465771dcccaa2583b70457e05772"}
Apr 16 14:52:50.926596 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.926565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" event={"ID":"8e330e52-07ab-4173-a692-bcf1bedd06ff","Type":"ContainerStarted","Data":"d16ea581cbb53e8136e5741908f5ca225e045a4b9bb055ba73f0e9e281a93c49"}
Apr 16 14:52:50.926596 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.926577 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" event={"ID":"8e330e52-07ab-4173-a692-bcf1bedd06ff","Type":"ContainerStarted","Data":"6abad0663f1f64559d62ebc0053e809620ba57a3c7a12aba832ff3fd7cac89fb"}
Apr 16 14:52:50.926596 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.926586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" event={"ID":"8e330e52-07ab-4173-a692-bcf1bedd06ff","Type":"ContainerDied","Data":"8b15821e49d25ba9a977b3c4d6be9f76c9b6493d2404dc6c9ba3bbe12c82d071"}
Apr 16 14:52:50.927218 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.927203 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-5nclj"
Apr 16 14:52:50.927674 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.927658 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-5nclj"
Apr 16 14:52:50.955559 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:50.955523 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-c9tx6" podStartSLOduration=5.271974693 podStartE2EDuration="21.955509722s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.474733023 +0000 UTC m=+3.393094220" lastFinishedPulling="2026-04-16 14:52:49.15826806 +0000 UTC m=+20.076629249" observedRunningTime="2026-04-16 14:52:50.955331578 +0000 UTC m=+21.873692783" watchObservedRunningTime="2026-04-16 14:52:50.955509722 +0000 UTC m=+21.873870928"
Apr 16 14:52:51.112173 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:51.112148 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 14:52:51.708160 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:51.707974 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:52:51.112167862Z","UUID":"31d72543-bd7a-4358-b446-25ffc2392d32","Handler":null,"Name":"","Endpoint":""}
Apr 16 14:52:51.710199 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:51.710170 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 14:52:51.710341 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:51.710205 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 14:52:51.801685 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:51.801654 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw"
Apr 16 14:52:51.801882 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:51.801654 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:52:51.801882 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:51.801799 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70"
Apr 16 14:52:51.801882 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:51.801851 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0"
Apr 16 14:52:51.930620 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:51.930579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" event={"ID":"15399b8b-5282-4eec-bec5-53678c45226f","Type":"ContainerStarted","Data":"f3cf22546222da35ee5dc04c9b72e037cd9aedc75738da52893bf37c8cbd4ea8"}
Apr 16 14:52:52.935019 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:52.934979 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" event={"ID":"15399b8b-5282-4eec-bec5-53678c45226f","Type":"ContainerStarted","Data":"57654aed994e87585915216b8e4669adc8a27d5122bd758eba24b014a9e59361"}
Apr 16 14:52:52.938454 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:52.938415 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 14:52:52.938847 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:52.938821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" event={"ID":"8e330e52-07ab-4173-a692-bcf1bedd06ff","Type":"ContainerStarted","Data":"f10392a42f500048b87a95b789073d02d36ef3ecced7126b8bc05fe17d0b3e47"}
Apr 16 14:52:52.949923 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:52.949877 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xddz" podStartSLOduration=4.034947203 podStartE2EDuration="23.949859899s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.478556142 +0000 UTC m=+3.396917326" lastFinishedPulling="2026-04-16 14:52:52.393468833 +0000 UTC m=+23.311830022" observedRunningTime="2026-04-16 14:52:52.948858479 +0000 UTC m=+23.867219684" watchObservedRunningTime="2026-04-16 14:52:52.949859899 +0000 UTC m=+23.868221106"
Apr 16 14:52:53.801469 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:53.801232 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:52:53.801640 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:53.801250 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw"
Apr 16 14:52:53.801640 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:53.801582 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0"
Apr 16 14:52:53.801784 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:53.801718 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70"
Apr 16 14:52:55.801837 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:55.801627 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:52:55.802576 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:55.801684 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw"
Apr 16 14:52:55.802576 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:55.801910 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0"
Apr 16 14:52:55.802576 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:55.801988 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70"
Apr 16 14:52:55.948663 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:55.948627 2576 generic.go:358] "Generic (PLEG): container finished" podID="7df5be5d-8f4e-489e-95da-488d3220a4f7" containerID="9b65a75cf1b46504cb7253498e4bfbe3cec2e27a480afabc1c8b71790e8d845e" exitCode=0
Apr 16 14:52:55.948826 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:55.948684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lb2nq" event={"ID":"7df5be5d-8f4e-489e-95da-488d3220a4f7","Type":"ContainerDied","Data":"9b65a75cf1b46504cb7253498e4bfbe3cec2e27a480afabc1c8b71790e8d845e"}
Apr 16 14:52:55.951842 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:55.951790 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 14:52:55.952195 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:55.952170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" event={"ID":"8e330e52-07ab-4173-a692-bcf1bedd06ff","Type":"ContainerStarted","Data":"0bf646ce9bca6f2aab6fdf61ae3df3d65990a37b55a36ed8bb4d20c586703eab"}
Apr 16 14:52:55.952504 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:55.952477 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw"
Apr 16 14:52:55.952581 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:55.952515 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw"
Apr 16 14:52:55.952581 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:55.952528 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw"
Apr 16 14:52:55.952661 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:55.952615 2576 scope.go:117] "RemoveContainer" containerID="8b15821e49d25ba9a977b3c4d6be9f76c9b6493d2404dc6c9ba3bbe12c82d071"
Apr 16 14:52:55.968609 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:55.968588 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw"
Apr 16 14:52:55.968711 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:55.968651 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw"
Apr 16 14:52:56.899675 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:56.899648 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-c78tw"]
Apr 16 14:52:56.900092 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:56.899786 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw"
Apr 16 14:52:56.900092 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:56.899898 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70"
Apr 16 14:52:56.902703 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:56.902678 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7wx6z"]
Apr 16 14:52:56.902832 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:56.902812 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:52:56.902923 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:56.902904 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0"
Apr 16 14:52:56.956423 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:56.956341 2576 generic.go:358] "Generic (PLEG): container finished" podID="7df5be5d-8f4e-489e-95da-488d3220a4f7" containerID="08175ddc27deeefc456838da0a6c67aa2e66406309cf973e6bc76fe652edd4d4" exitCode=0
Apr 16 14:52:56.956578 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:56.956432 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lb2nq" event={"ID":"7df5be5d-8f4e-489e-95da-488d3220a4f7","Type":"ContainerDied","Data":"08175ddc27deeefc456838da0a6c67aa2e66406309cf973e6bc76fe652edd4d4"}
Apr 16 14:52:56.960196 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:56.960175 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 14:52:56.960542 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:56.960522 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" event={"ID":"8e330e52-07ab-4173-a692-bcf1bedd06ff","Type":"ContainerStarted","Data":"5b6c76a34b2a80254c59e2dbf43240de8194de7ed9dc11b75b845921027612cc"}
Apr 16 14:52:56.994706 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:56.994662 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" podStartSLOduration=10.869649532 podStartE2EDuration="27.994648107s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.469260643 +0000 UTC m=+3.387621844" lastFinishedPulling="2026-04-16 14:52:49.594259222 +0000 UTC m=+20.512620419" observedRunningTime="2026-04-16 14:52:56.994150495 +0000 UTC m=+27.912511714" watchObservedRunningTime="2026-04-16 14:52:56.994648107 +0000 UTC m=+27.913009312"
Apr 16 14:52:57.965118 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:57.965026 2576 generic.go:358] "Generic (PLEG): container finished" podID="7df5be5d-8f4e-489e-95da-488d3220a4f7" containerID="36f817ed44294dc1d45913a6c4507b95761c98ef6125d2b09c8a597568af5fa8" exitCode=0
Apr 16 14:52:57.965497 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:57.965115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lb2nq" event={"ID":"7df5be5d-8f4e-489e-95da-488d3220a4f7","Type":"ContainerDied","Data":"36f817ed44294dc1d45913a6c4507b95761c98ef6125d2b09c8a597568af5fa8"}
Apr 16 14:52:58.801400 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:58.801366 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:52:58.801539 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:58.801507 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0"
Apr 16 14:52:58.801600 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:52:58.801366 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw"
Apr 16 14:52:58.801702 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:52:58.801673 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70"
Apr 16 14:53:00.802340 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:00.802152 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw"
Apr 16 14:53:00.802845 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:00.802158 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:53:00.802845 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:00.802438 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c78tw" podUID="c89f7bd4-8433-4357-856e-4886a97cdf70"
Apr 16 14:53:00.802845 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:00.802542 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0"
Apr 16 14:53:02.437084 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.437011 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-140.ec2.internal" event="NodeReady"
Apr 16 14:53:02.437657 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.437157 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 14:53:02.475656 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.475622 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wwb2b"]
Apr 16 14:53:02.478233 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.478209 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:53:02.478674 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.478650 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-n8878"]
Apr 16 14:53:02.480301 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.480277 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9hwfz\""
Apr 16 14:53:02.480508 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.480282 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 14:53:02.480508 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.480495 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n8878"
Apr 16 14:53:02.480636 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.480563 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 14:53:02.482113 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.482089 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cvt8w\""
Apr 16 14:53:02.482447 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.482427 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 14:53:02.482547 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.482506 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 14:53:02.482547 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.482515 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 14:53:02.486119 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.486099 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wwb2b"]
Apr 16 14:53:02.489687 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.489667 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n8878"]
Apr 16 14:53:02.570292 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.570255 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9pj4\" (UniqueName: \"kubernetes.io/projected/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-kube-api-access-f9pj4\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:53:02.570470 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.570304 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-config-volume\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:53:02.570470 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.570331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twk8h\" (UniqueName: \"kubernetes.io/projected/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-kube-api-access-twk8h\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878"
Apr 16 14:53:02.570470 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.570380 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:53:02.570470 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.570413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878"
Apr 16 14:53:02.570665 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.570504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-tmp-dir\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:53:02.672071 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.672028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-config-volume\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:53:02.672242 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.672088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twk8h\" (UniqueName: \"kubernetes.io/projected/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-kube-api-access-twk8h\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878"
Apr 16 14:53:02.672242 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.672119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:53:02.672242 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.672164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878"
Apr 16 14:53:02.672392 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:02.672347 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:02.672440 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:02.672414 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert podName:fb790f7c-1dc9-4bf8-a9e6-1054b49e346c nodeName:}" failed. No retries permitted until 2026-04-16 14:53:03.17239583 +0000 UTC m=+34.090757034 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert") pod "ingress-canary-n8878" (UID: "fb790f7c-1dc9-4bf8-a9e6-1054b49e346c") : secret "canary-serving-cert" not found
Apr 16 14:53:02.673302 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:02.672797 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:02.673302 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.672837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-tmp-dir\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:53:02.673302 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:02.672866 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls podName:7ede1baa-a6e7-4b5e-8723-94a6c70847e3 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:03.172849919 +0000 UTC m=+34.091211104 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls") pod "dns-default-wwb2b" (UID: "7ede1baa-a6e7-4b5e-8723-94a6c70847e3") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:02.673302 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.672910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9pj4\" (UniqueName: \"kubernetes.io/projected/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-kube-api-access-f9pj4\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:53:02.673302 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.673234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-tmp-dir\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:53:02.673626 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.673382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-config-volume\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:53:02.683977 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.683950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9pj4\" (UniqueName: \"kubernetes.io/projected/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-kube-api-access-f9pj4\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:53:02.683977 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.683969 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twk8h\" (UniqueName:
\"kubernetes.io/projected/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-kube-api-access-twk8h\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878" Apr 16 14:53:02.801611 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.801514 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:53:02.801823 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.801520 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:53:02.803972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.803950 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:53:02.804114 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.804075 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fb76v\"" Apr 16 14:53:02.804204 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.804183 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:53:02.804314 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.804273 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:53:02.804405 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:02.804388 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sk8kg\"" Apr 16 14:53:03.177453 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:03.177374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert\") 
pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878" Apr 16 14:53:03.177620 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:03.177547 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:03.177620 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:03.177560 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b" Apr 16 14:53:03.177735 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:03.177625 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert podName:fb790f7c-1dc9-4bf8-a9e6-1054b49e346c nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.177602625 +0000 UTC m=+35.095963823 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert") pod "ingress-canary-n8878" (UID: "fb790f7c-1dc9-4bf8-a9e6-1054b49e346c") : secret "canary-serving-cert" not found Apr 16 14:53:03.177735 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:03.177675 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:03.177735 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:03.177725 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls podName:7ede1baa-a6e7-4b5e-8723-94a6c70847e3 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.177711078 +0000 UTC m=+35.096072277 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls") pod "dns-default-wwb2b" (UID: "7ede1baa-a6e7-4b5e-8723-94a6c70847e3") : secret "dns-default-metrics-tls" not found Apr 16 14:53:03.479447 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:03.479360 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:53:03.479447 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:03.479400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldlhb\" (UniqueName: \"kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb\") pod \"network-check-target-c78tw\" (UID: \"c89f7bd4-8433-4357-856e-4886a97cdf70\") " pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:53:03.480077 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:03.479514 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:53:03.480077 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:03.479582 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs podName:fc2d28ab-f651-462e-ae85-98e9780905b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:35.479564303 +0000 UTC m=+66.397925497 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs") pod "network-metrics-daemon-7wx6z" (UID: "fc2d28ab-f651-462e-ae85-98e9780905b0") : secret "metrics-daemon-secret" not found Apr 16 14:53:03.482279 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:03.482245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldlhb\" (UniqueName: \"kubernetes.io/projected/c89f7bd4-8433-4357-856e-4886a97cdf70-kube-api-access-ldlhb\") pod \"network-check-target-c78tw\" (UID: \"c89f7bd4-8433-4357-856e-4886a97cdf70\") " pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:53:03.714340 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:03.714316 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:53:03.888431 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:03.888226 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-c78tw"] Apr 16 14:53:03.894695 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:53:03.894663 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc89f7bd4_8433_4357_856e_4886a97cdf70.slice/crio-54d5477e96bb805a5d0a703905ea377365d19aeb3bb73d4c42ba8367aecf557b WatchSource:0}: Error finding container 54d5477e96bb805a5d0a703905ea377365d19aeb3bb73d4c42ba8367aecf557b: Status 404 returned error can't find the container with id 54d5477e96bb805a5d0a703905ea377365d19aeb3bb73d4c42ba8367aecf557b Apr 16 14:53:03.980263 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:03.980220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lb2nq" 
event={"ID":"7df5be5d-8f4e-489e-95da-488d3220a4f7","Type":"ContainerStarted","Data":"02210be23db656b0d6c3158419ae67137d10a9782aa364935ad506bd87f0b1cc"} Apr 16 14:53:03.981303 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:03.981268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-c78tw" event={"ID":"c89f7bd4-8433-4357-856e-4886a97cdf70","Type":"ContainerStarted","Data":"54d5477e96bb805a5d0a703905ea377365d19aeb3bb73d4c42ba8367aecf557b"} Apr 16 14:53:04.184517 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:04.184486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878" Apr 16 14:53:04.184707 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:04.184546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b" Apr 16 14:53:04.184707 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:04.184629 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:04.184707 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:04.184638 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:04.184707 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:04.184698 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls podName:7ede1baa-a6e7-4b5e-8723-94a6c70847e3 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:53:06.184683342 +0000 UTC m=+37.103044525 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls") pod "dns-default-wwb2b" (UID: "7ede1baa-a6e7-4b5e-8723-94a6c70847e3") : secret "dns-default-metrics-tls" not found Apr 16 14:53:04.184707 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:04.184712 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert podName:fb790f7c-1dc9-4bf8-a9e6-1054b49e346c nodeName:}" failed. No retries permitted until 2026-04-16 14:53:06.184706243 +0000 UTC m=+37.103067426 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert") pod "ingress-canary-n8878" (UID: "fb790f7c-1dc9-4bf8-a9e6-1054b49e346c") : secret "canary-serving-cert" not found Apr 16 14:53:04.986268 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:04.986231 2576 generic.go:358] "Generic (PLEG): container finished" podID="7df5be5d-8f4e-489e-95da-488d3220a4f7" containerID="02210be23db656b0d6c3158419ae67137d10a9782aa364935ad506bd87f0b1cc" exitCode=0 Apr 16 14:53:04.986845 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:04.986306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lb2nq" event={"ID":"7df5be5d-8f4e-489e-95da-488d3220a4f7","Type":"ContainerDied","Data":"02210be23db656b0d6c3158419ae67137d10a9782aa364935ad506bd87f0b1cc"} Apr 16 14:53:05.991566 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:05.991536 2576 generic.go:358] "Generic (PLEG): container finished" podID="7df5be5d-8f4e-489e-95da-488d3220a4f7" containerID="e6831e1aa44b9acb79a2c9ae02386ab5651f84b0dda7f6544de934581bbeed48" exitCode=0 Apr 16 14:53:05.992019 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:05.991577 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lb2nq" event={"ID":"7df5be5d-8f4e-489e-95da-488d3220a4f7","Type":"ContainerDied","Data":"e6831e1aa44b9acb79a2c9ae02386ab5651f84b0dda7f6544de934581bbeed48"} Apr 16 14:53:06.200856 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:06.200809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b" Apr 16 14:53:06.201027 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:06.200878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878" Apr 16 14:53:06.201027 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:06.200956 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:06.201027 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:06.201022 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls podName:7ede1baa-a6e7-4b5e-8723-94a6c70847e3 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:10.201003558 +0000 UTC m=+41.119364748 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls") pod "dns-default-wwb2b" (UID: "7ede1baa-a6e7-4b5e-8723-94a6c70847e3") : secret "dns-default-metrics-tls" not found Apr 16 14:53:06.201182 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:06.201037 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:06.201182 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:06.201100 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert podName:fb790f7c-1dc9-4bf8-a9e6-1054b49e346c nodeName:}" failed. No retries permitted until 2026-04-16 14:53:10.201082397 +0000 UTC m=+41.119443595 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert") pod "ingress-canary-n8878" (UID: "fb790f7c-1dc9-4bf8-a9e6-1054b49e346c") : secret "canary-serving-cert" not found Apr 16 14:53:06.996761 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:06.996732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lb2nq" event={"ID":"7df5be5d-8f4e-489e-95da-488d3220a4f7","Type":"ContainerStarted","Data":"d5744848b022a868e1f6dc93208c47f2a5da71da296e217e8af6cef1e610fc57"} Apr 16 14:53:06.998077 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:06.998052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-c78tw" event={"ID":"c89f7bd4-8433-4357-856e-4886a97cdf70","Type":"ContainerStarted","Data":"183409ccaec97458054ca288b331e8222309495e18ef333e3021fbcca4d2e002"} Apr 16 14:53:07.016095 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:07.016050 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-lb2nq" podStartSLOduration=6.780132059 podStartE2EDuration="38.016035655s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.478914524 +0000 UTC m=+3.397275713" lastFinishedPulling="2026-04-16 14:53:03.714818065 +0000 UTC m=+34.633179309" observedRunningTime="2026-04-16 14:53:07.01492188 +0000 UTC m=+37.933283097" watchObservedRunningTime="2026-04-16 14:53:07.016035655 +0000 UTC m=+37.934396861" Apr 16 14:53:08.000251 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:08.000216 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:53:08.014297 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:08.014241 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-c78tw" podStartSLOduration=36.029883367 podStartE2EDuration="39.014225021s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:53:03.896975813 +0000 UTC m=+34.815336996" lastFinishedPulling="2026-04-16 14:53:06.881317462 +0000 UTC m=+37.799678650" observedRunningTime="2026-04-16 14:53:08.013736158 +0000 UTC m=+38.932097364" watchObservedRunningTime="2026-04-16 14:53:08.014225021 +0000 UTC m=+38.932586218" Apr 16 14:53:10.230328 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:10.230293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878" Apr 16 14:53:10.230682 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:10.230357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b" Apr 16 14:53:10.230682 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:10.230443 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:10.230682 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:10.230444 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:10.230682 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:10.230493 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls podName:7ede1baa-a6e7-4b5e-8723-94a6c70847e3 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:18.230480288 +0000 UTC m=+49.148841471 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls") pod "dns-default-wwb2b" (UID: "7ede1baa-a6e7-4b5e-8723-94a6c70847e3") : secret "dns-default-metrics-tls" not found Apr 16 14:53:10.230682 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:10.230506 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert podName:fb790f7c-1dc9-4bf8-a9e6-1054b49e346c nodeName:}" failed. No retries permitted until 2026-04-16 14:53:18.230500993 +0000 UTC m=+49.148862177 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert") pod "ingress-canary-n8878" (UID: "fb790f7c-1dc9-4bf8-a9e6-1054b49e346c") : secret "canary-serving-cert" not found Apr 16 14:53:18.281655 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:18.281618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b" Apr 16 14:53:18.281655 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:18.281661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878" Apr 16 14:53:18.282187 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:18.281759 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:18.282187 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:18.281788 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:18.282187 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:18.281864 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert podName:fb790f7c-1dc9-4bf8-a9e6-1054b49e346c nodeName:}" failed. No retries permitted until 2026-04-16 14:53:34.281845423 +0000 UTC m=+65.200206630 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert") pod "ingress-canary-n8878" (UID: "fb790f7c-1dc9-4bf8-a9e6-1054b49e346c") : secret "canary-serving-cert" not found Apr 16 14:53:18.282187 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:18.281879 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls podName:7ede1baa-a6e7-4b5e-8723-94a6c70847e3 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:34.281873115 +0000 UTC m=+65.200234297 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls") pod "dns-default-wwb2b" (UID: "7ede1baa-a6e7-4b5e-8723-94a6c70847e3") : secret "dns-default-metrics-tls" not found Apr 16 14:53:27.975706 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:27.975680 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r7dqw" Apr 16 14:53:34.286606 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:34.286566 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878" Apr 16 14:53:34.287054 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:34.286630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b" Apr 16 14:53:34.287054 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:34.286719 2576 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:34.287054 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:34.286733 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:34.287054 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:34.286790 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls podName:7ede1baa-a6e7-4b5e-8723-94a6c70847e3 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:06.286757633 +0000 UTC m=+97.205118815 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls") pod "dns-default-wwb2b" (UID: "7ede1baa-a6e7-4b5e-8723-94a6c70847e3") : secret "dns-default-metrics-tls" not found Apr 16 14:53:34.287054 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:34.286814 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert podName:fb790f7c-1dc9-4bf8-a9e6-1054b49e346c nodeName:}" failed. No retries permitted until 2026-04-16 14:54:06.286797828 +0000 UTC m=+97.205159011 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert") pod "ingress-canary-n8878" (UID: "fb790f7c-1dc9-4bf8-a9e6-1054b49e346c") : secret "canary-serving-cert" not found Apr 16 14:53:35.494488 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:35.494445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:53:35.494891 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:35.494593 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:53:35.494891 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:53:35.494672 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs podName:fc2d28ab-f651-462e-ae85-98e9780905b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:39.494650908 +0000 UTC m=+130.413012095 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs") pod "network-metrics-daemon-7wx6z" (UID: "fc2d28ab-f651-462e-ae85-98e9780905b0") : secret "metrics-daemon-secret" not found Apr 16 14:53:39.005083 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:53:39.005056 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-c78tw" Apr 16 14:54:06.307547 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:06.307412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b" Apr 16 14:54:06.307547 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:06.307468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878" Apr 16 14:54:06.308065 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:06.307563 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:54:06.308065 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:06.307642 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls podName:7ede1baa-a6e7-4b5e-8723-94a6c70847e3 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:10.307626349 +0000 UTC m=+161.225987532 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls") pod "dns-default-wwb2b" (UID: "7ede1baa-a6e7-4b5e-8723-94a6c70847e3") : secret "dns-default-metrics-tls" not found Apr 16 14:54:06.308065 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:06.307566 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:54:06.308065 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:06.307722 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert podName:fb790f7c-1dc9-4bf8-a9e6-1054b49e346c nodeName:}" failed. No retries permitted until 2026-04-16 14:55:10.307706338 +0000 UTC m=+161.226067535 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert") pod "ingress-canary-n8878" (UID: "fb790f7c-1dc9-4bf8-a9e6-1054b49e346c") : secret "canary-serving-cert" not found Apr 16 14:54:38.589663 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.589628 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-g8xfd"] Apr 16 14:54:38.592675 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.592651 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs"] Apr 16 14:54:38.592831 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.592803 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.595081 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.595053 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:54:38.595439 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.595084 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 14:54:38.595532 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.595494 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-b5dxd\"" Apr 16 14:54:38.595592 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.595245 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 14:54:38.596239 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.595157 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:54:38.596239 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.596099 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:54:38.598689 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.598393 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 14:54:38.598834 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.598700 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:54:38.598834 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.598790 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:54:38.599324 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.599300 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-vbs67\"" Apr 16 14:54:38.600046 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.600017 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 14:54:38.601985 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.601964 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 14:54:38.603788 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.603750 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs"] Apr 16 14:54:38.607757 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.607737 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-g8xfd"] Apr 16 14:54:38.683218 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.683181 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-t267v"] Apr 16 14:54:38.686025 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.686003 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-t267v" Apr 16 14:54:38.687740 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.687718 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 14:54:38.687865 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.687812 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:54:38.688080 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.688064 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mxfpd\"" Apr 16 14:54:38.692319 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.692290 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-t267v"] Apr 16 14:54:38.721220 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.721190 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2201f8ec-763d-4bde-9b0c-b412c0a2c025-snapshots\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.721506 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.721486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2201f8ec-763d-4bde-9b0c-b412c0a2c025-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.721662 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.721649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2201f8ec-763d-4bde-9b0c-b412c0a2c025-serving-cert\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.721866 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.721844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2201f8ec-763d-4bde-9b0c-b412c0a2c025-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.722047 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.722013 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-525c4\" (UniqueName: \"kubernetes.io/projected/f77a2c7c-9c50-400a-8982-b3e524240d5f-kube-api-access-525c4\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:54:38.722232 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.722216 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" 
(UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:54:38.722410 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.722396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f77a2c7c-9c50-400a-8982-b3e524240d5f-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:54:38.722547 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.722535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2201f8ec-763d-4bde-9b0c-b412c0a2c025-tmp\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.722682 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.722668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k26qm\" (UniqueName: \"kubernetes.io/projected/2201f8ec-763d-4bde-9b0c-b412c0a2c025-kube-api-access-k26qm\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.823228 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.823173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2201f8ec-763d-4bde-9b0c-b412c0a2c025-serving-cert\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.823228 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.823238 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2201f8ec-763d-4bde-9b0c-b412c0a2c025-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.823422 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.823406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-525c4\" (UniqueName: \"kubernetes.io/projected/f77a2c7c-9c50-400a-8982-b3e524240d5f-kube-api-access-525c4\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:54:38.823457 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.823438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmlvk\" (UniqueName: \"kubernetes.io/projected/7ce1ceef-1ec7-4e59-a113-546b447470ba-kube-api-access-lmlvk\") pod \"volume-data-source-validator-7d955d5dd4-t267v\" (UID: \"7ce1ceef-1ec7-4e59-a113-546b447470ba\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-t267v" Apr 16 14:54:38.823493 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.823481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:54:38.823552 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.823538 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/f77a2c7c-9c50-400a-8982-b3e524240d5f-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:54:38.823604 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.823562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2201f8ec-763d-4bde-9b0c-b412c0a2c025-tmp\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.823604 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.823578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k26qm\" (UniqueName: \"kubernetes.io/projected/2201f8ec-763d-4bde-9b0c-b412c0a2c025-kube-api-access-k26qm\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.823695 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:38.823599 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:38.823695 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.823619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2201f8ec-763d-4bde-9b0c-b412c0a2c025-snapshots\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.823695 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.823644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2201f8ec-763d-4bde-9b0c-b412c0a2c025-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.823695 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:38.823667 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls podName:f77a2c7c-9c50-400a-8982-b3e524240d5f nodeName:}" failed. No retries permitted until 2026-04-16 14:54:39.323647064 +0000 UTC m=+130.242008272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-wddbs" (UID: "f77a2c7c-9c50-400a-8982-b3e524240d5f") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:38.823919 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.823836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2201f8ec-763d-4bde-9b0c-b412c0a2c025-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.823979 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.823960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2201f8ec-763d-4bde-9b0c-b412c0a2c025-tmp\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.824320 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.824299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/2201f8ec-763d-4bde-9b0c-b412c0a2c025-snapshots\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.824355 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.824329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f77a2c7c-9c50-400a-8982-b3e524240d5f-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:54:38.824466 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.824450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2201f8ec-763d-4bde-9b0c-b412c0a2c025-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.827336 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.827314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2201f8ec-763d-4bde-9b0c-b412c0a2c025-serving-cert\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.830780 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.830745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k26qm\" (UniqueName: \"kubernetes.io/projected/2201f8ec-763d-4bde-9b0c-b412c0a2c025-kube-api-access-k26qm\") pod \"insights-operator-5785d4fcdd-g8xfd\" (UID: \"2201f8ec-763d-4bde-9b0c-b412c0a2c025\") " pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.830962 ip-10-0-130-140 
kubenswrapper[2576]: I0416 14:54:38.830933 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-525c4\" (UniqueName: \"kubernetes.io/projected/f77a2c7c-9c50-400a-8982-b3e524240d5f-kube-api-access-525c4\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:54:38.905746 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.905717 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" Apr 16 14:54:38.924818 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.924788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmlvk\" (UniqueName: \"kubernetes.io/projected/7ce1ceef-1ec7-4e59-a113-546b447470ba-kube-api-access-lmlvk\") pod \"volume-data-source-validator-7d955d5dd4-t267v\" (UID: \"7ce1ceef-1ec7-4e59-a113-546b447470ba\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-t267v" Apr 16 14:54:38.934530 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.934501 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmlvk\" (UniqueName: \"kubernetes.io/projected/7ce1ceef-1ec7-4e59-a113-546b447470ba-kube-api-access-lmlvk\") pod \"volume-data-source-validator-7d955d5dd4-t267v\" (UID: \"7ce1ceef-1ec7-4e59-a113-546b447470ba\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-t267v" Apr 16 14:54:38.995023 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:38.994993 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-t267v" Apr 16 14:54:39.017910 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.017874 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-g8xfd"] Apr 16 14:54:39.021327 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:54:39.021294 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2201f8ec_763d_4bde_9b0c_b412c0a2c025.slice/crio-10bd27a21eea515e7f5b965dd858c35fcababd713667989e950976ba2597cb22 WatchSource:0}: Error finding container 10bd27a21eea515e7f5b965dd858c35fcababd713667989e950976ba2597cb22: Status 404 returned error can't find the container with id 10bd27a21eea515e7f5b965dd858c35fcababd713667989e950976ba2597cb22 Apr 16 14:54:39.104455 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.104428 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-t267v"] Apr 16 14:54:39.107799 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:54:39.107751 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ce1ceef_1ec7_4e59_a113_546b447470ba.slice/crio-c7d2bbb7abd6479adbaba8f0eda842a3184c8a5823d2090990958f6194f4f333 WatchSource:0}: Error finding container c7d2bbb7abd6479adbaba8f0eda842a3184c8a5823d2090990958f6194f4f333: Status 404 returned error can't find the container with id c7d2bbb7abd6479adbaba8f0eda842a3184c8a5823d2090990958f6194f4f333 Apr 16 14:54:39.163190 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.163118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-t267v" 
event={"ID":"7ce1ceef-1ec7-4e59-a113-546b447470ba","Type":"ContainerStarted","Data":"c7d2bbb7abd6479adbaba8f0eda842a3184c8a5823d2090990958f6194f4f333"} Apr 16 14:54:39.164000 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.163976 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" event={"ID":"2201f8ec-763d-4bde-9b0c-b412c0a2c025","Type":"ContainerStarted","Data":"10bd27a21eea515e7f5b965dd858c35fcababd713667989e950976ba2597cb22"} Apr 16 14:54:39.328595 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.328562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:54:39.328754 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:39.328709 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:39.328822 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:39.328788 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls podName:f77a2c7c-9c50-400a-8982-b3e524240d5f nodeName:}" failed. No retries permitted until 2026-04-16 14:54:40.328757153 +0000 UTC m=+131.247118336 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-wddbs" (UID: "f77a2c7c-9c50-400a-8982-b3e524240d5f") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:39.429511 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.429435 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7f47bf7584-lxdlq"] Apr 16 14:54:39.432479 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.432456 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.434370 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.434337 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 14:54:39.434495 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.434395 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 14:54:39.434645 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.434626 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 14:54:39.434815 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.434790 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-g2vw7\"" Apr 16 14:54:39.439957 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.439918 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 14:54:39.446577 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.446551 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-7f47bf7584-lxdlq"] Apr 16 14:54:39.530479 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.530414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/62e35365-3b0f-458d-9bfa-47b33abcf28e-installation-pull-secrets\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.530479 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.530473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/62e35365-3b0f-458d-9bfa-47b33abcf28e-ca-trust-extracted\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.530720 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.530523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.530720 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.530543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-certificates\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.530720 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.530612 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/62e35365-3b0f-458d-9bfa-47b33abcf28e-image-registry-private-configuration\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.530720 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.530692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-bound-sa-token\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.530957 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.530751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z" Apr 16 14:54:39.530957 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.530807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62e35365-3b0f-458d-9bfa-47b33abcf28e-trusted-ca\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.530957 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.530841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxfr5\" (UniqueName: 
\"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-kube-api-access-xxfr5\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.530957 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:39.530941 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:54:39.531141 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:39.531012 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs podName:fc2d28ab-f651-462e-ae85-98e9780905b0 nodeName:}" failed. No retries permitted until 2026-04-16 14:56:41.530992352 +0000 UTC m=+252.449353548 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs") pod "network-metrics-daemon-7wx6z" (UID: "fc2d28ab-f651-462e-ae85-98e9780905b0") : secret "metrics-daemon-secret" not found Apr 16 14:54:39.632097 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.632047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-bound-sa-token\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.632547 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.632111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62e35365-3b0f-458d-9bfa-47b33abcf28e-trusted-ca\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.632547 ip-10-0-130-140 
kubenswrapper[2576]: I0416 14:54:39.632142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxfr5\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-kube-api-access-xxfr5\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.632547 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.632317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/62e35365-3b0f-458d-9bfa-47b33abcf28e-installation-pull-secrets\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.632547 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.632369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/62e35365-3b0f-458d-9bfa-47b33abcf28e-ca-trust-extracted\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.632547 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.632415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.632547 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.632456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-certificates\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.632547 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.632502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/62e35365-3b0f-458d-9bfa-47b33abcf28e-image-registry-private-configuration\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.632973 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:39.632580 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:54:39.632973 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:39.632600 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f47bf7584-lxdlq: secret "image-registry-tls" not found Apr 16 14:54:39.632973 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:39.632664 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls podName:62e35365-3b0f-458d-9bfa-47b33abcf28e nodeName:}" failed. No retries permitted until 2026-04-16 14:54:40.132643448 +0000 UTC m=+131.051004634 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls") pod "image-registry-7f47bf7584-lxdlq" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e") : secret "image-registry-tls" not found Apr 16 14:54:39.632973 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.632749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/62e35365-3b0f-458d-9bfa-47b33abcf28e-ca-trust-extracted\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.633174 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.633085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-certificates\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.633316 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.633268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62e35365-3b0f-458d-9bfa-47b33abcf28e-trusted-ca\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.635192 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.635152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/62e35365-3b0f-458d-9bfa-47b33abcf28e-image-registry-private-configuration\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " 
pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.635311 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.635287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/62e35365-3b0f-458d-9bfa-47b33abcf28e-installation-pull-secrets\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.640663 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.640635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-bound-sa-token\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:39.640860 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:39.640832 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxfr5\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-kube-api-access-xxfr5\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:40.137123 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:40.137083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:40.137321 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:40.137231 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 
14:54:40.137321 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:40.137257 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f47bf7584-lxdlq: secret "image-registry-tls" not found Apr 16 14:54:40.137321 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:40.137322 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls podName:62e35365-3b0f-458d-9bfa-47b33abcf28e nodeName:}" failed. No retries permitted until 2026-04-16 14:54:41.137305892 +0000 UTC m=+132.055667075 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls") pod "image-registry-7f47bf7584-lxdlq" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e") : secret "image-registry-tls" not found Apr 16 14:54:40.338990 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:40.338951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:54:40.339168 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:40.339096 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:40.339225 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:40.339170 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls podName:f77a2c7c-9c50-400a-8982-b3e524240d5f nodeName:}" failed. 
No retries permitted until 2026-04-16 14:54:42.339152661 +0000 UTC m=+133.257513864 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-wddbs" (UID: "f77a2c7c-9c50-400a-8982-b3e524240d5f") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:41.143698 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.143612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:41.144071 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:41.143760 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:54:41.144071 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:41.143795 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f47bf7584-lxdlq: secret "image-registry-tls" not found Apr 16 14:54:41.144071 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:41.143848 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls podName:62e35365-3b0f-458d-9bfa-47b33abcf28e nodeName:}" failed. No retries permitted until 2026-04-16 14:54:43.143830263 +0000 UTC m=+134.062191448 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls") pod "image-registry-7f47bf7584-lxdlq" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e") : secret "image-registry-tls" not found Apr 16 14:54:41.169631 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.169599 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-t267v" event={"ID":"7ce1ceef-1ec7-4e59-a113-546b447470ba","Type":"ContainerStarted","Data":"e8279c687632803001bf6888ad04ee114da97fcbefa31e50185dd0b1e9a3fd1c"} Apr 16 14:54:41.170847 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.170826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" event={"ID":"2201f8ec-763d-4bde-9b0c-b412c0a2c025","Type":"ContainerStarted","Data":"7d5bdae6bf3c616078164ec9ed0275da0369966b9695b6d62b101bee00619d4a"} Apr 16 14:54:41.182556 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.182509 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-t267v" podStartSLOduration=1.5549092820000001 podStartE2EDuration="3.182494668s" podCreationTimestamp="2026-04-16 14:54:38 +0000 UTC" firstStartedPulling="2026-04-16 14:54:39.109519849 +0000 UTC m=+130.027881032" lastFinishedPulling="2026-04-16 14:54:40.737105234 +0000 UTC m=+131.655466418" observedRunningTime="2026-04-16 14:54:41.181808883 +0000 UTC m=+132.100170093" watchObservedRunningTime="2026-04-16 14:54:41.182494668 +0000 UTC m=+132.100855877" Apr 16 14:54:41.197834 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.197790 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" podStartSLOduration=1.479715563 podStartE2EDuration="3.197759608s" 
podCreationTimestamp="2026-04-16 14:54:38 +0000 UTC" firstStartedPulling="2026-04-16 14:54:39.023159732 +0000 UTC m=+129.941520914" lastFinishedPulling="2026-04-16 14:54:40.741203759 +0000 UTC m=+131.659564959" observedRunningTime="2026-04-16 14:54:41.197360925 +0000 UTC m=+132.115722130" watchObservedRunningTime="2026-04-16 14:54:41.197759608 +0000 UTC m=+132.116120874" Apr 16 14:54:41.572885 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.572834 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6"] Apr 16 14:54:41.576157 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.576140 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" Apr 16 14:54:41.578032 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.577995 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 14:54:41.578032 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.578022 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 14:54:41.578245 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.578054 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:54:41.578245 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.578006 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 14:54:41.578245 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.577995 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-7rqt7\"" Apr 16 14:54:41.583390 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.583371 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6"] Apr 16 14:54:41.646862 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.646840 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff4ddc68-4c66-4144-b764-4cfde96015d7-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-x98m6\" (UID: \"ff4ddc68-4c66-4144-b764-4cfde96015d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" Apr 16 14:54:41.646983 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.646872 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxrfn\" (UniqueName: \"kubernetes.io/projected/ff4ddc68-4c66-4144-b764-4cfde96015d7-kube-api-access-dxrfn\") pod \"kube-storage-version-migrator-operator-756bb7d76f-x98m6\" (UID: \"ff4ddc68-4c66-4144-b764-4cfde96015d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" Apr 16 14:54:41.647028 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.647011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff4ddc68-4c66-4144-b764-4cfde96015d7-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-x98m6\" (UID: \"ff4ddc68-4c66-4144-b764-4cfde96015d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" Apr 16 14:54:41.747946 ip-10-0-130-140 kubenswrapper[2576]: I0416 
14:54:41.747914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff4ddc68-4c66-4144-b764-4cfde96015d7-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-x98m6\" (UID: \"ff4ddc68-4c66-4144-b764-4cfde96015d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" Apr 16 14:54:41.748110 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.747954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff4ddc68-4c66-4144-b764-4cfde96015d7-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-x98m6\" (UID: \"ff4ddc68-4c66-4144-b764-4cfde96015d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" Apr 16 14:54:41.748110 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.747980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxrfn\" (UniqueName: \"kubernetes.io/projected/ff4ddc68-4c66-4144-b764-4cfde96015d7-kube-api-access-dxrfn\") pod \"kube-storage-version-migrator-operator-756bb7d76f-x98m6\" (UID: \"ff4ddc68-4c66-4144-b764-4cfde96015d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" Apr 16 14:54:41.748600 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.748577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff4ddc68-4c66-4144-b764-4cfde96015d7-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-x98m6\" (UID: \"ff4ddc68-4c66-4144-b764-4cfde96015d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" Apr 16 14:54:41.750236 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.750218 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff4ddc68-4c66-4144-b764-4cfde96015d7-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-x98m6\" (UID: \"ff4ddc68-4c66-4144-b764-4cfde96015d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" Apr 16 14:54:41.754996 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.754979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxrfn\" (UniqueName: \"kubernetes.io/projected/ff4ddc68-4c66-4144-b764-4cfde96015d7-kube-api-access-dxrfn\") pod \"kube-storage-version-migrator-operator-756bb7d76f-x98m6\" (UID: \"ff4ddc68-4c66-4144-b764-4cfde96015d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" Apr 16 14:54:41.886651 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:41.886578 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" Apr 16 14:54:42.003137 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:42.003107 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6"] Apr 16 14:54:42.007873 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:54:42.007843 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff4ddc68_4c66_4144_b764_4cfde96015d7.slice/crio-7658aab0993edce8c9f07fe25799ad6263fd2750015a68a20a03880efb991285 WatchSource:0}: Error finding container 7658aab0993edce8c9f07fe25799ad6263fd2750015a68a20a03880efb991285: Status 404 returned error can't find the container with id 7658aab0993edce8c9f07fe25799ad6263fd2750015a68a20a03880efb991285 Apr 16 14:54:42.173758 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:42.173726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" event={"ID":"ff4ddc68-4c66-4144-b764-4cfde96015d7","Type":"ContainerStarted","Data":"7658aab0993edce8c9f07fe25799ad6263fd2750015a68a20a03880efb991285"} Apr 16 14:54:42.354106 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:42.354074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:54:42.354289 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:42.354180 2576 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:42.354289 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:42.354242 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls podName:f77a2c7c-9c50-400a-8982-b3e524240d5f nodeName:}" failed. No retries permitted until 2026-04-16 14:54:46.354228444 +0000 UTC m=+137.272589627 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-wddbs" (UID: "f77a2c7c-9c50-400a-8982-b3e524240d5f") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:43.160907 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:43.160869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:54:43.161078 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:43.161030 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:54:43.161078 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:43.161055 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f47bf7584-lxdlq: secret "image-registry-tls" not found Apr 16 14:54:43.161198 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:43.161123 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls 
podName:62e35365-3b0f-458d-9bfa-47b33abcf28e nodeName:}" failed. No retries permitted until 2026-04-16 14:54:47.161102489 +0000 UTC m=+138.079463679 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls") pod "image-registry-7f47bf7584-lxdlq" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e") : secret "image-registry-tls" not found Apr 16 14:54:44.028971 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.028940 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s9crx_3cebcaa8-957d-4f1e-b4f8-90637dae2bc0/dns-node-resolver/0.log" Apr 16 14:54:44.178040 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.178003 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" event={"ID":"ff4ddc68-4c66-4144-b764-4cfde96015d7","Type":"ContainerStarted","Data":"b31434716d9645094c1366332de9387cb36b3cc341c3314535cc567a0b34d0ff"} Apr 16 14:54:44.192308 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.192259 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" podStartSLOduration=1.708944603 podStartE2EDuration="3.192243343s" podCreationTimestamp="2026-04-16 14:54:41 +0000 UTC" firstStartedPulling="2026-04-16 14:54:42.009908158 +0000 UTC m=+132.928269342" lastFinishedPulling="2026-04-16 14:54:43.493206899 +0000 UTC m=+134.411568082" observedRunningTime="2026-04-16 14:54:44.190898621 +0000 UTC m=+135.109259840" watchObservedRunningTime="2026-04-16 14:54:44.192243343 +0000 UTC m=+135.110604549" Apr 16 14:54:44.573099 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.573063 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87"] Apr 16 14:54:44.576024 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.576009 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" Apr 16 14:54:44.577983 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.577957 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 14:54:44.578104 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.577995 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 14:54:44.578104 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.578010 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-jwvrs\"" Apr 16 14:54:44.578554 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.578535 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 14:54:44.578642 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.578553 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:54:44.584551 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.584532 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87"] Apr 16 14:54:44.677604 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.677568 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a682a0c3-02b2-44f9-b8bb-40ab354c9809-config\") pod \"service-ca-operator-69965bb79d-24n87\" (UID: 
\"a682a0c3-02b2-44f9-b8bb-40ab354c9809\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" Apr 16 14:54:44.677806 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.677615 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a682a0c3-02b2-44f9-b8bb-40ab354c9809-serving-cert\") pod \"service-ca-operator-69965bb79d-24n87\" (UID: \"a682a0c3-02b2-44f9-b8bb-40ab354c9809\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" Apr 16 14:54:44.677806 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.677694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ch8k\" (UniqueName: \"kubernetes.io/projected/a682a0c3-02b2-44f9-b8bb-40ab354c9809-kube-api-access-6ch8k\") pod \"service-ca-operator-69965bb79d-24n87\" (UID: \"a682a0c3-02b2-44f9-b8bb-40ab354c9809\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" Apr 16 14:54:44.778749 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.778719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a682a0c3-02b2-44f9-b8bb-40ab354c9809-config\") pod \"service-ca-operator-69965bb79d-24n87\" (UID: \"a682a0c3-02b2-44f9-b8bb-40ab354c9809\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" Apr 16 14:54:44.778920 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.778755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a682a0c3-02b2-44f9-b8bb-40ab354c9809-serving-cert\") pod \"service-ca-operator-69965bb79d-24n87\" (UID: \"a682a0c3-02b2-44f9-b8bb-40ab354c9809\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" Apr 16 14:54:44.778920 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.778834 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ch8k\" (UniqueName: \"kubernetes.io/projected/a682a0c3-02b2-44f9-b8bb-40ab354c9809-kube-api-access-6ch8k\") pod \"service-ca-operator-69965bb79d-24n87\" (UID: \"a682a0c3-02b2-44f9-b8bb-40ab354c9809\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" Apr 16 14:54:44.779354 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.779333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a682a0c3-02b2-44f9-b8bb-40ab354c9809-config\") pod \"service-ca-operator-69965bb79d-24n87\" (UID: \"a682a0c3-02b2-44f9-b8bb-40ab354c9809\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" Apr 16 14:54:44.781036 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.781019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a682a0c3-02b2-44f9-b8bb-40ab354c9809-serving-cert\") pod \"service-ca-operator-69965bb79d-24n87\" (UID: \"a682a0c3-02b2-44f9-b8bb-40ab354c9809\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" Apr 16 14:54:44.785817 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.785794 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ch8k\" (UniqueName: \"kubernetes.io/projected/a682a0c3-02b2-44f9-b8bb-40ab354c9809-kube-api-access-6ch8k\") pod \"service-ca-operator-69965bb79d-24n87\" (UID: \"a682a0c3-02b2-44f9-b8bb-40ab354c9809\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" Apr 16 14:54:44.885200 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.885119 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87"
Apr 16 14:54:44.994338 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.994307 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-whfzb"]
Apr 16 14:54:44.998453 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:44.998436 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-whfzb"
Apr 16 14:54:45.000128 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.000088 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87"]
Apr 16 14:54:45.000442 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.000424 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-2kj5t\""
Apr 16 14:54:45.003286 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.003262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-whfzb"]
Apr 16 14:54:45.004043 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:54:45.004020 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda682a0c3_02b2_44f9_b8bb_40ab354c9809.slice/crio-0f68bdb7f878d28d8faea72804b74d3223287f351f2fabf365d40620ad9bd295 WatchSource:0}: Error finding container 0f68bdb7f878d28d8faea72804b74d3223287f351f2fabf365d40620ad9bd295: Status 404 returned error can't find the container with id 0f68bdb7f878d28d8faea72804b74d3223287f351f2fabf365d40620ad9bd295
Apr 16 14:54:45.028928 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.028909 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tsg7q_e27e48da-a6dc-4e84-87f4-01916a11e065/node-ca/0.log"
Apr 16 14:54:45.081487 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.081456 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fntk9\" (UniqueName: \"kubernetes.io/projected/2413d25f-8416-4493-bebc-794f33e6f210-kube-api-access-fntk9\") pod \"network-check-source-7b678d77c7-whfzb\" (UID: \"2413d25f-8416-4493-bebc-794f33e6f210\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-whfzb"
Apr 16 14:54:45.181016 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.180985 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" event={"ID":"a682a0c3-02b2-44f9-b8bb-40ab354c9809","Type":"ContainerStarted","Data":"0f68bdb7f878d28d8faea72804b74d3223287f351f2fabf365d40620ad9bd295"}
Apr 16 14:54:45.182475 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.182451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fntk9\" (UniqueName: \"kubernetes.io/projected/2413d25f-8416-4493-bebc-794f33e6f210-kube-api-access-fntk9\") pod \"network-check-source-7b678d77c7-whfzb\" (UID: \"2413d25f-8416-4493-bebc-794f33e6f210\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-whfzb"
Apr 16 14:54:45.201151 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.201122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fntk9\" (UniqueName: \"kubernetes.io/projected/2413d25f-8416-4493-bebc-794f33e6f210-kube-api-access-fntk9\") pod \"network-check-source-7b678d77c7-whfzb\" (UID: \"2413d25f-8416-4493-bebc-794f33e6f210\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-whfzb"
Apr 16 14:54:45.222375 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.222352 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-gx74z"]
Apr 16 14:54:45.225480 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.225458 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gx74z"
Apr 16 14:54:45.227566 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.227543 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 14:54:45.227701 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.227640 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 14:54:45.227701 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.227652 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-zck49\""
Apr 16 14:54:45.231908 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.231885 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-gx74z"]
Apr 16 14:54:45.314425 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.314396 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-whfzb"
Apr 16 14:54:45.384605 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.384402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wwrv\" (UniqueName: \"kubernetes.io/projected/fd05d685-bbeb-4d6a-b14d-ec2b3dd85339-kube-api-access-2wwrv\") pod \"migrator-64d4d94569-gx74z\" (UID: \"fd05d685-bbeb-4d6a-b14d-ec2b3dd85339\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gx74z"
Apr 16 14:54:45.428741 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.428708 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-whfzb"]
Apr 16 14:54:45.432670 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:54:45.432610 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2413d25f_8416_4493_bebc_794f33e6f210.slice/crio-d6cfdf74a64a5b2eb6a08464ac046cbc445f3d31efb53bbca94cbd13029b27ba WatchSource:0}: Error finding container d6cfdf74a64a5b2eb6a08464ac046cbc445f3d31efb53bbca94cbd13029b27ba: Status 404 returned error can't find the container with id d6cfdf74a64a5b2eb6a08464ac046cbc445f3d31efb53bbca94cbd13029b27ba
Apr 16 14:54:45.485996 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.485949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wwrv\" (UniqueName: \"kubernetes.io/projected/fd05d685-bbeb-4d6a-b14d-ec2b3dd85339-kube-api-access-2wwrv\") pod \"migrator-64d4d94569-gx74z\" (UID: \"fd05d685-bbeb-4d6a-b14d-ec2b3dd85339\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gx74z"
Apr 16 14:54:45.493677 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.493647 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wwrv\" (UniqueName: \"kubernetes.io/projected/fd05d685-bbeb-4d6a-b14d-ec2b3dd85339-kube-api-access-2wwrv\") pod \"migrator-64d4d94569-gx74z\" (UID: \"fd05d685-bbeb-4d6a-b14d-ec2b3dd85339\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gx74z"
Apr 16 14:54:45.535749 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.535704 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gx74z"
Apr 16 14:54:45.670120 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:45.670090 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-gx74z"]
Apr 16 14:54:45.674112 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:54:45.674086 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd05d685_bbeb_4d6a_b14d_ec2b3dd85339.slice/crio-1407486dc64ddd70348d2a767b0dd7a6f5ebf4a88d21b9e64447ac07c88d6d78 WatchSource:0}: Error finding container 1407486dc64ddd70348d2a767b0dd7a6f5ebf4a88d21b9e64447ac07c88d6d78: Status 404 returned error can't find the container with id 1407486dc64ddd70348d2a767b0dd7a6f5ebf4a88d21b9e64447ac07c88d6d78
Apr 16 14:54:46.186446 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:46.186399 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-whfzb" event={"ID":"2413d25f-8416-4493-bebc-794f33e6f210","Type":"ContainerStarted","Data":"281f6264eaeb5fd032fcb6a4b3c50b879a1a0853743f34de1c3c94d0afb11abb"}
Apr 16 14:54:46.186446 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:46.186442 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-whfzb" event={"ID":"2413d25f-8416-4493-bebc-794f33e6f210","Type":"ContainerStarted","Data":"d6cfdf74a64a5b2eb6a08464ac046cbc445f3d31efb53bbca94cbd13029b27ba"}
Apr 16 14:54:46.187876 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:46.187840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gx74z" event={"ID":"fd05d685-bbeb-4d6a-b14d-ec2b3dd85339","Type":"ContainerStarted","Data":"1407486dc64ddd70348d2a767b0dd7a6f5ebf4a88d21b9e64447ac07c88d6d78"}
Apr 16 14:54:46.201205 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:46.201143 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-whfzb" podStartSLOduration=2.201126713 podStartE2EDuration="2.201126713s" podCreationTimestamp="2026-04-16 14:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:54:46.20054187 +0000 UTC m=+137.118903076" watchObservedRunningTime="2026-04-16 14:54:46.201126713 +0000 UTC m=+137.119487938"
Apr 16 14:54:46.393530 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:46.393490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs"
Apr 16 14:54:46.393701 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:46.393672 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 14:54:46.393784 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:46.393756 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls podName:f77a2c7c-9c50-400a-8982-b3e524240d5f nodeName:}" failed. No retries permitted until 2026-04-16 14:54:54.393734222 +0000 UTC m=+145.312095411 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-wddbs" (UID: "f77a2c7c-9c50-400a-8982-b3e524240d5f") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:54:47.192424 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:47.192325 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gx74z" event={"ID":"fd05d685-bbeb-4d6a-b14d-ec2b3dd85339","Type":"ContainerStarted","Data":"6ce39aaff5952a0375eb90f40ced9b362e9f03282b0915603e4dc3cc0726ad9e"}
Apr 16 14:54:47.192424 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:47.192370 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gx74z" event={"ID":"fd05d685-bbeb-4d6a-b14d-ec2b3dd85339","Type":"ContainerStarted","Data":"b93c99dcf0827389ea2171551b792749323b6fe497cf65043d834d49b5a3012b"}
Apr 16 14:54:47.193947 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:47.193917 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" event={"ID":"a682a0c3-02b2-44f9-b8bb-40ab354c9809","Type":"ContainerStarted","Data":"f74f0d638ae6da6143fe0f39a3a7b7aacdc28ea860bf74cf9e3b4016c3107c0b"}
Apr 16 14:54:47.201291 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:47.201262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq"
Apr 16 14:54:47.201446 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:47.201431 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:54:47.201512 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:47.201450 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f47bf7584-lxdlq: secret "image-registry-tls" not found
Apr 16 14:54:47.201512 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:47.201501 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls podName:62e35365-3b0f-458d-9bfa-47b33abcf28e nodeName:}" failed. No retries permitted until 2026-04-16 14:54:55.201484643 +0000 UTC m=+146.119845832 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls") pod "image-registry-7f47bf7584-lxdlq" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e") : secret "image-registry-tls" not found
Apr 16 14:54:47.208331 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:47.208286 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gx74z" podStartSLOduration=0.953126762 podStartE2EDuration="2.208270112s" podCreationTimestamp="2026-04-16 14:54:45 +0000 UTC" firstStartedPulling="2026-04-16 14:54:45.676618382 +0000 UTC m=+136.594979587" lastFinishedPulling="2026-04-16 14:54:46.931761739 +0000 UTC m=+137.850122937" observedRunningTime="2026-04-16 14:54:47.207137985 +0000 UTC m=+138.125499191" watchObservedRunningTime="2026-04-16 14:54:47.208270112 +0000 UTC m=+138.126631319"
Apr 16 14:54:47.220353 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:47.220298 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" podStartSLOduration=1.29436671 podStartE2EDuration="3.220278763s" podCreationTimestamp="2026-04-16 14:54:44 +0000 UTC" firstStartedPulling="2026-04-16 14:54:45.006241218 +0000 UTC m=+135.924602401" lastFinishedPulling="2026-04-16 14:54:46.932153258 +0000 UTC m=+137.850514454" observedRunningTime="2026-04-16 14:54:47.219166421 +0000 UTC m=+138.137527627" watchObservedRunningTime="2026-04-16 14:54:47.220278763 +0000 UTC m=+138.138639972"
Apr 16 14:54:54.460429 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:54.460378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs"
Apr 16 14:54:54.460821 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:54.460499 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 14:54:54.460821 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:54.460566 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls podName:f77a2c7c-9c50-400a-8982-b3e524240d5f nodeName:}" failed. No retries permitted until 2026-04-16 14:55:10.460550085 +0000 UTC m=+161.378911269 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-wddbs" (UID: "f77a2c7c-9c50-400a-8982-b3e524240d5f") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:54:55.266702 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:54:55.266670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq"
Apr 16 14:54:55.266861 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:55.266823 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:54:55.266861 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:55.266842 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f47bf7584-lxdlq: secret "image-registry-tls" not found
Apr 16 14:54:55.266945 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:54:55.266906 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls podName:62e35365-3b0f-458d-9bfa-47b33abcf28e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:11.266890142 +0000 UTC m=+162.185251326 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls") pod "image-registry-7f47bf7584-lxdlq" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e") : secret "image-registry-tls" not found
Apr 16 14:55:05.491833 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:55:05.491792 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wwb2b" podUID="7ede1baa-a6e7-4b5e-8723-94a6c70847e3"
Apr 16 14:55:05.498908 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:55:05.498876 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-n8878" podUID="fb790f7c-1dc9-4bf8-a9e6-1054b49e346c"
Apr 16 14:55:05.820619 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:55:05.820530 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7wx6z" podUID="fc2d28ab-f651-462e-ae85-98e9780905b0"
Apr 16 14:55:06.237860 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:06.237827 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n8878"
Apr 16 14:55:06.238023 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:06.237831 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:55:09.343570 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.343536 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7f47bf7584-lxdlq"]
Apr 16 14:55:09.344046 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:55:09.343756 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" podUID="62e35365-3b0f-458d-9bfa-47b33abcf28e"
Apr 16 14:55:09.346258 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.346234 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-nbr9m"]
Apr 16 14:55:09.350312 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.350297 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.352301 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.352277 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bg2mj\""
Apr 16 14:55:09.352488 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.352472 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 14:55:09.352598 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.352582 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 14:55:09.357968 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.357950 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nbr9m"]
Apr 16 14:55:09.376266 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.376242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/47d8c674-a5dd-4235-aecc-2923a5a0809c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.376562 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.376540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/47d8c674-a5dd-4235-aecc-2923a5a0809c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.376846 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.376824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/47d8c674-a5dd-4235-aecc-2923a5a0809c-crio-socket\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.377002 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.376986 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qrvt\" (UniqueName: \"kubernetes.io/projected/47d8c674-a5dd-4235-aecc-2923a5a0809c-kube-api-access-4qrvt\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.377146 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.377133 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/47d8c674-a5dd-4235-aecc-2923a5a0809c-data-volume\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.440122 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.440093 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-kpbxh"]
Apr 16 14:55:09.443214 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.443193 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-kpbxh"
Apr 16 14:55:09.445081 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.445048 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 14:55:09.445169 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.445107 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-wt8sz\""
Apr 16 14:55:09.445169 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.445141 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 14:55:09.452088 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.452067 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-kpbxh"]
Apr 16 14:55:09.478081 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.478055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/47d8c674-a5dd-4235-aecc-2923a5a0809c-data-volume\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.478081 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.478090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/47d8c674-a5dd-4235-aecc-2923a5a0809c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.478269 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.478121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/47d8c674-a5dd-4235-aecc-2923a5a0809c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.478321 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.478300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ccdr\" (UniqueName: \"kubernetes.io/projected/3c7ac47a-551c-4399-8338-c3942554bedd-kube-api-access-7ccdr\") pod \"downloads-586b57c7b4-kpbxh\" (UID: \"3c7ac47a-551c-4399-8338-c3942554bedd\") " pod="openshift-console/downloads-586b57c7b4-kpbxh"
Apr 16 14:55:09.478380 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.478365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/47d8c674-a5dd-4235-aecc-2923a5a0809c-crio-socket\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.478419 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.478402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qrvt\" (UniqueName: \"kubernetes.io/projected/47d8c674-a5dd-4235-aecc-2923a5a0809c-kube-api-access-4qrvt\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.478462 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.478427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/47d8c674-a5dd-4235-aecc-2923a5a0809c-crio-socket\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.478497 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.478472 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/47d8c674-a5dd-4235-aecc-2923a5a0809c-data-volume\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.478677 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.478646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/47d8c674-a5dd-4235-aecc-2923a5a0809c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.480520 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.480496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/47d8c674-a5dd-4235-aecc-2923a5a0809c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.487006 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.486985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qrvt\" (UniqueName: \"kubernetes.io/projected/47d8c674-a5dd-4235-aecc-2923a5a0809c-kube-api-access-4qrvt\") pod \"insights-runtime-extractor-nbr9m\" (UID: \"47d8c674-a5dd-4235-aecc-2923a5a0809c\") " pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.579414 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.579374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ccdr\" (UniqueName: \"kubernetes.io/projected/3c7ac47a-551c-4399-8338-c3942554bedd-kube-api-access-7ccdr\") pod \"downloads-586b57c7b4-kpbxh\" (UID: \"3c7ac47a-551c-4399-8338-c3942554bedd\") " pod="openshift-console/downloads-586b57c7b4-kpbxh"
Apr 16 14:55:09.586267 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.586246 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ccdr\" (UniqueName: \"kubernetes.io/projected/3c7ac47a-551c-4399-8338-c3942554bedd-kube-api-access-7ccdr\") pod \"downloads-586b57c7b4-kpbxh\" (UID: \"3c7ac47a-551c-4399-8338-c3942554bedd\") " pod="openshift-console/downloads-586b57c7b4-kpbxh"
Apr 16 14:55:09.659153 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.659132 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nbr9m"
Apr 16 14:55:09.752680 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.752605 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-kpbxh"
Apr 16 14:55:09.792198 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.792162 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nbr9m"]
Apr 16 14:55:09.797985 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:55:09.797589 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47d8c674_a5dd_4235_aecc_2923a5a0809c.slice/crio-3f863ef913444f1fe7e0b652b6346caf02b48f3ffa897a70ddbb326879c980d1 WatchSource:0}: Error finding container 3f863ef913444f1fe7e0b652b6346caf02b48f3ffa897a70ddbb326879c980d1: Status 404 returned error can't find the container with id 3f863ef913444f1fe7e0b652b6346caf02b48f3ffa897a70ddbb326879c980d1
Apr 16 14:55:09.873375 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:09.873347 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-kpbxh"]
Apr 16 14:55:09.877077 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:55:09.877047 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c7ac47a_551c_4399_8338_c3942554bedd.slice/crio-09cfbaaa32ea5a168df45d7a305ee88e62d773ebc49af41ea8a2847229b4287c WatchSource:0}: Error finding container 09cfbaaa32ea5a168df45d7a305ee88e62d773ebc49af41ea8a2847229b4287c: Status 404 returned error can't find the container with id 09cfbaaa32ea5a168df45d7a305ee88e62d773ebc49af41ea8a2847229b4287c
Apr 16 14:55:10.248232 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.248194 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nbr9m" event={"ID":"47d8c674-a5dd-4235-aecc-2923a5a0809c","Type":"ContainerStarted","Data":"b75bf02c5afba5687bfda2e006a62e4583a56dd9a7cb5a37a8cce02b093591f1"}
Apr 16 14:55:10.248413 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.248233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nbr9m" event={"ID":"47d8c674-a5dd-4235-aecc-2923a5a0809c","Type":"ContainerStarted","Data":"3f863ef913444f1fe7e0b652b6346caf02b48f3ffa897a70ddbb326879c980d1"}
Apr 16 14:55:10.249211 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.249180 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-kpbxh" event={"ID":"3c7ac47a-551c-4399-8338-c3942554bedd","Type":"ContainerStarted","Data":"09cfbaaa32ea5a168df45d7a305ee88e62d773ebc49af41ea8a2847229b4287c"}
Apr 16 14:55:10.249330 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.249229 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq"
Apr 16 14:55:10.253219 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.253203 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq"
Apr 16 14:55:10.284147 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.284127 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/62e35365-3b0f-458d-9bfa-47b33abcf28e-installation-pull-secrets\") pod \"62e35365-3b0f-458d-9bfa-47b33abcf28e\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") "
Apr 16 14:55:10.284235 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.284158 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62e35365-3b0f-458d-9bfa-47b33abcf28e-trusted-ca\") pod \"62e35365-3b0f-458d-9bfa-47b33abcf28e\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") "
Apr 16 14:55:10.284235 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.284178 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxfr5\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-kube-api-access-xxfr5\") pod \"62e35365-3b0f-458d-9bfa-47b33abcf28e\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") "
Apr 16 14:55:10.284235 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.284204 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/62e35365-3b0f-458d-9bfa-47b33abcf28e-image-registry-private-configuration\") pod \"62e35365-3b0f-458d-9bfa-47b33abcf28e\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") "
Apr 16 14:55:10.284362 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.284248 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-bound-sa-token\") pod \"62e35365-3b0f-458d-9bfa-47b33abcf28e\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") "
Apr 16 14:55:10.284362 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.284294 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/62e35365-3b0f-458d-9bfa-47b33abcf28e-ca-trust-extracted\") pod \"62e35365-3b0f-458d-9bfa-47b33abcf28e\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") "
Apr 16 14:55:10.284362 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.284318 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-certificates\") pod \"62e35365-3b0f-458d-9bfa-47b33abcf28e\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") "
Apr 16 14:55:10.284610 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.284570 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62e35365-3b0f-458d-9bfa-47b33abcf28e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "62e35365-3b0f-458d-9bfa-47b33abcf28e" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:55:10.284722 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.284640 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62e35365-3b0f-458d-9bfa-47b33abcf28e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "62e35365-3b0f-458d-9bfa-47b33abcf28e" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:55:10.284851 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.284827 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "62e35365-3b0f-458d-9bfa-47b33abcf28e" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:55:10.286381 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.286356 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e35365-3b0f-458d-9bfa-47b33abcf28e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "62e35365-3b0f-458d-9bfa-47b33abcf28e" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:55:10.286505 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.286447 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e35365-3b0f-458d-9bfa-47b33abcf28e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "62e35365-3b0f-458d-9bfa-47b33abcf28e" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:55:10.286505 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.286489 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-kube-api-access-xxfr5" (OuterVolumeSpecName: "kube-api-access-xxfr5") pod "62e35365-3b0f-458d-9bfa-47b33abcf28e" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e"). InnerVolumeSpecName "kube-api-access-xxfr5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:55:10.286603 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.286584 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "62e35365-3b0f-458d-9bfa-47b33abcf28e" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e"). InnerVolumeSpecName "bound-sa-token".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:10.385675 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.385642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878" Apr 16 14:55:10.386124 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.385741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b" Apr 16 14:55:10.386124 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.385806 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/62e35365-3b0f-458d-9bfa-47b33abcf28e-ca-trust-extracted\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:10.386124 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.385824 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-certificates\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:10.386124 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.385841 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/62e35365-3b0f-458d-9bfa-47b33abcf28e-installation-pull-secrets\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:10.386124 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.385855 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/62e35365-3b0f-458d-9bfa-47b33abcf28e-trusted-ca\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:10.386124 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.385868 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxfr5\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-kube-api-access-xxfr5\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:10.386124 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.385881 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/62e35365-3b0f-458d-9bfa-47b33abcf28e-image-registry-private-configuration\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:10.386124 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.385897 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-bound-sa-token\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:10.388298 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.388268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ede1baa-a6e7-4b5e-8723-94a6c70847e3-metrics-tls\") pod \"dns-default-wwb2b\" (UID: \"7ede1baa-a6e7-4b5e-8723-94a6c70847e3\") " pod="openshift-dns/dns-default-wwb2b" Apr 16 14:55:10.388524 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.388492 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb790f7c-1dc9-4bf8-a9e6-1054b49e346c-cert\") pod \"ingress-canary-n8878\" (UID: \"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c\") " pod="openshift-ingress-canary/ingress-canary-n8878" Apr 16 14:55:10.440133 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.440102 2576 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9hwfz\"" Apr 16 14:55:10.440320 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.440217 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cvt8w\"" Apr 16 14:55:10.448915 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.448888 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n8878" Apr 16 14:55:10.449073 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.448996 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wwb2b" Apr 16 14:55:10.487230 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.487197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:55:10.489885 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.489856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f77a2c7c-9c50-400a-8982-b3e524240d5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-wddbs\" (UID: \"f77a2c7c-9c50-400a-8982-b3e524240d5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:55:10.636800 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.636748 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wwb2b"] Apr 16 14:55:10.640341 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:55:10.640312 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ede1baa_a6e7_4b5e_8723_94a6c70847e3.slice/crio-4a44ca6a4d9e668cf9428ee80035eaec5e5894f0b5d39850fcaa8a7a25b7ec07 WatchSource:0}: Error finding container 4a44ca6a4d9e668cf9428ee80035eaec5e5894f0b5d39850fcaa8a7a25b7ec07: Status 404 returned error can't find the container with id 4a44ca6a4d9e668cf9428ee80035eaec5e5894f0b5d39850fcaa8a7a25b7ec07 Apr 16 14:55:10.652485 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.652459 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n8878"] Apr 16 14:55:10.656700 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:55:10.656664 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb790f7c_1dc9_4bf8_a9e6_1054b49e346c.slice/crio-047fe6ac02f1da51b5a0015894078b3c0c3ad21608c01ade39c2516a18a6da5b WatchSource:0}: Error finding container 047fe6ac02f1da51b5a0015894078b3c0c3ad21608c01ade39c2516a18a6da5b: Status 404 returned error can't find the container with id 047fe6ac02f1da51b5a0015894078b3c0c3ad21608c01ade39c2516a18a6da5b Apr 16 14:55:10.711208 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.711178 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" Apr 16 14:55:10.839041 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:10.839012 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs"] Apr 16 14:55:10.842291 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:55:10.842247 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf77a2c7c_9c50_400a_8982_b3e524240d5f.slice/crio-22881c2713a1bf1f1dccc2a49c35b857875152395f655d763cc5cb1bf7db18e7 WatchSource:0}: Error finding container 22881c2713a1bf1f1dccc2a49c35b857875152395f655d763cc5cb1bf7db18e7: Status 404 returned error can't find the container with id 22881c2713a1bf1f1dccc2a49c35b857875152395f655d763cc5cb1bf7db18e7 Apr 16 14:55:11.259962 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:11.259893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nbr9m" event={"ID":"47d8c674-a5dd-4235-aecc-2923a5a0809c","Type":"ContainerStarted","Data":"dc31440887f862c2910e6ea67909e82ae302d8cf0f167f8b9346d937196891f2"} Apr 16 14:55:11.262265 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:11.262214 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" event={"ID":"f77a2c7c-9c50-400a-8982-b3e524240d5f","Type":"ContainerStarted","Data":"22881c2713a1bf1f1dccc2a49c35b857875152395f655d763cc5cb1bf7db18e7"} Apr 16 14:55:11.264078 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:11.264026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wwb2b" event={"ID":"7ede1baa-a6e7-4b5e-8723-94a6c70847e3","Type":"ContainerStarted","Data":"4a44ca6a4d9e668cf9428ee80035eaec5e5894f0b5d39850fcaa8a7a25b7ec07"} Apr 16 14:55:11.269207 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:11.267832 2576 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:55:11.269207 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:11.268656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n8878" event={"ID":"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c","Type":"ContainerStarted","Data":"047fe6ac02f1da51b5a0015894078b3c0c3ad21608c01ade39c2516a18a6da5b"} Apr 16 14:55:11.297191 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:11.297111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls\") pod \"image-registry-7f47bf7584-lxdlq\" (UID: \"62e35365-3b0f-458d-9bfa-47b33abcf28e\") " pod="openshift-image-registry/image-registry-7f47bf7584-lxdlq" Apr 16 14:55:11.297356 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:55:11.297249 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: object "openshift-image-registry"/"image-registry-tls" not registered Apr 16 14:55:11.297356 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:55:11.297267 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f47bf7584-lxdlq: object "openshift-image-registry"/"image-registry-tls" not registered Apr 16 14:55:11.297356 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:55:11.297325 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls podName:62e35365-3b0f-458d-9bfa-47b33abcf28e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:43.297302006 +0000 UTC m=+194.215663203 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls") pod "image-registry-7f47bf7584-lxdlq" (UID: "62e35365-3b0f-458d-9bfa-47b33abcf28e") : object "openshift-image-registry"/"image-registry-tls" not registered Apr 16 14:55:11.309909 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:11.309882 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7f47bf7584-lxdlq"] Apr 16 14:55:11.313795 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:11.313720 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7f47bf7584-lxdlq"] Apr 16 14:55:11.398173 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:11.398140 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/62e35365-3b0f-458d-9bfa-47b33abcf28e-registry-tls\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:11.806988 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:11.806953 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62e35365-3b0f-458d-9bfa-47b33abcf28e" path="/var/lib/kubelet/pods/62e35365-3b0f-458d-9bfa-47b33abcf28e/volumes" Apr 16 14:55:12.580104 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.580069 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69554f6858-9zslf"] Apr 16 14:55:12.584964 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.584928 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.587639 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.587343 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-kgz7m\"" Apr 16 14:55:12.587639 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.587371 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 14:55:12.587639 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.587422 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 14:55:12.587639 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.587445 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 14:55:12.587959 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.587708 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 14:55:12.587959 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.587812 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 14:55:12.592430 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.592408 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69554f6858-9zslf"] Apr 16 14:55:12.609289 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.609265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-oauth-serving-cert\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.609404 ip-10-0-130-140 kubenswrapper[2576]: 
I0416 14:55:12.609308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-config\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.609404 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.609348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-oauth-config\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.609506 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.609408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9tlg\" (UniqueName: \"kubernetes.io/projected/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-kube-api-access-k9tlg\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.609506 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.609431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-service-ca\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.609594 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.609515 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-serving-cert\") pod \"console-69554f6858-9zslf\" 
(UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.710709 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.710678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-config\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.710885 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.710729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-oauth-config\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.710885 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.710801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9tlg\" (UniqueName: \"kubernetes.io/projected/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-kube-api-access-k9tlg\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.710885 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.710823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-service-ca\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.710885 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.710855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-serving-cert\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.711101 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.710903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-oauth-serving-cert\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.711512 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.711476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-config\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.711626 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.711548 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-oauth-serving-cert\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.711832 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.711788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-service-ca\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.713828 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.713807 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-oauth-config\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.714022 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.714003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-serving-cert\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.718707 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.718683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9tlg\" (UniqueName: \"kubernetes.io/projected/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-kube-api-access-k9tlg\") pod \"console-69554f6858-9zslf\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") " pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:12.898626 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:12.898547 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:14.132950 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:14.132915 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69554f6858-9zslf"] Apr 16 14:55:14.138078 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:55:14.138029 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc435cf5_c1e8_403d_89ff_aeb4d6c30c15.slice/crio-d3005e868ddab8eb8eaf0525d6710f727e255f917d6cafba28094ab174cb4d10 WatchSource:0}: Error finding container d3005e868ddab8eb8eaf0525d6710f727e255f917d6cafba28094ab174cb4d10: Status 404 returned error can't find the container with id d3005e868ddab8eb8eaf0525d6710f727e255f917d6cafba28094ab174cb4d10 Apr 16 14:55:14.280048 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:14.280006 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wwb2b" event={"ID":"7ede1baa-a6e7-4b5e-8723-94a6c70847e3","Type":"ContainerStarted","Data":"5297ae09591fb6c041fbd97fb601942dec2473304dccbdea58c3c894699c130d"} Apr 16 14:55:14.281523 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:14.281492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69554f6858-9zslf" event={"ID":"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15","Type":"ContainerStarted","Data":"d3005e868ddab8eb8eaf0525d6710f727e255f917d6cafba28094ab174cb4d10"} Apr 16 14:55:14.283645 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:14.283597 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n8878" event={"ID":"fb790f7c-1dc9-4bf8-a9e6-1054b49e346c","Type":"ContainerStarted","Data":"d416a1e30314016f59ad8e3cfceff7f1e22ee16332b7b9efa62a3e944de9ca63"} Apr 16 14:55:14.286932 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:14.286894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-nbr9m" event={"ID":"47d8c674-a5dd-4235-aecc-2923a5a0809c","Type":"ContainerStarted","Data":"03bcd73bfdbb3b4bb3f75d2585b080cd4049475b2448b518691f5add79581165"} Apr 16 14:55:14.289077 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:14.289042 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" event={"ID":"f77a2c7c-9c50-400a-8982-b3e524240d5f","Type":"ContainerStarted","Data":"4c60e96c375020be4215fc398ded7496cb5bbae1957738be4f51d99a311baa29"} Apr 16 14:55:14.318386 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:14.318205 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-n8878" podStartSLOduration=129.000765725 podStartE2EDuration="2m12.318184835s" podCreationTimestamp="2026-04-16 14:53:02 +0000 UTC" firstStartedPulling="2026-04-16 14:55:10.659224351 +0000 UTC m=+161.577585534" lastFinishedPulling="2026-04-16 14:55:13.976643449 +0000 UTC m=+164.895004644" observedRunningTime="2026-04-16 14:55:14.297639719 +0000 UTC m=+165.216000926" watchObservedRunningTime="2026-04-16 14:55:14.318184835 +0000 UTC m=+165.236546041" Apr 16 14:55:14.319183 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:14.318902 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-nbr9m" podStartSLOduration=1.201400969 podStartE2EDuration="5.318879653s" podCreationTimestamp="2026-04-16 14:55:09 +0000 UTC" firstStartedPulling="2026-04-16 14:55:09.859447008 +0000 UTC m=+160.777808192" lastFinishedPulling="2026-04-16 14:55:13.97692568 +0000 UTC m=+164.895286876" observedRunningTime="2026-04-16 14:55:14.31730778 +0000 UTC m=+165.235668986" watchObservedRunningTime="2026-04-16 14:55:14.318879653 +0000 UTC m=+165.237240858" Apr 16 14:55:14.334703 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:14.332472 2576 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-wddbs" podStartSLOduration=33.200074333 podStartE2EDuration="36.332456353s" podCreationTimestamp="2026-04-16 14:54:38 +0000 UTC" firstStartedPulling="2026-04-16 14:55:10.844462059 +0000 UTC m=+161.762823243" lastFinishedPulling="2026-04-16 14:55:13.976844068 +0000 UTC m=+164.895205263" observedRunningTime="2026-04-16 14:55:14.331856978 +0000 UTC m=+165.250218184" watchObservedRunningTime="2026-04-16 14:55:14.332456353 +0000 UTC m=+165.250817568"
Apr 16 14:55:15.295503 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:15.295308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wwb2b" event={"ID":"7ede1baa-a6e7-4b5e-8723-94a6c70847e3","Type":"ContainerStarted","Data":"0db5d221654d65fca63870f04e0adbe0920022811c2c52a26cf7916c89e0d527"}
Apr 16 14:55:15.296095 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:15.296073 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wwb2b"
Apr 16 14:55:15.312464 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:15.312409 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wwb2b" podStartSLOduration=129.97480621 podStartE2EDuration="2m13.312394045s" podCreationTimestamp="2026-04-16 14:53:02 +0000 UTC" firstStartedPulling="2026-04-16 14:55:10.642855415 +0000 UTC m=+161.561216612" lastFinishedPulling="2026-04-16 14:55:13.980443249 +0000 UTC m=+164.898804447" observedRunningTime="2026-04-16 14:55:15.311183673 +0000 UTC m=+166.229544878" watchObservedRunningTime="2026-04-16 14:55:15.312394045 +0000 UTC m=+166.230755253"
Apr 16 14:55:17.631867 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.631824 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-v7zjv"]
Apr 16 14:55:17.647845 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.647810 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-v7zjv"]
Apr 16 14:55:17.647987 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.647978 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.650523 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.650497 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-84fzd\""
Apr 16 14:55:17.650666 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.650495 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 14:55:17.650666 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.650637 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:55:17.650666 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.650495 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 14:55:17.756757 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.756721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/55a6969b-5147-482a-879e-2d7c3ed30812-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-v7zjv\" (UID: \"55a6969b-5147-482a-879e-2d7c3ed30812\") " pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.756953 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.756814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55a6969b-5147-482a-879e-2d7c3ed30812-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-v7zjv\" (UID: \"55a6969b-5147-482a-879e-2d7c3ed30812\") " pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.756953 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.756863 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55a6969b-5147-482a-879e-2d7c3ed30812-metrics-client-ca\") pod \"prometheus-operator-78f957474d-v7zjv\" (UID: \"55a6969b-5147-482a-879e-2d7c3ed30812\") " pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.756953 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.756899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47n94\" (UniqueName: \"kubernetes.io/projected/55a6969b-5147-482a-879e-2d7c3ed30812-kube-api-access-47n94\") pod \"prometheus-operator-78f957474d-v7zjv\" (UID: \"55a6969b-5147-482a-879e-2d7c3ed30812\") " pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.801716 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.801695 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:55:17.857505 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.857479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55a6969b-5147-482a-879e-2d7c3ed30812-metrics-client-ca\") pod \"prometheus-operator-78f957474d-v7zjv\" (UID: \"55a6969b-5147-482a-879e-2d7c3ed30812\") " pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.857614 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.857527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47n94\" (UniqueName: \"kubernetes.io/projected/55a6969b-5147-482a-879e-2d7c3ed30812-kube-api-access-47n94\") pod \"prometheus-operator-78f957474d-v7zjv\" (UID: \"55a6969b-5147-482a-879e-2d7c3ed30812\") " pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.857614 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.857574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/55a6969b-5147-482a-879e-2d7c3ed30812-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-v7zjv\" (UID: \"55a6969b-5147-482a-879e-2d7c3ed30812\") " pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.857731 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.857652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55a6969b-5147-482a-879e-2d7c3ed30812-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-v7zjv\" (UID: \"55a6969b-5147-482a-879e-2d7c3ed30812\") " pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.859112 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.859081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55a6969b-5147-482a-879e-2d7c3ed30812-metrics-client-ca\") pod \"prometheus-operator-78f957474d-v7zjv\" (UID: \"55a6969b-5147-482a-879e-2d7c3ed30812\") " pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.860439 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.860413 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55a6969b-5147-482a-879e-2d7c3ed30812-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-v7zjv\" (UID: \"55a6969b-5147-482a-879e-2d7c3ed30812\") " pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.861137 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.861118 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/55a6969b-5147-482a-879e-2d7c3ed30812-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-v7zjv\" (UID: \"55a6969b-5147-482a-879e-2d7c3ed30812\") " pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.864812 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.864792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47n94\" (UniqueName: \"kubernetes.io/projected/55a6969b-5147-482a-879e-2d7c3ed30812-kube-api-access-47n94\") pod \"prometheus-operator-78f957474d-v7zjv\" (UID: \"55a6969b-5147-482a-879e-2d7c3ed30812\") " pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:17.958902 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:17.958868 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv"
Apr 16 14:55:18.088671 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:18.088630 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-v7zjv"]
Apr 16 14:55:18.092105 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:55:18.092072 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55a6969b_5147_482a_879e_2d7c3ed30812.slice/crio-686e32ad11f4902488cc83b576e5eed8ed8dc4ebed79b16f66ff4fcbab017026 WatchSource:0}: Error finding container 686e32ad11f4902488cc83b576e5eed8ed8dc4ebed79b16f66ff4fcbab017026: Status 404 returned error can't find the container with id 686e32ad11f4902488cc83b576e5eed8ed8dc4ebed79b16f66ff4fcbab017026
Apr 16 14:55:18.305832 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:18.305740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv" event={"ID":"55a6969b-5147-482a-879e-2d7c3ed30812","Type":"ContainerStarted","Data":"686e32ad11f4902488cc83b576e5eed8ed8dc4ebed79b16f66ff4fcbab017026"}
Apr 16 14:55:18.307342 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:18.307312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69554f6858-9zslf" event={"ID":"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15","Type":"ContainerStarted","Data":"f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258"}
Apr 16 14:55:18.321740 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:18.321694 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69554f6858-9zslf" podStartSLOduration=2.673916383 podStartE2EDuration="6.321681236s" podCreationTimestamp="2026-04-16 14:55:12 +0000 UTC" firstStartedPulling="2026-04-16 14:55:14.140915313 +0000 UTC m=+165.059276503" lastFinishedPulling="2026-04-16 14:55:17.788680161 +0000 UTC m=+168.707041356" observedRunningTime="2026-04-16 14:55:18.32138529 +0000 UTC m=+169.239746499" watchObservedRunningTime="2026-04-16 14:55:18.321681236 +0000 UTC m=+169.240042440"
Apr 16 14:55:19.313724 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:19.313666 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv" event={"ID":"55a6969b-5147-482a-879e-2d7c3ed30812","Type":"ContainerStarted","Data":"3e15a2cec711615a60d0ad97199f0f3fa43b779888f1777f871b1ce68c3bdbb8"}
Apr 16 14:55:20.318836 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:20.318797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv" event={"ID":"55a6969b-5147-482a-879e-2d7c3ed30812","Type":"ContainerStarted","Data":"7b83f8a29b344681d7ecad0f13351f9ce350f5252e4227e2639e69f3dcd8c3ed"}
Apr 16 14:55:20.335686 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:20.335635 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-v7zjv" podStartSLOduration=2.230142565 podStartE2EDuration="3.335616966s" podCreationTimestamp="2026-04-16 14:55:17 +0000 UTC" firstStartedPulling="2026-04-16 14:55:18.094337938 +0000 UTC m=+169.012699121" lastFinishedPulling="2026-04-16 14:55:19.199812324 +0000 UTC m=+170.118173522" observedRunningTime="2026-04-16 14:55:20.33444793 +0000 UTC m=+171.252809158" watchObservedRunningTime="2026-04-16 14:55:20.335616966 +0000 UTC m=+171.253978170"
Apr 16 14:55:21.162561 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.162525 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9d48cb95b-k6jns"]
Apr 16 14:55:21.166998 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.166973 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.172673 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.172647 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9d48cb95b-k6jns"]
Apr 16 14:55:21.175201 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.175174 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 14:55:21.287814 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.287782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-oauth-serving-cert\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.287999 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.287824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-oauth-config\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.287999 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.287904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65gvn\" (UniqueName: \"kubernetes.io/projected/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-kube-api-access-65gvn\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.287999 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.287944 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-trusted-ca-bundle\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.287999 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.287974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-service-ca\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.288209 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.288044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-config\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.288209 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.288084 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-serving-cert\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.389050 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.389018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-oauth-serving-cert\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.389528 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.389071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-oauth-config\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.389528 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.389112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65gvn\" (UniqueName: \"kubernetes.io/projected/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-kube-api-access-65gvn\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.389528 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.389140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-trusted-ca-bundle\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.389528 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.389189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-service-ca\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.389528 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.389243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-config\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.389528 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.389276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-serving-cert\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.389907 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.389826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-oauth-serving-cert\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.390085 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.390019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-service-ca\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.390164 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.390146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-trusted-ca-bundle\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.390433 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.390410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-config\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.391849 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.391827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-serving-cert\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.391942 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.391913 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-oauth-config\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.396713 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.396689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65gvn\" (UniqueName: \"kubernetes.io/projected/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-kube-api-access-65gvn\") pod \"console-9d48cb95b-k6jns\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.478954 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.478856 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:21.985836 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.985801 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-p2rsl"]
Apr 16 14:55:21.990308 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.990287 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:21.992298 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.992274 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:55:21.992486 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.992465 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-p6x8s\""
Apr 16 14:55:21.992575 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.992518 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:55:21.993083 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:21.993062 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:55:22.094996 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.094961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-tls\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.095167 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.095014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-wtmp\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.095167 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.095087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-accelerators-collector-config\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.095167 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.095110 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-textfile\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.095327 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.095202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-sys\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.095327 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.095238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.095327 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.095322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-root\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.095481 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.095352 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-metrics-client-ca\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.095481 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.095417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2nh2\" (UniqueName: \"kubernetes.io/projected/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-kube-api-access-f2nh2\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196037 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-accelerators-collector-config\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196211 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-textfile\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196211 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-sys\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196211 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196106 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196211 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196152 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-root\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196211 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-metrics-client-ca\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196566 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2nh2\" (UniqueName: \"kubernetes.io/projected/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-kube-api-access-f2nh2\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196566 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-tls\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196566 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196312 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-sys\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196566 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-wtmp\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196566 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196373 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-root\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196860 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-textfile\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196860 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-wtmp\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.196975 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.196920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-accelerators-collector-config\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.197127 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.197101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-metrics-client-ca\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.199334 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.199313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.199422 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.199313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-node-exporter-tls\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.203233 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.203214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2nh2\" (UniqueName: \"kubernetes.io/projected/14b82f4f-8a65-45b3-a4df-1eb8eecf50f5-kube-api-access-f2nh2\") pod \"node-exporter-p2rsl\" (UID: \"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5\") " pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.302156 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.302085 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-p2rsl"
Apr 16 14:55:22.899453 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.899413 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69554f6858-9zslf"
Apr 16 14:55:22.899453 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.899461 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69554f6858-9zslf"
Apr 16 14:55:22.900910 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.900882 2576 patch_prober.go:28] interesting pod/console-69554f6858-9zslf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused" start-of-body=
Apr 16 14:55:22.901042 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:22.900936 2576 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-69554f6858-9zslf" podUID="fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" containerName="console" probeResult="failure" output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused"
Apr 16 14:55:24.048567 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.048532 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q"]
Apr 16 14:55:24.053605 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.053579 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q"
Apr 16 14:55:24.055854 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.055624 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 14:55:24.055854 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.055668 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 14:55:24.055854 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.055678 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-5kg420scbqqjb\""
Apr 16 14:55:24.055854 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.055627 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 14:55:24.055854 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.055623 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 14:55:24.055854 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.055624 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 14:55:24.056238 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.056030 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-4n2gd\""
Apr 16 14:55:24.062461 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.062441 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q"]
Apr 16 14:55:24.114022 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.113990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q"
Apr 16 14:55:24.114177 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.114038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-tls\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q"
Apr 16 14:55:24.114177 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.114145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-grpc-tls\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q"
Apr 16 14:55:24.114289 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.114184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kgg9\" (UniqueName: \"kubernetes.io/projected/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-kube-api-access-8kgg9\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q"
Apr 16 14:55:24.114289 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.114214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName:
\"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.114289 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.114234 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.114398 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.114341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.114398 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.114365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-metrics-client-ca\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.215399 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.215363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-grpc-tls\") pod 
\"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.215577 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.215413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kgg9\" (UniqueName: \"kubernetes.io/projected/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-kube-api-access-8kgg9\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.215577 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.215452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.215577 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.215484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.215577 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.215542 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " 
pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.215821 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.215613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-metrics-client-ca\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.215821 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.215674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.215821 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.215717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-tls\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.216628 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.216557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-metrics-client-ca\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.218534 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.218508 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.218661 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.218540 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.219023 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.218989 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-grpc-tls\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.219195 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.219173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.219378 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.219356 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-kube-rbac-proxy-rules\") pod 
\"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.219527 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.219505 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-secret-thanos-querier-tls\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.222895 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.222877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kgg9\" (UniqueName: \"kubernetes.io/projected/ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5-kube-api-access-8kgg9\") pod \"thanos-querier-7dc9c4d76b-8r88q\" (UID: \"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5\") " pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:24.366422 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:24.366318 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" Apr 16 14:55:26.304862 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.304831 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wwb2b" Apr 16 14:55:26.790013 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:55:26.789975 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14b82f4f_8a65_45b3_a4df_1eb8eecf50f5.slice/crio-82a29cea20e34b16f4c1b8e735a031fd58cef95efb0f7039ab81acbebce86e2c WatchSource:0}: Error finding container 82a29cea20e34b16f4c1b8e735a031fd58cef95efb0f7039ab81acbebce86e2c: Status 404 returned error can't find the container with id 82a29cea20e34b16f4c1b8e735a031fd58cef95efb0f7039ab81acbebce86e2c Apr 16 14:55:26.804105 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.804078 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69554f6858-9zslf"] Apr 16 14:55:26.829110 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.829072 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7944b76448-z86fg"] Apr 16 14:55:26.832848 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.832749 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:26.839425 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.839380 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7944b76448-z86fg"] Apr 16 14:55:26.922577 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.922540 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9d48cb95b-k6jns"] Apr 16 14:55:26.942440 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.942409 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/658a7193-1208-487d-ad80-d76f26924e43-console-oauth-config\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:26.942569 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.942473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-console-config\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:26.942675 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.942626 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-oauth-serving-cert\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:26.942675 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.942667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/658a7193-1208-487d-ad80-d76f26924e43-console-serving-cert\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:26.942850 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.942825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfpqn\" (UniqueName: \"kubernetes.io/projected/658a7193-1208-487d-ad80-d76f26924e43-kube-api-access-sfpqn\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:26.942932 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.942877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-service-ca\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:26.942932 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.942902 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-trusted-ca-bundle\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:26.943041 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:26.943021 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q"] Apr 16 14:55:26.947557 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:55:26.947528 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce46f14a_8173_42ee_8a1b_3ee7d2abf2f5.slice/crio-d8230725a4bdacfde717cdd62db944f0964c9b16d64ea69c29faf5db9625c62b WatchSource:0}: Error finding container d8230725a4bdacfde717cdd62db944f0964c9b16d64ea69c29faf5db9625c62b: Status 404 returned error can't find the container with id d8230725a4bdacfde717cdd62db944f0964c9b16d64ea69c29faf5db9625c62b Apr 16 14:55:27.044260 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.044156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/658a7193-1208-487d-ad80-d76f26924e43-console-serving-cert\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.044260 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.044214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfpqn\" (UniqueName: \"kubernetes.io/projected/658a7193-1208-487d-ad80-d76f26924e43-kube-api-access-sfpqn\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.044478 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.044379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-service-ca\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.044478 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.044427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-trusted-ca-bundle\") pod \"console-7944b76448-z86fg\" (UID: 
\"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.044593 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.044495 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/658a7193-1208-487d-ad80-d76f26924e43-console-oauth-config\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.044593 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.044546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-console-config\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.044703 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.044621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-oauth-serving-cert\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.045192 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.045156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-service-ca\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.045376 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.045329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-oauth-serving-cert\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.045376 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.045354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-console-config\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.045559 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.045509 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-trusted-ca-bundle\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.046888 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.046868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/658a7193-1208-487d-ad80-d76f26924e43-console-serving-cert\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.047097 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.047080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/658a7193-1208-487d-ad80-d76f26924e43-console-oauth-config\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.050780 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.050745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sfpqn\" (UniqueName: \"kubernetes.io/projected/658a7193-1208-487d-ad80-d76f26924e43-kube-api-access-sfpqn\") pod \"console-7944b76448-z86fg\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.153180 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.153144 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:27.165085 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.164460 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-788bffd54-zmqgb"] Apr 16 14:55:27.170435 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.169978 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" Apr 16 14:55:27.172228 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.172200 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 14:55:27.172808 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.172369 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 14:55:27.172808 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.172531 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 14:55:27.172808 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.172570 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-btff2\"" Apr 16 14:55:27.173033 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.172871 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 
14:55:27.173314 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.173180 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 14:55:27.179449 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.178238 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-788bffd54-zmqgb"] Apr 16 14:55:27.179449 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.178394 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 14:55:27.246102 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.246067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63e43d42-aaea-4a5d-a411-ab6d342b83a4-serving-certs-ca-bundle\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" Apr 16 14:55:27.246271 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.246158 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/63e43d42-aaea-4a5d-a411-ab6d342b83a4-telemeter-client-tls\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" Apr 16 14:55:27.246271 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.246196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/63e43d42-aaea-4a5d-a411-ab6d342b83a4-federate-client-tls\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " 
pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" Apr 16 14:55:27.246271 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.246226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63e43d42-aaea-4a5d-a411-ab6d342b83a4-metrics-client-ca\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" Apr 16 14:55:27.246429 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.246299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63e43d42-aaea-4a5d-a411-ab6d342b83a4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" Apr 16 14:55:27.246429 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.246327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg455\" (UniqueName: \"kubernetes.io/projected/63e43d42-aaea-4a5d-a411-ab6d342b83a4-kube-api-access-gg455\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" Apr 16 14:55:27.246429 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.246357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/63e43d42-aaea-4a5d-a411-ab6d342b83a4-secret-telemeter-client\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" Apr 16 14:55:27.246429 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.246393 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63e43d42-aaea-4a5d-a411-ab6d342b83a4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.311321 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.311261 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7944b76448-z86fg"]
Apr 16 14:55:27.314806 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:55:27.314725 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod658a7193_1208_487d_ad80_d76f26924e43.slice/crio-791ece84a8e55624eed55db5cc7932e73b630510b6368134a9f6ff1ab10ea6ff WatchSource:0}: Error finding container 791ece84a8e55624eed55db5cc7932e73b630510b6368134a9f6ff1ab10ea6ff: Status 404 returned error can't find the container with id 791ece84a8e55624eed55db5cc7932e73b630510b6368134a9f6ff1ab10ea6ff
Apr 16 14:55:27.344175 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.344139 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9d48cb95b-k6jns" event={"ID":"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0","Type":"ContainerStarted","Data":"39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4"}
Apr 16 14:55:27.344332 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.344188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9d48cb95b-k6jns" event={"ID":"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0","Type":"ContainerStarted","Data":"898551f0b57a26bba11b7c54aba4896ad9e9962a7eaa3caa401fa10665d66b6b"}
Apr 16 14:55:27.345939 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.345906 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-kpbxh" event={"ID":"3c7ac47a-551c-4399-8338-c3942554bedd","Type":"ContainerStarted","Data":"6643fd24ab71184deecf5c8f79edea3efc86c7995076e7537b7a7405047361e7"}
Apr 16 14:55:27.346069 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.346047 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-kpbxh"
Apr 16 14:55:27.347512 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.347011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/63e43d42-aaea-4a5d-a411-ab6d342b83a4-federate-client-tls\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.347512 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.347048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63e43d42-aaea-4a5d-a411-ab6d342b83a4-metrics-client-ca\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.347512 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.347096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63e43d42-aaea-4a5d-a411-ab6d342b83a4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.347512 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.347121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg455\" (UniqueName: \"kubernetes.io/projected/63e43d42-aaea-4a5d-a411-ab6d342b83a4-kube-api-access-gg455\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.347512 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.347149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/63e43d42-aaea-4a5d-a411-ab6d342b83a4-secret-telemeter-client\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.348138 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.347926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63e43d42-aaea-4a5d-a411-ab6d342b83a4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.348138 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.347996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63e43d42-aaea-4a5d-a411-ab6d342b83a4-serving-certs-ca-bundle\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.348246 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.348185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/63e43d42-aaea-4a5d-a411-ab6d342b83a4-telemeter-client-tls\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.348855 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.348799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p2rsl" event={"ID":"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5","Type":"ContainerStarted","Data":"82a29cea20e34b16f4c1b8e735a031fd58cef95efb0f7039ab81acbebce86e2c"}
Apr 16 14:55:27.349084 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.349059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63e43d42-aaea-4a5d-a411-ab6d342b83a4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.349174 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.349118 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63e43d42-aaea-4a5d-a411-ab6d342b83a4-metrics-client-ca\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.349806 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.349744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63e43d42-aaea-4a5d-a411-ab6d342b83a4-serving-certs-ca-bundle\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.350939 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.350897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/63e43d42-aaea-4a5d-a411-ab6d342b83a4-federate-client-tls\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.351179 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.351078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/63e43d42-aaea-4a5d-a411-ab6d342b83a4-secret-telemeter-client\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.351280 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.351220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63e43d42-aaea-4a5d-a411-ab6d342b83a4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.351381 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.351342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" event={"ID":"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5","Type":"ContainerStarted","Data":"d8230725a4bdacfde717cdd62db944f0964c9b16d64ea69c29faf5db9625c62b"}
Apr 16 14:55:27.351701 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.351621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/63e43d42-aaea-4a5d-a411-ab6d342b83a4-telemeter-client-tls\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.352801 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.352745 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7944b76448-z86fg" event={"ID":"658a7193-1208-487d-ad80-d76f26924e43","Type":"ContainerStarted","Data":"791ece84a8e55624eed55db5cc7932e73b630510b6368134a9f6ff1ab10ea6ff"}
Apr 16 14:55:27.354616 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.354580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg455\" (UniqueName: \"kubernetes.io/projected/63e43d42-aaea-4a5d-a411-ab6d342b83a4-kube-api-access-gg455\") pod \"telemeter-client-788bffd54-zmqgb\" (UID: \"63e43d42-aaea-4a5d-a411-ab6d342b83a4\") " pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.356754 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.356734 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-kpbxh"
Apr 16 14:55:27.359017 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.358906 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9d48cb95b-k6jns" podStartSLOduration=6.358891079 podStartE2EDuration="6.358891079s" podCreationTimestamp="2026-04-16 14:55:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:27.358515823 +0000 UTC m=+178.276877030" watchObservedRunningTime="2026-04-16 14:55:27.358891079 +0000 UTC m=+178.277252287"
Apr 16 14:55:27.375640 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.375580 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-kpbxh" podStartSLOduration=1.358096427 podStartE2EDuration="18.375559951s" podCreationTimestamp="2026-04-16 14:55:09 +0000 UTC" firstStartedPulling="2026-04-16 14:55:09.87868712 +0000 UTC m=+160.797048316" lastFinishedPulling="2026-04-16 14:55:26.89615064 +0000 UTC m=+177.814511840" observedRunningTime="2026-04-16 14:55:27.374892946 +0000 UTC m=+178.293254152" watchObservedRunningTime="2026-04-16 14:55:27.375559951 +0000 UTC m=+178.293921157"
Apr 16 14:55:27.487308 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.487250 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb"
Apr 16 14:55:27.672294 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:27.672061 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-788bffd54-zmqgb"]
Apr 16 14:55:27.676571 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:55:27.676505 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63e43d42_aaea_4a5d_a411_ab6d342b83a4.slice/crio-99119665687eae51082f337ef636ff454ff248b070cd6827a728b88cf4c4c652 WatchSource:0}: Error finding container 99119665687eae51082f337ef636ff454ff248b070cd6827a728b88cf4c4c652: Status 404 returned error can't find the container with id 99119665687eae51082f337ef636ff454ff248b070cd6827a728b88cf4c4c652
Apr 16 14:55:28.361541 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:28.360650 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7944b76448-z86fg" event={"ID":"658a7193-1208-487d-ad80-d76f26924e43","Type":"ContainerStarted","Data":"9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751"}
Apr 16 14:55:28.366199 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:28.365795 2576 generic.go:358] "Generic (PLEG): container finished" podID="14b82f4f-8a65-45b3-a4df-1eb8eecf50f5" containerID="50e65e9823d292034f752b65a7faca01aaeeef4769e135c1f5130b414ebe76f1" exitCode=0
Apr 16 14:55:28.366199 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:28.365886 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p2rsl" event={"ID":"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5","Type":"ContainerDied","Data":"50e65e9823d292034f752b65a7faca01aaeeef4769e135c1f5130b414ebe76f1"}
Apr 16 14:55:28.369385 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:28.369316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" event={"ID":"63e43d42-aaea-4a5d-a411-ab6d342b83a4","Type":"ContainerStarted","Data":"99119665687eae51082f337ef636ff454ff248b070cd6827a728b88cf4c4c652"}
Apr 16 14:55:28.378305 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:28.377424 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7944b76448-z86fg" podStartSLOduration=2.3774062320000002 podStartE2EDuration="2.377406232s" podCreationTimestamp="2026-04-16 14:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:28.376317824 +0000 UTC m=+179.294679032" watchObservedRunningTime="2026-04-16 14:55:28.377406232 +0000 UTC m=+179.295767439"
Apr 16 14:55:29.375151 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:29.375114 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p2rsl" event={"ID":"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5","Type":"ContainerStarted","Data":"fbd447667085f3cdb135f2910efab623f5f1c0491a00d39fea60ae255690c63d"}
Apr 16 14:55:29.375151 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:29.375153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p2rsl" event={"ID":"14b82f4f-8a65-45b3-a4df-1eb8eecf50f5","Type":"ContainerStarted","Data":"411664de754690f9a0174e026083e012d7c3eec0b6a40e49cacb7af62f724d88"}
Apr 16 14:55:29.394799 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:29.394655 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-p2rsl" podStartSLOduration=7.619259372 podStartE2EDuration="8.394634555s" podCreationTimestamp="2026-04-16 14:55:21 +0000 UTC" firstStartedPulling="2026-04-16 14:55:26.791922234 +0000 UTC m=+177.710283417" lastFinishedPulling="2026-04-16 14:55:27.567297401 +0000 UTC m=+178.485658600" observedRunningTime="2026-04-16 14:55:29.391858374 +0000 UTC m=+180.310219580" watchObservedRunningTime="2026-04-16 14:55:29.394634555 +0000 UTC m=+180.312995760"
Apr 16 14:55:31.387599 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:31.387554 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" event={"ID":"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5","Type":"ContainerStarted","Data":"7f607834af76dff6817e5e194cbfb8d685ac2fd0d0e522f178b7fc361fa25d14"}
Apr 16 14:55:31.388078 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:31.387611 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" event={"ID":"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5","Type":"ContainerStarted","Data":"1b00117045441b5bb387164b85441f2cf5db8b0e945a9b6b5785a5360c53e6df"}
Apr 16 14:55:31.388078 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:31.387627 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" event={"ID":"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5","Type":"ContainerStarted","Data":"056875b0476a1493dd41476f68adbe139ea0118f54d6e4c6fb33d7d0119625c1"}
Apr 16 14:55:31.389214 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:31.389187 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" event={"ID":"63e43d42-aaea-4a5d-a411-ab6d342b83a4","Type":"ContainerStarted","Data":"5a64b7c3063ce022a78e983fb63ce52a906801c1b96ad05595d2f735bc644ba4"}
Apr 16 14:55:31.479722 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:31.479682 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:31.479898 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:31.479740 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:31.485495 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:31.485464 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:32.397372 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:32.397336 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:55:33.398808 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:33.398751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" event={"ID":"63e43d42-aaea-4a5d-a411-ab6d342b83a4","Type":"ContainerStarted","Data":"3cd562883755c8269c5d12f1b20f5f0154173c30bc8f0d09e8148ff5f2714684"}
Apr 16 14:55:33.399305 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:33.399274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" event={"ID":"63e43d42-aaea-4a5d-a411-ab6d342b83a4","Type":"ContainerStarted","Data":"2e2ee1d074ffa7cf99c2d8e5f749c9e236fb65d56bb730c0229ea8edfc6cc5a4"}
Apr 16 14:55:33.402167 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:33.402135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" event={"ID":"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5","Type":"ContainerStarted","Data":"56c2f342bf0e479a96df30e2b46485c145b9f44cf69cae43f2b530effba32427"}
Apr 16 14:55:33.402295 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:33.402175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" event={"ID":"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5","Type":"ContainerStarted","Data":"5d9446b9704b3cf51b531ff334e43c40857a316d549c80420bdf49bf236acc53"}
Apr 16 14:55:33.402295 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:33.402188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" event={"ID":"ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5","Type":"ContainerStarted","Data":"46a0730ab0203f7b8ee8805f26ecc29bcef5eb84600b809ff3e2d68031b17c33"}
Apr 16 14:55:33.420178 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:33.420123 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-788bffd54-zmqgb" podStartSLOduration=1.649947877 podStartE2EDuration="6.420104538s" podCreationTimestamp="2026-04-16 14:55:27 +0000 UTC" firstStartedPulling="2026-04-16 14:55:27.679205581 +0000 UTC m=+178.597566770" lastFinishedPulling="2026-04-16 14:55:32.449362244 +0000 UTC m=+183.367723431" observedRunningTime="2026-04-16 14:55:33.417472893 +0000 UTC m=+184.335834110" watchObservedRunningTime="2026-04-16 14:55:33.420104538 +0000 UTC m=+184.338465744"
Apr 16 14:55:33.436896 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:33.436845 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q" podStartSLOduration=3.935176595 podStartE2EDuration="9.436827584s" podCreationTimestamp="2026-04-16 14:55:24 +0000 UTC" firstStartedPulling="2026-04-16 14:55:26.949457718 +0000 UTC m=+177.867818913" lastFinishedPulling="2026-04-16 14:55:32.451108705 +0000 UTC m=+183.369469902" observedRunningTime="2026-04-16 14:55:33.43532234 +0000 UTC m=+184.353683547" watchObservedRunningTime="2026-04-16 14:55:33.436827584 +0000 UTC m=+184.355188801"
Apr 16 14:55:34.189517 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.189468 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7944b76448-z86fg"]
Apr 16 14:55:34.222705 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.222670 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5956848ff4-k7qwb"]
Apr 16 14:55:34.260170 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.260137 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5956848ff4-k7qwb"]
Apr 16 14:55:34.260341 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.260270 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.407001 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.406964 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q"
Apr 16 14:55:34.420204 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.420168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-service-ca\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.420365 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.420226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pw57\" (UniqueName: \"kubernetes.io/projected/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-kube-api-access-4pw57\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.420365 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.420292 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-config\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.420479 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.420374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-serving-cert\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.420479 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.420434 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-oauth-serving-cert\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.420581 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.420531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-trusted-ca-bundle\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.420581 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.420562 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-oauth-config\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.521482 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.521394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-config\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.521482 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.521454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-serving-cert\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.521699 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.521597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-oauth-serving-cert\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.522861 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.522823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-trusted-ca-bundle\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.522996 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.522887 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-oauth-config\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.525145 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.523879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-service-ca\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.525145 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.524101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pw57\" (UniqueName: \"kubernetes.io/projected/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-kube-api-access-4pw57\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.525145 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.524130 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-config\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.525427 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.525404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-service-ca\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.526156 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.526136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-serving-cert\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.526860 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.526740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-oauth-serving-cert\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.530429 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.530405 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-oauth-config\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.530905 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.530883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-trusted-ca-bundle\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.532589 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.532530 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pw57\" (UniqueName: \"kubernetes.io/projected/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-kube-api-access-4pw57\") pod \"console-5956848ff4-k7qwb\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") " pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.572604 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.572572 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:34.705037 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:34.705006 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5956848ff4-k7qwb"]
Apr 16 14:55:34.708906 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:55:34.708876 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb7f47ec_ac6f_4b85_b20b_fb1d94ef5cec.slice/crio-996df97f7d74b2625d8fa823ee7005532f61096e5a11db12734d2c6c30e5287e WatchSource:0}: Error finding container 996df97f7d74b2625d8fa823ee7005532f61096e5a11db12734d2c6c30e5287e: Status 404 returned error can't find the container with id 996df97f7d74b2625d8fa823ee7005532f61096e5a11db12734d2c6c30e5287e
Apr 16 14:55:35.411569 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:35.411524 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5956848ff4-k7qwb" event={"ID":"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec","Type":"ContainerStarted","Data":"9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8"}
Apr 16 14:55:35.411569 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:35.411571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5956848ff4-k7qwb" event={"ID":"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec","Type":"ContainerStarted","Data":"996df97f7d74b2625d8fa823ee7005532f61096e5a11db12734d2c6c30e5287e"}
Apr 16 14:55:35.419453 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:35.419429 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7dc9c4d76b-8r88q"
Apr 16 14:55:35.431191 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:35.431146 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5956848ff4-k7qwb" podStartSLOduration=1.431131756 podStartE2EDuration="1.431131756s" podCreationTimestamp="2026-04-16 14:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:35.429605883 +0000 UTC m=+186.347967089" watchObservedRunningTime="2026-04-16 14:55:35.431131756 +0000 UTC m=+186.349492962"
Apr 16 14:55:37.153352 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:37.153315 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7944b76448-z86fg"
Apr 16 14:55:44.573393 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:44.573359 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:44.573895 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:44.573407 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:44.578203 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:44.578178 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:45.450484 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:45.450457 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:55:45.493574 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:45.493545 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9d48cb95b-k6jns"]
Apr 16 14:55:51.824991 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:51.824918 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69554f6858-9zslf" podUID="fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" containerName="console" containerID="cri-o://f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258" gracePeriod=15
Apr 16 14:55:52.071607 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.071586 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69554f6858-9zslf_fc435cf5-c1e8-403d-89ff-aeb4d6c30c15/console/0.log"
Apr 16 14:55:52.071718 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.071650 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69554f6858-9zslf"
Apr 16 14:55:52.179087 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.179049 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-oauth-serving-cert\") pod \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") "
Apr 16 14:55:52.179250 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.179100 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9tlg\" (UniqueName: \"kubernetes.io/projected/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-kube-api-access-k9tlg\") pod \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") "
Apr 16 14:55:52.179250 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.179134 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-serving-cert\") pod \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") "
Apr 16 14:55:52.179250 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.179185 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-service-ca\") pod \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") "
Apr 16 14:55:52.179250 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.179217 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-config\") pod \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") "
Apr 16 14:55:52.179439 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.179268 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-oauth-config\") pod \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\" (UID: \"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15\") "
Apr 16 14:55:52.179617 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.179589 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" (UID: "fc435cf5-c1e8-403d-89ff-aeb4d6c30c15"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:55:52.179745 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.179719 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-service-ca" (OuterVolumeSpecName: "service-ca") pod "fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" (UID: "fc435cf5-c1e8-403d-89ff-aeb4d6c30c15"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:55:52.179839 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.179724 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-config" (OuterVolumeSpecName: "console-config") pod "fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" (UID: "fc435cf5-c1e8-403d-89ff-aeb4d6c30c15"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:55:52.181448 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.181424 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" (UID: "fc435cf5-c1e8-403d-89ff-aeb4d6c30c15"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:55:52.181711 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.181692 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-kube-api-access-k9tlg" (OuterVolumeSpecName: "kube-api-access-k9tlg") pod "fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" (UID: "fc435cf5-c1e8-403d-89ff-aeb4d6c30c15"). InnerVolumeSpecName "kube-api-access-k9tlg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:55:52.181761 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.181735 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" (UID: "fc435cf5-c1e8-403d-89ff-aeb4d6c30c15"). InnerVolumeSpecName "console-serving-cert".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:52.280995 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.280943 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-oauth-serving-cert\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.280995 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.280990 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k9tlg\" (UniqueName: \"kubernetes.io/projected/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-kube-api-access-k9tlg\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.280995 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.281001 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-serving-cert\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.280995 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.281011 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-service-ca\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.281249 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.281021 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-config\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.281249 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.281030 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15-console-oauth-config\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.469877 ip-10-0-130-140 
kubenswrapper[2576]: I0416 14:55:52.469804 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69554f6858-9zslf_fc435cf5-c1e8-403d-89ff-aeb4d6c30c15/console/0.log" Apr 16 14:55:52.469877 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.469841 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" containerID="f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258" exitCode=2 Apr 16 14:55:52.470059 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.469877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69554f6858-9zslf" event={"ID":"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15","Type":"ContainerDied","Data":"f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258"} Apr 16 14:55:52.470059 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.469912 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69554f6858-9zslf" Apr 16 14:55:52.470059 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.469925 2576 scope.go:117] "RemoveContainer" containerID="f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258" Apr 16 14:55:52.470059 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.469915 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69554f6858-9zslf" event={"ID":"fc435cf5-c1e8-403d-89ff-aeb4d6c30c15","Type":"ContainerDied","Data":"d3005e868ddab8eb8eaf0525d6710f727e255f917d6cafba28094ab174cb4d10"} Apr 16 14:55:52.480363 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.478951 2576 scope.go:117] "RemoveContainer" containerID="f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258" Apr 16 14:55:52.480737 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:55:52.480713 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258\": container with ID starting with f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258 not found: ID does not exist" containerID="f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258" Apr 16 14:55:52.480873 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.480747 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258"} err="failed to get container status \"f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258\": rpc error: code = NotFound desc = could not find container \"f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258\": container with ID starting with f73d769666311a4f2cecb3dd5ca54dcd8710debdc7b930ec79d228f62b8ee258 not found: ID does not exist" Apr 16 14:55:52.489249 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.489216 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69554f6858-9zslf"] Apr 16 14:55:52.492363 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:52.492343 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69554f6858-9zslf"] Apr 16 14:55:53.805702 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:53.805669 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" path="/var/lib/kubelet/pods/fc435cf5-c1e8-403d-89ff-aeb4d6c30c15/volumes" Apr 16 14:55:58.489540 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:58.489503 2576 generic.go:358] "Generic (PLEG): container finished" podID="a682a0c3-02b2-44f9-b8bb-40ab354c9809" containerID="f74f0d638ae6da6143fe0f39a3a7b7aacdc28ea860bf74cf9e3b4016c3107c0b" exitCode=0 Apr 16 14:55:58.489953 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:58.489579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" event={"ID":"a682a0c3-02b2-44f9-b8bb-40ab354c9809","Type":"ContainerDied","Data":"f74f0d638ae6da6143fe0f39a3a7b7aacdc28ea860bf74cf9e3b4016c3107c0b"} Apr 16 14:55:58.489953 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:58.489911 2576 scope.go:117] "RemoveContainer" containerID="f74f0d638ae6da6143fe0f39a3a7b7aacdc28ea860bf74cf9e3b4016c3107c0b" Apr 16 14:55:59.213457 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.213397 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7944b76448-z86fg" podUID="658a7193-1208-487d-ad80-d76f26924e43" containerName="console" containerID="cri-o://9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751" gracePeriod=15 Apr 16 14:55:59.468411 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.468356 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7944b76448-z86fg_658a7193-1208-487d-ad80-d76f26924e43/console/0.log" Apr 16 14:55:59.468515 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.468420 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:59.495085 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.495053 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-24n87" event={"ID":"a682a0c3-02b2-44f9-b8bb-40ab354c9809","Type":"ContainerStarted","Data":"73505710df7b450122a0381ac745e339c4d249931b45ae02c28d26ba87a592ec"} Apr 16 14:55:59.496295 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.496275 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7944b76448-z86fg_658a7193-1208-487d-ad80-d76f26924e43/console/0.log" Apr 16 14:55:59.496422 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.496316 2576 generic.go:358] "Generic (PLEG): container finished" podID="658a7193-1208-487d-ad80-d76f26924e43" containerID="9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751" exitCode=2 Apr 16 14:55:59.496422 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.496361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7944b76448-z86fg" event={"ID":"658a7193-1208-487d-ad80-d76f26924e43","Type":"ContainerDied","Data":"9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751"} Apr 16 14:55:59.496422 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.496377 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7944b76448-z86fg" Apr 16 14:55:59.496422 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.496390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7944b76448-z86fg" event={"ID":"658a7193-1208-487d-ad80-d76f26924e43","Type":"ContainerDied","Data":"791ece84a8e55624eed55db5cc7932e73b630510b6368134a9f6ff1ab10ea6ff"} Apr 16 14:55:59.496422 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.496410 2576 scope.go:117] "RemoveContainer" containerID="9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751" Apr 16 14:55:59.504998 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.504979 2576 scope.go:117] "RemoveContainer" containerID="9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751" Apr 16 14:55:59.505283 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:55:59.505261 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751\": container with ID starting with 9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751 not found: ID does not exist" containerID="9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751" Apr 16 14:55:59.505347 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.505293 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751"} err="failed to get container status \"9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751\": rpc error: code = NotFound desc = could not find container \"9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751\": container with ID starting with 9937c097c77b7cae34a57b6f01156a233d874b624fbd150cb922dbe6c4b93751 not found: ID does not exist" Apr 16 14:55:59.540338 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.540309 2576 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-oauth-serving-cert\") pod \"658a7193-1208-487d-ad80-d76f26924e43\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " Apr 16 14:55:59.540496 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.540347 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/658a7193-1208-487d-ad80-d76f26924e43-console-oauth-config\") pod \"658a7193-1208-487d-ad80-d76f26924e43\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " Apr 16 14:55:59.540496 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.540373 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/658a7193-1208-487d-ad80-d76f26924e43-console-serving-cert\") pod \"658a7193-1208-487d-ad80-d76f26924e43\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " Apr 16 14:55:59.540609 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.540503 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-console-config\") pod \"658a7193-1208-487d-ad80-d76f26924e43\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " Apr 16 14:55:59.540609 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.540546 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfpqn\" (UniqueName: \"kubernetes.io/projected/658a7193-1208-487d-ad80-d76f26924e43-kube-api-access-sfpqn\") pod \"658a7193-1208-487d-ad80-d76f26924e43\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " Apr 16 14:55:59.540609 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.540595 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-service-ca\") pod \"658a7193-1208-487d-ad80-d76f26924e43\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " Apr 16 14:55:59.540749 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.540650 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-trusted-ca-bundle\") pod \"658a7193-1208-487d-ad80-d76f26924e43\" (UID: \"658a7193-1208-487d-ad80-d76f26924e43\") " Apr 16 14:55:59.541331 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.540817 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "658a7193-1208-487d-ad80-d76f26924e43" (UID: "658a7193-1208-487d-ad80-d76f26924e43"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:59.541331 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.541037 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-oauth-serving-cert\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:59.541331 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.541153 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "658a7193-1208-487d-ad80-d76f26924e43" (UID: "658a7193-1208-487d-ad80-d76f26924e43"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:59.541331 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.541163 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-service-ca" (OuterVolumeSpecName: "service-ca") pod "658a7193-1208-487d-ad80-d76f26924e43" (UID: "658a7193-1208-487d-ad80-d76f26924e43"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:59.541331 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.541262 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-console-config" (OuterVolumeSpecName: "console-config") pod "658a7193-1208-487d-ad80-d76f26924e43" (UID: "658a7193-1208-487d-ad80-d76f26924e43"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:59.542996 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.542967 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658a7193-1208-487d-ad80-d76f26924e43-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "658a7193-1208-487d-ad80-d76f26924e43" (UID: "658a7193-1208-487d-ad80-d76f26924e43"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:59.543100 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.543021 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658a7193-1208-487d-ad80-d76f26924e43-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "658a7193-1208-487d-ad80-d76f26924e43" (UID: "658a7193-1208-487d-ad80-d76f26924e43"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:59.543371 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.543345 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/658a7193-1208-487d-ad80-d76f26924e43-kube-api-access-sfpqn" (OuterVolumeSpecName: "kube-api-access-sfpqn") pod "658a7193-1208-487d-ad80-d76f26924e43" (UID: "658a7193-1208-487d-ad80-d76f26924e43"). InnerVolumeSpecName "kube-api-access-sfpqn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:59.641718 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.641686 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-console-config\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:59.641718 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.641715 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sfpqn\" (UniqueName: \"kubernetes.io/projected/658a7193-1208-487d-ad80-d76f26924e43-kube-api-access-sfpqn\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:59.641718 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.641726 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-service-ca\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:59.642016 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.641736 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/658a7193-1208-487d-ad80-d76f26924e43-trusted-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:59.642016 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.641744 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/658a7193-1208-487d-ad80-d76f26924e43-console-oauth-config\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:59.642016 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.641753 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/658a7193-1208-487d-ad80-d76f26924e43-console-serving-cert\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 14:55:59.816038 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.816011 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7944b76448-z86fg"] Apr 16 14:55:59.819352 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:55:59.819332 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7944b76448-z86fg"] Apr 16 14:56:01.805733 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:01.805698 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="658a7193-1208-487d-ad80-d76f26924e43" path="/var/lib/kubelet/pods/658a7193-1208-487d-ad80-d76f26924e43/volumes" Apr 16 14:56:06.526641 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:06.526608 2576 generic.go:358] "Generic (PLEG): container finished" podID="2201f8ec-763d-4bde-9b0c-b412c0a2c025" containerID="7d5bdae6bf3c616078164ec9ed0275da0369966b9695b6d62b101bee00619d4a" exitCode=0 Apr 16 14:56:06.527028 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:06.526682 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" event={"ID":"2201f8ec-763d-4bde-9b0c-b412c0a2c025","Type":"ContainerDied","Data":"7d5bdae6bf3c616078164ec9ed0275da0369966b9695b6d62b101bee00619d4a"} Apr 16 14:56:06.527068 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:06.527027 2576 scope.go:117] "RemoveContainer" containerID="7d5bdae6bf3c616078164ec9ed0275da0369966b9695b6d62b101bee00619d4a" Apr 16 14:56:07.374489 ip-10-0-130-140 kubenswrapper[2576]: I0416 
14:56:07.374461 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wwb2b_7ede1baa-a6e7-4b5e-8723-94a6c70847e3/dns/0.log" Apr 16 14:56:07.381571 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:07.381535 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wwb2b_7ede1baa-a6e7-4b5e-8723-94a6c70847e3/kube-rbac-proxy/0.log" Apr 16 14:56:07.531576 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:07.531537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-g8xfd" event={"ID":"2201f8ec-763d-4bde-9b0c-b412c0a2c025","Type":"ContainerStarted","Data":"1f639db1a1d2081ec660e58aa8d47204135aee3edf9c85205c6c5b32e134bf52"} Apr 16 14:56:07.932270 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:07.932246 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s9crx_3cebcaa8-957d-4f1e-b4f8-90637dae2bc0/dns-node-resolver/0.log" Apr 16 14:56:10.512947 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.512881 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9d48cb95b-k6jns" podUID="43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" containerName="console" containerID="cri-o://39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4" gracePeriod=15 Apr 16 14:56:10.761400 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.761377 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9d48cb95b-k6jns_43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0/console/0.log" Apr 16 14:56:10.761502 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.761434 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9d48cb95b-k6jns" Apr 16 14:56:10.838039 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.837966 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-service-ca\") pod \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " Apr 16 14:56:10.838039 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.838007 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-trusted-ca-bundle\") pod \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " Apr 16 14:56:10.838262 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.838049 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-oauth-config\") pod \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " Apr 16 14:56:10.838262 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.838072 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65gvn\" (UniqueName: \"kubernetes.io/projected/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-kube-api-access-65gvn\") pod \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " Apr 16 14:56:10.838262 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.838094 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-config\") pod \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " Apr 16 14:56:10.838262 ip-10-0-130-140 
kubenswrapper[2576]: I0416 14:56:10.838121 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-serving-cert\") pod \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " Apr 16 14:56:10.838262 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.838164 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-oauth-serving-cert\") pod \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\" (UID: \"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0\") " Apr 16 14:56:10.838505 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.838338 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-service-ca" (OuterVolumeSpecName: "service-ca") pod "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" (UID: "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:10.838592 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.838564 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-config" (OuterVolumeSpecName: "console-config") pod "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" (UID: "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:56:10.838689 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.838645 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" (UID: "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:56:10.838689 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.838683 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" (UID: "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:56:10.840302 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.840275 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-kube-api-access-65gvn" (OuterVolumeSpecName: "kube-api-access-65gvn") pod "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" (UID: "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0"). InnerVolumeSpecName "kube-api-access-65gvn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:56:10.840382 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.840300 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" (UID: "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:56:10.840382 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.840334 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" (UID: "43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:56:10.939641 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.939613 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-serving-cert\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:56:10.939641 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.939636 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-oauth-serving-cert\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:56:10.939824 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.939657 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-service-ca\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:56:10.939824 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.939666 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-trusted-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:56:10.939824 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.939675 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-oauth-config\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:56:10.939824 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.939683 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65gvn\" (UniqueName: \"kubernetes.io/projected/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-kube-api-access-65gvn\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:56:10.939824 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:10.939692 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0-console-config\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:56:11.542906 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:11.542878 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9d48cb95b-k6jns_43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0/console/0.log"
Apr 16 14:56:11.543375 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:11.542919 2576 generic.go:358] "Generic (PLEG): container finished" podID="43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" containerID="39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4" exitCode=2
Apr 16 14:56:11.543375 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:11.543008 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9d48cb95b-k6jns"
Apr 16 14:56:11.543375 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:11.543009 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9d48cb95b-k6jns" event={"ID":"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0","Type":"ContainerDied","Data":"39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4"}
Apr 16 14:56:11.543375 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:11.543045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9d48cb95b-k6jns" event={"ID":"43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0","Type":"ContainerDied","Data":"898551f0b57a26bba11b7c54aba4896ad9e9962a7eaa3caa401fa10665d66b6b"}
Apr 16 14:56:11.543375 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:11.543060 2576 scope.go:117] "RemoveContainer" containerID="39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4"
Apr 16 14:56:11.551612 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:11.551598 2576 scope.go:117] "RemoveContainer" containerID="39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4"
Apr 16 14:56:11.551877 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:56:11.551857 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4\": container with ID starting with 39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4 not found: ID does not exist" containerID="39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4"
Apr 16 14:56:11.551949 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:11.551890 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4"} err="failed to get container status \"39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4\": rpc error: code = NotFound desc = could not find container \"39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4\": container with ID starting with 39e9102176f22bc914526813371fad6aa538f6f297f7faa5a929adba0181ccb4 not found: ID does not exist"
Apr 16 14:56:11.561799 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:11.561778 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9d48cb95b-k6jns"]
Apr 16 14:56:11.564550 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:11.564531 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9d48cb95b-k6jns"]
Apr 16 14:56:11.805464 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:11.805395 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" path="/var/lib/kubelet/pods/43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0/volumes"
Apr 16 14:56:19.574148 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:19.574117 2576 generic.go:358] "Generic (PLEG): container finished" podID="ff4ddc68-4c66-4144-b764-4cfde96015d7" containerID="b31434716d9645094c1366332de9387cb36b3cc341c3314535cc567a0b34d0ff" exitCode=0
Apr 16 14:56:19.574578 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:19.574189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" event={"ID":"ff4ddc68-4c66-4144-b764-4cfde96015d7","Type":"ContainerDied","Data":"b31434716d9645094c1366332de9387cb36b3cc341c3314535cc567a0b34d0ff"}
Apr 16 14:56:19.574578 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:19.574523 2576 scope.go:117] "RemoveContainer" containerID="b31434716d9645094c1366332de9387cb36b3cc341c3314535cc567a0b34d0ff"
Apr 16 14:56:20.579641 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:20.579607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-x98m6" event={"ID":"ff4ddc68-4c66-4144-b764-4cfde96015d7","Type":"ContainerStarted","Data":"4530b5e38eadd5c7f6f137ca84465f9c66e4d3c83acebd3294c0d28ea2176a0a"}
Apr 16 14:56:41.597864 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:41.597811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:56:41.600220 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:41.600192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc2d28ab-f651-462e-ae85-98e9780905b0-metrics-certs\") pod \"network-metrics-daemon-7wx6z\" (UID: \"fc2d28ab-f651-462e-ae85-98e9780905b0\") " pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:56:41.803808 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:41.803782 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fb76v\""
Apr 16 14:56:41.813187 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:41.813167 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wx6z"
Apr 16 14:56:41.929014 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:41.928984 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7wx6z"]
Apr 16 14:56:41.931745 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:56:41.931718 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc2d28ab_f651_462e_ae85_98e9780905b0.slice/crio-22623682a5910665103128b6147dd9d9efc96651710854d2bfcdf3475249449b WatchSource:0}: Error finding container 22623682a5910665103128b6147dd9d9efc96651710854d2bfcdf3475249449b: Status 404 returned error can't find the container with id 22623682a5910665103128b6147dd9d9efc96651710854d2bfcdf3475249449b
Apr 16 14:56:42.641910 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:42.641875 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7wx6z" event={"ID":"fc2d28ab-f651-462e-ae85-98e9780905b0","Type":"ContainerStarted","Data":"22623682a5910665103128b6147dd9d9efc96651710854d2bfcdf3475249449b"}
Apr 16 14:56:43.646844 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:43.646801 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7wx6z" event={"ID":"fc2d28ab-f651-462e-ae85-98e9780905b0","Type":"ContainerStarted","Data":"7b43238dbf7cb791da9e743983c5d1d3ded1143f4567f6ec0c97aeef40efcf58"}
Apr 16 14:56:43.646844 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:43.646849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7wx6z" event={"ID":"fc2d28ab-f651-462e-ae85-98e9780905b0","Type":"ContainerStarted","Data":"98c56228c5297369cbb0a8f5e40848b05471e2e4eec3756b3d18cfbf62e2e904"}
Apr 16 14:56:43.660859 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:43.660813 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7wx6z" podStartSLOduration=253.720979687 podStartE2EDuration="4m14.66080068s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:56:41.933494086 +0000 UTC m=+252.851855272" lastFinishedPulling="2026-04-16 14:56:42.873315078 +0000 UTC m=+253.791676265" observedRunningTime="2026-04-16 14:56:43.659809667 +0000 UTC m=+254.578170872" watchObservedRunningTime="2026-04-16 14:56:43.66080068 +0000 UTC m=+254.579161885"
Apr 16 14:56:45.277802 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.277712 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cf9bf6966-28q8s"]
Apr 16 14:56:45.278159 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.278023 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" containerName="console"
Apr 16 14:56:45.278159 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.278035 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" containerName="console"
Apr 16 14:56:45.278159 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.278051 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="658a7193-1208-487d-ad80-d76f26924e43" containerName="console"
Apr 16 14:56:45.278159 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.278057 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="658a7193-1208-487d-ad80-d76f26924e43" containerName="console"
Apr 16 14:56:45.278159 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.278070 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" containerName="console"
Apr 16 14:56:45.278159 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.278075 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" containerName="console"
Apr 16 14:56:45.278159 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.278130 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc435cf5-c1e8-403d-89ff-aeb4d6c30c15" containerName="console"
Apr 16 14:56:45.278159 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.278139 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="658a7193-1208-487d-ad80-d76f26924e43" containerName="console"
Apr 16 14:56:45.278159 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.278147 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="43fc7fcc-0fc7-4fa2-825c-0a21cf1f6dd0" containerName="console"
Apr 16 14:56:45.280924 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.280906 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.289756 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.289730 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cf9bf6966-28q8s"]
Apr 16 14:56:45.326972 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.326946 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-config\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.327125 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.326977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-service-ca\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.327125 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.326993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-trusted-ca-bundle\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.327125 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.327011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n6gb\" (UniqueName: \"kubernetes.io/projected/2d27874c-0312-4e2d-a3a7-81e2c23585bb-kube-api-access-4n6gb\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.327125 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.327100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-serving-cert\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.327283 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.327160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-oauth-serving-cert\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.327283 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.327217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-oauth-config\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.428147 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.428116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-config\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.428313 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.428154 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-service-ca\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.428313 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.428177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-trusted-ca-bundle\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.428313 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.428200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4n6gb\" (UniqueName: \"kubernetes.io/projected/2d27874c-0312-4e2d-a3a7-81e2c23585bb-kube-api-access-4n6gb\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.428313 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.428229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-serving-cert\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.428313 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.428277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-oauth-serving-cert\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.428620 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.428321 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-oauth-config\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.428981 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.428944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-service-ca\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.429096 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.429036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-config\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.429157 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.429143 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-oauth-serving-cert\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.429354 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.429334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-trusted-ca-bundle\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.430819 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.430788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-serving-cert\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.430917 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.430834 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-oauth-config\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.434825 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.434799 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n6gb\" (UniqueName: \"kubernetes.io/projected/2d27874c-0312-4e2d-a3a7-81e2c23585bb-kube-api-access-4n6gb\") pod \"console-6cf9bf6966-28q8s\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.590844 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.590731 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:45.707497 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:45.707467 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cf9bf6966-28q8s"]
Apr 16 14:56:45.710518 ip-10-0-130-140 kubenswrapper[2576]: W0416 14:56:45.710496 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d27874c_0312_4e2d_a3a7_81e2c23585bb.slice/crio-742ebe686db81b10047c9b0e501fab14fe10c4a68f1ec8918c28a39d0bdd9e80 WatchSource:0}: Error finding container 742ebe686db81b10047c9b0e501fab14fe10c4a68f1ec8918c28a39d0bdd9e80: Status 404 returned error can't find the container with id 742ebe686db81b10047c9b0e501fab14fe10c4a68f1ec8918c28a39d0bdd9e80
Apr 16 14:56:46.657618 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:46.657576 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cf9bf6966-28q8s" event={"ID":"2d27874c-0312-4e2d-a3a7-81e2c23585bb","Type":"ContainerStarted","Data":"ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6"}
Apr 16 14:56:46.657618 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:46.657614 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cf9bf6966-28q8s" event={"ID":"2d27874c-0312-4e2d-a3a7-81e2c23585bb","Type":"ContainerStarted","Data":"742ebe686db81b10047c9b0e501fab14fe10c4a68f1ec8918c28a39d0bdd9e80"}
Apr 16 14:56:46.672617 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:46.672578 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cf9bf6966-28q8s" podStartSLOduration=1.672565144 podStartE2EDuration="1.672565144s" podCreationTimestamp="2026-04-16 14:56:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:56:46.672034249 +0000 UTC m=+257.590395465" watchObservedRunningTime="2026-04-16 14:56:46.672565144 +0000 UTC m=+257.590926343"
Apr 16 14:56:55.591742 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:55.591706 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:55.591742 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:55.591746 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:55.596474 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:55.596450 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:55.686811 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:55.686789 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cf9bf6966-28q8s"
Apr 16 14:56:55.730221 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:56:55.730193 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5956848ff4-k7qwb"]
Apr 16 14:57:20.749375 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:20.749313 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5956848ff4-k7qwb" podUID="cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" containerName="console" containerID="cri-o://9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8" gracePeriod=15
Apr 16 14:57:20.998213 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:20.998190 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5956848ff4-k7qwb_cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec/console/0.log"
Apr 16 14:57:20.998339 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:20.998255 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:57:21.029964 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.029884 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-oauth-config\") pod \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") "
Apr 16 14:57:21.029964 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.029923 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pw57\" (UniqueName: \"kubernetes.io/projected/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-kube-api-access-4pw57\") pod \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") "
Apr 16 14:57:21.029964 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.029942 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-trusted-ca-bundle\") pod \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") "
Apr 16 14:57:21.030242 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.029971 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-oauth-serving-cert\") pod \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") "
Apr 16 14:57:21.030242 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.030017 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-config\") pod \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") "
Apr 16 14:57:21.030242 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.030068 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-service-ca\") pod \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") "
Apr 16 14:57:21.030242 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.030116 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-serving-cert\") pod \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\" (UID: \"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec\") "
Apr 16 14:57:21.030597 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.030532 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" (UID: "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:57:21.030597 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.030535 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-config" (OuterVolumeSpecName: "console-config") pod "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" (UID: "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:57:21.030728 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.030601 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-service-ca" (OuterVolumeSpecName: "service-ca") pod "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" (UID: "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:57:21.030831 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.030804 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" (UID: "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:57:21.032138 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.032112 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" (UID: "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:57:21.032241 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.032155 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-kube-api-access-4pw57" (OuterVolumeSpecName: "kube-api-access-4pw57") pod "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" (UID: "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec"). InnerVolumeSpecName "kube-api-access-4pw57". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:57:21.032241 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.032182 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" (UID: "cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:57:21.131174 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.131143 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-config\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:57:21.131174 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.131170 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-service-ca\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:57:21.131174 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.131180 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-serving-cert\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:57:21.131385 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.131190 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-console-oauth-config\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:57:21.131385 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.131199 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4pw57\" (UniqueName: \"kubernetes.io/projected/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-kube-api-access-4pw57\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:57:21.131385 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.131209 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-trusted-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:57:21.131385 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.131217 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec-oauth-serving-cert\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 14:57:21.761291 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.761266 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5956848ff4-k7qwb_cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec/console/0.log"
Apr 16 14:57:21.761676 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.761305 2576 generic.go:358] "Generic (PLEG): container finished" podID="cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" containerID="9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8" exitCode=2
Apr 16 14:57:21.761676 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.761352 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5956848ff4-k7qwb" event={"ID":"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec","Type":"ContainerDied","Data":"9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8"}
Apr 16 14:57:21.761676 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.761368 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5956848ff4-k7qwb"
Apr 16 14:57:21.761676 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.761383 2576 scope.go:117] "RemoveContainer" containerID="9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8"
Apr 16 14:57:21.761676 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.761373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5956848ff4-k7qwb" event={"ID":"cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec","Type":"ContainerDied","Data":"996df97f7d74b2625d8fa823ee7005532f61096e5a11db12734d2c6c30e5287e"}
Apr 16 14:57:21.769454 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.769438 2576 scope.go:117] "RemoveContainer" containerID="9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8"
Apr 16 14:57:21.769676 ip-10-0-130-140 kubenswrapper[2576]: E0416 14:57:21.769657 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8\": container with ID starting with 9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8 not found: ID does not exist" containerID="9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8"
Apr 16 14:57:21.769728 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.769684 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8"} err="failed to get container status \"9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8\": rpc error: code = NotFound desc = could not find container \"9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8\": container with ID starting with 9efa45047c9ba4180b8fb4b58c35e09fa1e44e7955aea04ec890e60ba263d5f8 not found: ID does not exist"
Apr 16 14:57:21.778741 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.778720 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5956848ff4-k7qwb"]
Apr 16 14:57:21.782192 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.782175 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5956848ff4-k7qwb"]
Apr 16 14:57:21.805983 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:21.805961 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" path="/var/lib/kubelet/pods/cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec/volumes"
Apr 16 14:57:29.642100 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:29.642075 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 14:57:29.642654 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:29.642633 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 14:57:29.644884 ip-10-0-130-140 kubenswrapper[2576]: I0416 14:57:29.644863 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 15:00:41.526039 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.526004 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-688c748945-2xq7q"]
Apr 16 15:00:41.526523 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.526295 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" containerName="console"
Apr 16 15:00:41.526523 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.526306 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" containerName="console"
Apr 16 15:00:41.526523 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.526357 2576 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="cb7f47ec-ac6f-4b85-b20b-fb1d94ef5cec" containerName="console" Apr 16 15:00:41.529118 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.529103 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.542487 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.542461 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-688c748945-2xq7q"] Apr 16 15:00:41.597536 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.597507 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ba90034-840a-4eda-afee-61e3998fa8d1-console-config\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.597687 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.597543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xg77\" (UniqueName: \"kubernetes.io/projected/3ba90034-840a-4eda-afee-61e3998fa8d1-kube-api-access-4xg77\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.597687 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.597571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ba90034-840a-4eda-afee-61e3998fa8d1-console-oauth-config\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.597687 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.597635 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3ba90034-840a-4eda-afee-61e3998fa8d1-trusted-ca-bundle\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.597687 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.597668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ba90034-840a-4eda-afee-61e3998fa8d1-oauth-serving-cert\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.597687 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.597686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ba90034-840a-4eda-afee-61e3998fa8d1-service-ca\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.597870 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.597721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ba90034-840a-4eda-afee-61e3998fa8d1-console-serving-cert\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.698088 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.698044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ba90034-840a-4eda-afee-61e3998fa8d1-console-serving-cert\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.698255 ip-10-0-130-140 kubenswrapper[2576]: I0416 
15:00:41.698101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ba90034-840a-4eda-afee-61e3998fa8d1-console-config\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.698255 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.698124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xg77\" (UniqueName: \"kubernetes.io/projected/3ba90034-840a-4eda-afee-61e3998fa8d1-kube-api-access-4xg77\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.698255 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.698151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ba90034-840a-4eda-afee-61e3998fa8d1-console-oauth-config\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.698255 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.698172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ba90034-840a-4eda-afee-61e3998fa8d1-trusted-ca-bundle\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.698255 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.698192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ba90034-840a-4eda-afee-61e3998fa8d1-oauth-serving-cert\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " 
pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.698484 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.698435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ba90034-840a-4eda-afee-61e3998fa8d1-service-ca\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.698971 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.698947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ba90034-840a-4eda-afee-61e3998fa8d1-console-config\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.699059 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.698991 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ba90034-840a-4eda-afee-61e3998fa8d1-oauth-serving-cert\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.699059 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.699021 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ba90034-840a-4eda-afee-61e3998fa8d1-service-ca\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.699125 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.699083 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ba90034-840a-4eda-afee-61e3998fa8d1-trusted-ca-bundle\") pod \"console-688c748945-2xq7q\" (UID: 
\"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.701029 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.701009 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ba90034-840a-4eda-afee-61e3998fa8d1-console-oauth-config\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.701175 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.701156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ba90034-840a-4eda-afee-61e3998fa8d1-console-serving-cert\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.706012 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.705991 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xg77\" (UniqueName: \"kubernetes.io/projected/3ba90034-840a-4eda-afee-61e3998fa8d1-kube-api-access-4xg77\") pod \"console-688c748945-2xq7q\" (UID: \"3ba90034-840a-4eda-afee-61e3998fa8d1\") " pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.838721 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.838653 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:41.956577 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.956540 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-688c748945-2xq7q"] Apr 16 15:00:41.959443 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:00:41.959417 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ba90034_840a_4eda_afee_61e3998fa8d1.slice/crio-5b522de6344e6e92709e7e2dd354ec525aceb03fb7ecca3620abb963a56d7628 WatchSource:0}: Error finding container 5b522de6344e6e92709e7e2dd354ec525aceb03fb7ecca3620abb963a56d7628: Status 404 returned error can't find the container with id 5b522de6344e6e92709e7e2dd354ec525aceb03fb7ecca3620abb963a56d7628 Apr 16 15:00:41.961215 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:41.961199 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:00:42.317732 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:42.317696 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-688c748945-2xq7q" event={"ID":"3ba90034-840a-4eda-afee-61e3998fa8d1","Type":"ContainerStarted","Data":"3d1aca2d11d91235ab3c7644c7bd2ebf4fa0ea9e7fc45db4945869cd5691b6e5"} Apr 16 15:00:42.317732 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:42.317732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-688c748945-2xq7q" event={"ID":"3ba90034-840a-4eda-afee-61e3998fa8d1","Type":"ContainerStarted","Data":"5b522de6344e6e92709e7e2dd354ec525aceb03fb7ecca3620abb963a56d7628"} Apr 16 15:00:42.333927 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:42.333883 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-688c748945-2xq7q" podStartSLOduration=1.33386716 podStartE2EDuration="1.33386716s" podCreationTimestamp="2026-04-16 15:00:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:00:42.331327677 +0000 UTC m=+493.249688883" watchObservedRunningTime="2026-04-16 15:00:42.33386716 +0000 UTC m=+493.252228366" Apr 16 15:00:51.838874 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:51.838793 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:51.838874 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:51.838831 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:51.843239 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:51.843217 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:52.350429 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:52.350403 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-688c748945-2xq7q" Apr 16 15:00:52.390943 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:00:52.390915 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cf9bf6966-28q8s"] Apr 16 15:01:17.410358 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.410281 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6cf9bf6966-28q8s" podUID="2d27874c-0312-4e2d-a3a7-81e2c23585bb" containerName="console" containerID="cri-o://ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6" gracePeriod=15 Apr 16 15:01:17.648456 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.648436 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cf9bf6966-28q8s_2d27874c-0312-4e2d-a3a7-81e2c23585bb/console/0.log" Apr 16 15:01:17.648554 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.648497 2576 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cf9bf6966-28q8s" Apr 16 15:01:17.799420 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.799339 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-serving-cert\") pod \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " Apr 16 15:01:17.799420 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.799396 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-service-ca\") pod \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " Apr 16 15:01:17.799420 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.799423 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-oauth-config\") pod \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " Apr 16 15:01:17.799675 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.799441 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n6gb\" (UniqueName: \"kubernetes.io/projected/2d27874c-0312-4e2d-a3a7-81e2c23585bb-kube-api-access-4n6gb\") pod \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " Apr 16 15:01:17.799675 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.799463 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-trusted-ca-bundle\") pod \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\" (UID: 
\"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " Apr 16 15:01:17.799675 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.799499 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-oauth-serving-cert\") pod \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " Apr 16 15:01:17.799675 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.799529 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-config\") pod \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\" (UID: \"2d27874c-0312-4e2d-a3a7-81e2c23585bb\") " Apr 16 15:01:17.799984 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.799953 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-service-ca" (OuterVolumeSpecName: "service-ca") pod "2d27874c-0312-4e2d-a3a7-81e2c23585bb" (UID: "2d27874c-0312-4e2d-a3a7-81e2c23585bb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:01:17.800044 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.799988 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2d27874c-0312-4e2d-a3a7-81e2c23585bb" (UID: "2d27874c-0312-4e2d-a3a7-81e2c23585bb"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:01:17.800044 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.799998 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2d27874c-0312-4e2d-a3a7-81e2c23585bb" (UID: "2d27874c-0312-4e2d-a3a7-81e2c23585bb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:01:17.800114 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.800055 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-config" (OuterVolumeSpecName: "console-config") pod "2d27874c-0312-4e2d-a3a7-81e2c23585bb" (UID: "2d27874c-0312-4e2d-a3a7-81e2c23585bb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:01:17.801625 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.801603 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d27874c-0312-4e2d-a3a7-81e2c23585bb-kube-api-access-4n6gb" (OuterVolumeSpecName: "kube-api-access-4n6gb") pod "2d27874c-0312-4e2d-a3a7-81e2c23585bb" (UID: "2d27874c-0312-4e2d-a3a7-81e2c23585bb"). InnerVolumeSpecName "kube-api-access-4n6gb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:01:17.801977 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.801949 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2d27874c-0312-4e2d-a3a7-81e2c23585bb" (UID: "2d27874c-0312-4e2d-a3a7-81e2c23585bb"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:01:17.801977 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.801967 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2d27874c-0312-4e2d-a3a7-81e2c23585bb" (UID: "2d27874c-0312-4e2d-a3a7-81e2c23585bb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:01:17.900311 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.900285 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-serving-cert\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:01:17.900311 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.900309 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-service-ca\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:01:17.900474 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.900320 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-oauth-config\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:01:17.900474 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.900329 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4n6gb\" (UniqueName: \"kubernetes.io/projected/2d27874c-0312-4e2d-a3a7-81e2c23585bb-kube-api-access-4n6gb\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:01:17.900474 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.900339 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-trusted-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:01:17.900474 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.900348 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-oauth-serving-cert\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:01:17.900474 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:17.900357 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d27874c-0312-4e2d-a3a7-81e2c23585bb-console-config\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:01:18.423327 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:18.423303 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cf9bf6966-28q8s_2d27874c-0312-4e2d-a3a7-81e2c23585bb/console/0.log" Apr 16 15:01:18.423750 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:18.423341 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d27874c-0312-4e2d-a3a7-81e2c23585bb" containerID="ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6" exitCode=2 Apr 16 15:01:18.423750 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:18.423409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cf9bf6966-28q8s" event={"ID":"2d27874c-0312-4e2d-a3a7-81e2c23585bb","Type":"ContainerDied","Data":"ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6"} Apr 16 15:01:18.423750 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:18.423415 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cf9bf6966-28q8s" Apr 16 15:01:18.423750 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:18.423434 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cf9bf6966-28q8s" event={"ID":"2d27874c-0312-4e2d-a3a7-81e2c23585bb","Type":"ContainerDied","Data":"742ebe686db81b10047c9b0e501fab14fe10c4a68f1ec8918c28a39d0bdd9e80"} Apr 16 15:01:18.423750 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:18.423449 2576 scope.go:117] "RemoveContainer" containerID="ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6" Apr 16 15:01:18.431374 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:18.431354 2576 scope.go:117] "RemoveContainer" containerID="ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6" Apr 16 15:01:18.431653 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:01:18.431634 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6\": container with ID starting with ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6 not found: ID does not exist" containerID="ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6" Apr 16 15:01:18.431730 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:18.431660 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6"} err="failed to get container status \"ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6\": rpc error: code = NotFound desc = could not find container \"ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6\": container with ID starting with ed1164d0eeec7093e264b59f2d29a7d776d9aab06feba77f5551effca71737c6 not found: ID does not exist" Apr 16 15:01:18.441346 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:18.438074 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cf9bf6966-28q8s"] Apr 16 15:01:18.444144 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:18.444121 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6cf9bf6966-28q8s"] Apr 16 15:01:19.805202 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:19.805160 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d27874c-0312-4e2d-a3a7-81e2c23585bb" path="/var/lib/kubelet/pods/2d27874c-0312-4e2d-a3a7-81e2c23585bb/volumes" Apr 16 15:01:38.315181 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.315148 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-x59c6"] Apr 16 15:01:38.315690 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.315536 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d27874c-0312-4e2d-a3a7-81e2c23585bb" containerName="console" Apr 16 15:01:38.315690 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.315551 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d27874c-0312-4e2d-a3a7-81e2c23585bb" containerName="console" Apr 16 15:01:38.315690 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.315634 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d27874c-0312-4e2d-a3a7-81e2c23585bb" containerName="console" Apr 16 15:01:38.319881 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.319862 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x59c6" Apr 16 15:01:38.321583 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.321566 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 15:01:38.324341 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.324319 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x59c6"] Apr 16 15:01:38.351855 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.351827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/321149cb-c540-42e4-89ca-d79124957dca-dbus\") pod \"global-pull-secret-syncer-x59c6\" (UID: \"321149cb-c540-42e4-89ca-d79124957dca\") " pod="kube-system/global-pull-secret-syncer-x59c6" Apr 16 15:01:38.351986 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.351887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/321149cb-c540-42e4-89ca-d79124957dca-kubelet-config\") pod \"global-pull-secret-syncer-x59c6\" (UID: \"321149cb-c540-42e4-89ca-d79124957dca\") " pod="kube-system/global-pull-secret-syncer-x59c6" Apr 16 15:01:38.351986 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.351907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/321149cb-c540-42e4-89ca-d79124957dca-original-pull-secret\") pod \"global-pull-secret-syncer-x59c6\" (UID: \"321149cb-c540-42e4-89ca-d79124957dca\") " pod="kube-system/global-pull-secret-syncer-x59c6" Apr 16 15:01:38.452520 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.452494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/321149cb-c540-42e4-89ca-d79124957dca-dbus\") pod \"global-pull-secret-syncer-x59c6\" (UID: \"321149cb-c540-42e4-89ca-d79124957dca\") " pod="kube-system/global-pull-secret-syncer-x59c6" Apr 16 15:01:38.452648 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.452558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/321149cb-c540-42e4-89ca-d79124957dca-kubelet-config\") pod \"global-pull-secret-syncer-x59c6\" (UID: \"321149cb-c540-42e4-89ca-d79124957dca\") " pod="kube-system/global-pull-secret-syncer-x59c6" Apr 16 15:01:38.452648 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.452580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/321149cb-c540-42e4-89ca-d79124957dca-original-pull-secret\") pod \"global-pull-secret-syncer-x59c6\" (UID: \"321149cb-c540-42e4-89ca-d79124957dca\") " pod="kube-system/global-pull-secret-syncer-x59c6" Apr 16 15:01:38.452756 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.452677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/321149cb-c540-42e4-89ca-d79124957dca-kubelet-config\") pod \"global-pull-secret-syncer-x59c6\" (UID: \"321149cb-c540-42e4-89ca-d79124957dca\") " pod="kube-system/global-pull-secret-syncer-x59c6" Apr 16 15:01:38.452756 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.452702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/321149cb-c540-42e4-89ca-d79124957dca-dbus\") pod \"global-pull-secret-syncer-x59c6\" (UID: \"321149cb-c540-42e4-89ca-d79124957dca\") " pod="kube-system/global-pull-secret-syncer-x59c6" Apr 16 15:01:38.454800 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.454785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/321149cb-c540-42e4-89ca-d79124957dca-original-pull-secret\") pod \"global-pull-secret-syncer-x59c6\" (UID: \"321149cb-c540-42e4-89ca-d79124957dca\") " pod="kube-system/global-pull-secret-syncer-x59c6" Apr 16 15:01:38.629158 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.629064 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x59c6" Apr 16 15:01:38.744556 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:38.744469 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x59c6"] Apr 16 15:01:38.747149 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:01:38.747126 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod321149cb_c540_42e4_89ca_d79124957dca.slice/crio-d1086e0902daba093022e3210e046eb1533d7374cd14fe5141f06628c743658e WatchSource:0}: Error finding container d1086e0902daba093022e3210e046eb1533d7374cd14fe5141f06628c743658e: Status 404 returned error can't find the container with id d1086e0902daba093022e3210e046eb1533d7374cd14fe5141f06628c743658e Apr 16 15:01:39.485587 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:39.485550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x59c6" event={"ID":"321149cb-c540-42e4-89ca-d79124957dca","Type":"ContainerStarted","Data":"d1086e0902daba093022e3210e046eb1533d7374cd14fe5141f06628c743658e"} Apr 16 15:01:43.499108 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:43.499068 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x59c6" event={"ID":"321149cb-c540-42e4-89ca-d79124957dca","Type":"ContainerStarted","Data":"02e9f96e16226803dc0cc11f1babd9011096d61e635331aa3e58ff904c79107e"} Apr 16 15:01:43.512257 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:43.512137 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-x59c6" podStartSLOduration=1.312105075 podStartE2EDuration="5.512117356s" podCreationTimestamp="2026-04-16 15:01:38 +0000 UTC" firstStartedPulling="2026-04-16 15:01:38.74868725 +0000 UTC m=+549.667048446" lastFinishedPulling="2026-04-16 15:01:42.948699532 +0000 UTC m=+553.867060727" observedRunningTime="2026-04-16 15:01:43.511969042 +0000 UTC m=+554.430330248" watchObservedRunningTime="2026-04-16 15:01:43.512117356 +0000 UTC m=+554.430478562" Apr 16 15:01:59.405244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.405211 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk"] Apr 16 15:01:59.408555 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.408539 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:01:59.410610 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.410589 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6qz8q\"" Apr 16 15:01:59.410741 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.410589 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 15:01:59.411007 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.410989 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 15:01:59.415128 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.415104 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk"] Apr 16 15:01:59.508280 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.508235 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee91099e-aa1c-4e71-a04c-dc3607d60301-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk\" (UID: \"ee91099e-aa1c-4e71-a04c-dc3607d60301\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:01:59.508280 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.508288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee91099e-aa1c-4e71-a04c-dc3607d60301-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk\" (UID: \"ee91099e-aa1c-4e71-a04c-dc3607d60301\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:01:59.508484 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.508318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mdbc\" (UniqueName: \"kubernetes.io/projected/ee91099e-aa1c-4e71-a04c-dc3607d60301-kube-api-access-9mdbc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk\" (UID: \"ee91099e-aa1c-4e71-a04c-dc3607d60301\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:01:59.608935 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.608872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee91099e-aa1c-4e71-a04c-dc3607d60301-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk\" (UID: \"ee91099e-aa1c-4e71-a04c-dc3607d60301\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:01:59.608935 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.608941 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee91099e-aa1c-4e71-a04c-dc3607d60301-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk\" (UID: \"ee91099e-aa1c-4e71-a04c-dc3607d60301\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:01:59.609189 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.608961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mdbc\" (UniqueName: \"kubernetes.io/projected/ee91099e-aa1c-4e71-a04c-dc3607d60301-kube-api-access-9mdbc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk\" (UID: \"ee91099e-aa1c-4e71-a04c-dc3607d60301\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:01:59.609359 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.609335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee91099e-aa1c-4e71-a04c-dc3607d60301-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk\" (UID: \"ee91099e-aa1c-4e71-a04c-dc3607d60301\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:01:59.609428 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.609350 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee91099e-aa1c-4e71-a04c-dc3607d60301-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk\" (UID: \"ee91099e-aa1c-4e71-a04c-dc3607d60301\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:01:59.616185 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.616164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mdbc\" 
(UniqueName: \"kubernetes.io/projected/ee91099e-aa1c-4e71-a04c-dc3607d60301-kube-api-access-9mdbc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk\" (UID: \"ee91099e-aa1c-4e71-a04c-dc3607d60301\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:01:59.718138 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.718048 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:01:59.838967 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:01:59.838945 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk"] Apr 16 15:01:59.841459 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:01:59.841427 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee91099e_aa1c_4e71_a04c_dc3607d60301.slice/crio-2d37695be458040522a66028eb473256f81c363896a7c4c9395327a1f490e313 WatchSource:0}: Error finding container 2d37695be458040522a66028eb473256f81c363896a7c4c9395327a1f490e313: Status 404 returned error can't find the container with id 2d37695be458040522a66028eb473256f81c363896a7c4c9395327a1f490e313 Apr 16 15:02:00.550623 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:00.550578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" event={"ID":"ee91099e-aa1c-4e71-a04c-dc3607d60301","Type":"ContainerStarted","Data":"2d37695be458040522a66028eb473256f81c363896a7c4c9395327a1f490e313"} Apr 16 15:02:05.567659 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:05.567622 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee91099e-aa1c-4e71-a04c-dc3607d60301" containerID="7c02a9463410b664561c090cd40d20de3fae922661dfdbbe59178357b6e5ac81" 
exitCode=0 Apr 16 15:02:05.568059 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:05.567668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" event={"ID":"ee91099e-aa1c-4e71-a04c-dc3607d60301","Type":"ContainerDied","Data":"7c02a9463410b664561c090cd40d20de3fae922661dfdbbe59178357b6e5ac81"} Apr 16 15:02:07.574453 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:07.574419 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee91099e-aa1c-4e71-a04c-dc3607d60301" containerID="13e05901aced2f0855fbfe87e57d4e8f40eed80fd1bf4a90ced6106eed2ffd36" exitCode=0 Apr 16 15:02:07.574821 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:07.574496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" event={"ID":"ee91099e-aa1c-4e71-a04c-dc3607d60301","Type":"ContainerDied","Data":"13e05901aced2f0855fbfe87e57d4e8f40eed80fd1bf4a90ced6106eed2ffd36"} Apr 16 15:02:13.595744 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:13.595705 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee91099e-aa1c-4e71-a04c-dc3607d60301" containerID="763b0b4ac2e4c46d02cf851ff1df0ccc8fcb6c50ce16bff38727d710ceae77c6" exitCode=0 Apr 16 15:02:13.596136 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:13.595759 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" event={"ID":"ee91099e-aa1c-4e71-a04c-dc3607d60301","Type":"ContainerDied","Data":"763b0b4ac2e4c46d02cf851ff1df0ccc8fcb6c50ce16bff38727d710ceae77c6"} Apr 16 15:02:14.714840 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:14.714819 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:02:14.846042 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:14.846015 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee91099e-aa1c-4e71-a04c-dc3607d60301-util\") pod \"ee91099e-aa1c-4e71-a04c-dc3607d60301\" (UID: \"ee91099e-aa1c-4e71-a04c-dc3607d60301\") " Apr 16 15:02:14.846230 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:14.846068 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee91099e-aa1c-4e71-a04c-dc3607d60301-bundle\") pod \"ee91099e-aa1c-4e71-a04c-dc3607d60301\" (UID: \"ee91099e-aa1c-4e71-a04c-dc3607d60301\") " Apr 16 15:02:14.846230 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:14.846130 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mdbc\" (UniqueName: \"kubernetes.io/projected/ee91099e-aa1c-4e71-a04c-dc3607d60301-kube-api-access-9mdbc\") pod \"ee91099e-aa1c-4e71-a04c-dc3607d60301\" (UID: \"ee91099e-aa1c-4e71-a04c-dc3607d60301\") " Apr 16 15:02:14.846723 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:14.846688 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee91099e-aa1c-4e71-a04c-dc3607d60301-bundle" (OuterVolumeSpecName: "bundle") pod "ee91099e-aa1c-4e71-a04c-dc3607d60301" (UID: "ee91099e-aa1c-4e71-a04c-dc3607d60301"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:02:14.848338 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:14.848308 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee91099e-aa1c-4e71-a04c-dc3607d60301-kube-api-access-9mdbc" (OuterVolumeSpecName: "kube-api-access-9mdbc") pod "ee91099e-aa1c-4e71-a04c-dc3607d60301" (UID: "ee91099e-aa1c-4e71-a04c-dc3607d60301"). InnerVolumeSpecName "kube-api-access-9mdbc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:02:14.850864 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:14.850823 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee91099e-aa1c-4e71-a04c-dc3607d60301-util" (OuterVolumeSpecName: "util") pod "ee91099e-aa1c-4e71-a04c-dc3607d60301" (UID: "ee91099e-aa1c-4e71-a04c-dc3607d60301"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:02:14.947295 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:14.947263 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee91099e-aa1c-4e71-a04c-dc3607d60301-util\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:02:14.947295 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:14.947292 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee91099e-aa1c-4e71-a04c-dc3607d60301-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:02:14.947514 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:14.947305 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9mdbc\" (UniqueName: \"kubernetes.io/projected/ee91099e-aa1c-4e71-a04c-dc3607d60301-kube-api-access-9mdbc\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:02:15.602734 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:15.602701 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" event={"ID":"ee91099e-aa1c-4e71-a04c-dc3607d60301","Type":"ContainerDied","Data":"2d37695be458040522a66028eb473256f81c363896a7c4c9395327a1f490e313"} Apr 16 15:02:15.602734 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:15.602737 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d37695be458040522a66028eb473256f81c363896a7c4c9395327a1f490e313" Apr 16 15:02:15.602950 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:15.602709 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqlrbk" Apr 16 15:02:20.766998 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.766962 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq"] Apr 16 15:02:20.767407 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.767270 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee91099e-aa1c-4e71-a04c-dc3607d60301" containerName="extract" Apr 16 15:02:20.767407 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.767282 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee91099e-aa1c-4e71-a04c-dc3607d60301" containerName="extract" Apr 16 15:02:20.767407 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.767291 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee91099e-aa1c-4e71-a04c-dc3607d60301" containerName="util" Apr 16 15:02:20.767407 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.767297 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee91099e-aa1c-4e71-a04c-dc3607d60301" containerName="util" Apr 16 15:02:20.767407 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.767308 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="ee91099e-aa1c-4e71-a04c-dc3607d60301" containerName="pull" Apr 16 15:02:20.767407 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.767313 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee91099e-aa1c-4e71-a04c-dc3607d60301" containerName="pull" Apr 16 15:02:20.767407 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.767371 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee91099e-aa1c-4e71-a04c-dc3607d60301" containerName="extract" Apr 16 15:02:20.770191 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.770176 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq" Apr 16 15:02:20.772012 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.771995 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 15:02:20.772110 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.771996 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 15:02:20.772110 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.772092 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 15:02:20.772209 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.772095 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-7s7j2\"" Apr 16 15:02:20.780042 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.780021 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq"] Apr 16 15:02:20.901464 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.901429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtdbt\" (UniqueName: 
\"kubernetes.io/projected/79bfeb14-46ac-4efa-96a7-1ce9f7a70262-kube-api-access-vtdbt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq\" (UID: \"79bfeb14-46ac-4efa-96a7-1ce9f7a70262\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq" Apr 16 15:02:20.901464 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:20.901467 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/79bfeb14-46ac-4efa-96a7-1ce9f7a70262-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq\" (UID: \"79bfeb14-46ac-4efa-96a7-1ce9f7a70262\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq" Apr 16 15:02:21.001909 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:21.001877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtdbt\" (UniqueName: \"kubernetes.io/projected/79bfeb14-46ac-4efa-96a7-1ce9f7a70262-kube-api-access-vtdbt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq\" (UID: \"79bfeb14-46ac-4efa-96a7-1ce9f7a70262\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq" Apr 16 15:02:21.001909 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:21.001909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/79bfeb14-46ac-4efa-96a7-1ce9f7a70262-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq\" (UID: \"79bfeb14-46ac-4efa-96a7-1ce9f7a70262\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq" Apr 16 15:02:21.004145 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:21.004126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/79bfeb14-46ac-4efa-96a7-1ce9f7a70262-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq\" (UID: 
\"79bfeb14-46ac-4efa-96a7-1ce9f7a70262\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq" Apr 16 15:02:21.008841 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:21.008818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtdbt\" (UniqueName: \"kubernetes.io/projected/79bfeb14-46ac-4efa-96a7-1ce9f7a70262-kube-api-access-vtdbt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq\" (UID: \"79bfeb14-46ac-4efa-96a7-1ce9f7a70262\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq" Apr 16 15:02:21.080061 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:21.079987 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq" Apr 16 15:02:21.198346 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:21.198301 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq"] Apr 16 15:02:21.202456 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:02:21.202431 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79bfeb14_46ac_4efa_96a7_1ce9f7a70262.slice/crio-de50f345717e2b3e51fe1d447876731d47b79af56bf45d548b47880816c198eb WatchSource:0}: Error finding container de50f345717e2b3e51fe1d447876731d47b79af56bf45d548b47880816c198eb: Status 404 returned error can't find the container with id de50f345717e2b3e51fe1d447876731d47b79af56bf45d548b47880816c198eb Apr 16 15:02:21.621785 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:21.621733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq" event={"ID":"79bfeb14-46ac-4efa-96a7-1ce9f7a70262","Type":"ContainerStarted","Data":"de50f345717e2b3e51fe1d447876731d47b79af56bf45d548b47880816c198eb"} Apr 16 15:02:25.259824 ip-10-0-130-140 kubenswrapper[2576]: 
I0416 15:02:25.259789 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-mmflt"] Apr 16 15:02:25.262971 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.262954 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-mmflt" Apr 16 15:02:25.264827 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.264806 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 15:02:25.264827 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.264825 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-2mvd5\"" Apr 16 15:02:25.265005 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.264874 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 15:02:25.268971 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.268950 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-mmflt"] Apr 16 15:02:25.344135 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.344104 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzs26\" (UniqueName: \"kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-kube-api-access-wzs26\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " pod="openshift-keda/keda-operator-ffbb595cb-mmflt" Apr 16 15:02:25.344275 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.344202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " 
pod="openshift-keda/keda-operator-ffbb595cb-mmflt" Apr 16 15:02:25.344275 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.344231 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2fa9522a-d304-4f56-95e7-cbf32e96c224-cabundle0\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " pod="openshift-keda/keda-operator-ffbb595cb-mmflt" Apr 16 15:02:25.445231 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.445200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2fa9522a-d304-4f56-95e7-cbf32e96c224-cabundle0\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " pod="openshift-keda/keda-operator-ffbb595cb-mmflt" Apr 16 15:02:25.445429 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.445243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzs26\" (UniqueName: \"kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-kube-api-access-wzs26\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " pod="openshift-keda/keda-operator-ffbb595cb-mmflt" Apr 16 15:02:25.445429 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.445302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " pod="openshift-keda/keda-operator-ffbb595cb-mmflt" Apr 16 15:02:25.445429 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.445408 2576 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 16 15:02:25.445429 ip-10-0-130-140 kubenswrapper[2576]: E0416 
15:02:25.445427 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 16 15:02:25.445601 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.445435 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 15:02:25.445601 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.445451 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-mmflt: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 15:02:25.445601 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.445511 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates podName:2fa9522a-d304-4f56-95e7-cbf32e96c224 nodeName:}" failed. No retries permitted until 2026-04-16 15:02:25.94549381 +0000 UTC m=+596.863855011 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates") pod "keda-operator-ffbb595cb-mmflt" (UID: "2fa9522a-d304-4f56-95e7-cbf32e96c224") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 15:02:25.445917 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.445899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2fa9522a-d304-4f56-95e7-cbf32e96c224-cabundle0\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " pod="openshift-keda/keda-operator-ffbb595cb-mmflt"
Apr 16 15:02:25.452964 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.452935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzs26\" (UniqueName: \"kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-kube-api-access-wzs26\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " pod="openshift-keda/keda-operator-ffbb595cb-mmflt"
Apr 16 15:02:25.529973 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.529901 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"]
Apr 16 15:02:25.533350 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.533334 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:25.535047 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.535026 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 16 15:02:25.540965 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.540944 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"]
Apr 16 15:02:25.637186 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.637150 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq" event={"ID":"79bfeb14-46ac-4efa-96a7-1ce9f7a70262","Type":"ContainerStarted","Data":"b2c808d32fb98b5a60b94d8c58064f0f67a3f8a162d3ca03350ccbc877a70b36"}
Apr 16 15:02:25.637364 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.637327 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq"
Apr 16 15:02:25.651459 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.651425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mlktm\" (UID: \"3cf495c5-f222-4ef0-b820-2aff2850a2f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:25.651608 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.651482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm6t2\" (UniqueName: \"kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-kube-api-access-bm6t2\") pod \"keda-metrics-apiserver-7c9f485588-mlktm\" (UID: \"3cf495c5-f222-4ef0-b820-2aff2850a2f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:25.651608 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.651548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/3cf495c5-f222-4ef0-b820-2aff2850a2f2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-mlktm\" (UID: \"3cf495c5-f222-4ef0-b820-2aff2850a2f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:25.651961 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.651917 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq" podStartSLOduration=2.138723057 podStartE2EDuration="5.651906169s" podCreationTimestamp="2026-04-16 15:02:20 +0000 UTC" firstStartedPulling="2026-04-16 15:02:21.204034477 +0000 UTC m=+592.122395660" lastFinishedPulling="2026-04-16 15:02:24.717217583 +0000 UTC m=+595.635578772" observedRunningTime="2026-04-16 15:02:25.651616796 +0000 UTC m=+596.569978001" watchObservedRunningTime="2026-04-16 15:02:25.651906169 +0000 UTC m=+596.570267374"
Apr 16 15:02:25.753470 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.753431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bm6t2\" (UniqueName: \"kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-kube-api-access-bm6t2\") pod \"keda-metrics-apiserver-7c9f485588-mlktm\" (UID: \"3cf495c5-f222-4ef0-b820-2aff2850a2f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:25.753634 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.753480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/3cf495c5-f222-4ef0-b820-2aff2850a2f2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-mlktm\" (UID: \"3cf495c5-f222-4ef0-b820-2aff2850a2f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:25.753686 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.753651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mlktm\" (UID: \"3cf495c5-f222-4ef0-b820-2aff2850a2f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:25.753863 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.753834 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 16 15:02:25.753863 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.753859 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 15:02:25.753994 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.753880 2576 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 16 15:02:25.753994 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.753901 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 16 15:02:25.753994 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.753952 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/3cf495c5-f222-4ef0-b820-2aff2850a2f2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-mlktm\" (UID: \"3cf495c5-f222-4ef0-b820-2aff2850a2f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:25.753994 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.753964 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-certificates podName:3cf495c5-f222-4ef0-b820-2aff2850a2f2 nodeName:}" failed. No retries permitted until 2026-04-16 15:02:26.25394478 +0000 UTC m=+597.172305977 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-certificates") pod "keda-metrics-apiserver-7c9f485588-mlktm" (UID: "3cf495c5-f222-4ef0-b820-2aff2850a2f2") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 16 15:02:25.761681 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.761652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm6t2\" (UniqueName: \"kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-kube-api-access-bm6t2\") pod \"keda-metrics-apiserver-7c9f485588-mlktm\" (UID: \"3cf495c5-f222-4ef0-b820-2aff2850a2f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:25.888841 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.888754 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-6vfbt"]
Apr 16 15:02:25.892132 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.892115 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6vfbt"
Apr 16 15:02:25.893838 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.893818 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 16 15:02:25.899395 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.899377 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6vfbt"]
Apr 16 15:02:25.955950 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:25.955924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " pod="openshift-keda/keda-operator-ffbb595cb-mmflt"
Apr 16 15:02:25.956118 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.956098 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 16 15:02:25.956191 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.956123 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 15:02:25.956191 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.956136 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-mmflt: references non-existent secret key: ca.crt
Apr 16 15:02:25.956288 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:25.956208 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates podName:2fa9522a-d304-4f56-95e7-cbf32e96c224 nodeName:}" failed. No retries permitted until 2026-04-16 15:02:26.956189514 +0000 UTC m=+597.874550721 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates") pod "keda-operator-ffbb595cb-mmflt" (UID: "2fa9522a-d304-4f56-95e7-cbf32e96c224") : references non-existent secret key: ca.crt
Apr 16 15:02:26.057341 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:26.057299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4cf90981-6984-47f3-bf27-9528de0f1500-certificates\") pod \"keda-admission-cf49989db-6vfbt\" (UID: \"4cf90981-6984-47f3-bf27-9528de0f1500\") " pod="openshift-keda/keda-admission-cf49989db-6vfbt"
Apr 16 15:02:26.057537 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:26.057420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9jkh\" (UniqueName: \"kubernetes.io/projected/4cf90981-6984-47f3-bf27-9528de0f1500-kube-api-access-x9jkh\") pod \"keda-admission-cf49989db-6vfbt\" (UID: \"4cf90981-6984-47f3-bf27-9528de0f1500\") " pod="openshift-keda/keda-admission-cf49989db-6vfbt"
Apr 16 15:02:26.158339 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:26.158301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9jkh\" (UniqueName: \"kubernetes.io/projected/4cf90981-6984-47f3-bf27-9528de0f1500-kube-api-access-x9jkh\") pod \"keda-admission-cf49989db-6vfbt\" (UID: \"4cf90981-6984-47f3-bf27-9528de0f1500\") " pod="openshift-keda/keda-admission-cf49989db-6vfbt"
Apr 16 15:02:26.158525 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:26.158420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4cf90981-6984-47f3-bf27-9528de0f1500-certificates\") pod \"keda-admission-cf49989db-6vfbt\" (UID: \"4cf90981-6984-47f3-bf27-9528de0f1500\") " pod="openshift-keda/keda-admission-cf49989db-6vfbt"
Apr 16 15:02:26.160872 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:26.160836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4cf90981-6984-47f3-bf27-9528de0f1500-certificates\") pod \"keda-admission-cf49989db-6vfbt\" (UID: \"4cf90981-6984-47f3-bf27-9528de0f1500\") " pod="openshift-keda/keda-admission-cf49989db-6vfbt"
Apr 16 15:02:26.166584 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:26.166557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9jkh\" (UniqueName: \"kubernetes.io/projected/4cf90981-6984-47f3-bf27-9528de0f1500-kube-api-access-x9jkh\") pod \"keda-admission-cf49989db-6vfbt\" (UID: \"4cf90981-6984-47f3-bf27-9528de0f1500\") " pod="openshift-keda/keda-admission-cf49989db-6vfbt"
Apr 16 15:02:26.202542 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:26.202509 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6vfbt"
Apr 16 15:02:26.259780 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:26.259218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mlktm\" (UID: \"3cf495c5-f222-4ef0-b820-2aff2850a2f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:26.259780 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:26.259439 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 16 15:02:26.259780 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:26.259455 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 15:02:26.259780 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:26.259477 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm: references non-existent secret key: tls.crt
Apr 16 15:02:26.259780 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:26.259535 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-certificates podName:3cf495c5-f222-4ef0-b820-2aff2850a2f2 nodeName:}" failed. No retries permitted until 2026-04-16 15:02:27.259516656 +0000 UTC m=+598.177877843 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-certificates") pod "keda-metrics-apiserver-7c9f485588-mlktm" (UID: "3cf495c5-f222-4ef0-b820-2aff2850a2f2") : references non-existent secret key: tls.crt
Apr 16 15:02:26.346969 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:26.346935 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6vfbt"]
Apr 16 15:02:26.350039 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:02:26.350010 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf90981_6984_47f3_bf27_9528de0f1500.slice/crio-b7339ff9c49de8ec84b123d0bde5400dcf0f1a3e0aac265605d5003b713da14f WatchSource:0}: Error finding container b7339ff9c49de8ec84b123d0bde5400dcf0f1a3e0aac265605d5003b713da14f: Status 404 returned error can't find the container with id b7339ff9c49de8ec84b123d0bde5400dcf0f1a3e0aac265605d5003b713da14f
Apr 16 15:02:26.641736 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:26.641649 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6vfbt" event={"ID":"4cf90981-6984-47f3-bf27-9528de0f1500","Type":"ContainerStarted","Data":"b7339ff9c49de8ec84b123d0bde5400dcf0f1a3e0aac265605d5003b713da14f"}
Apr 16 15:02:26.964430 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:26.964392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " pod="openshift-keda/keda-operator-ffbb595cb-mmflt"
Apr 16 15:02:26.964616 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:26.964525 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 16 15:02:26.964616 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:26.964542 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 15:02:26.964616 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:26.964552 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-mmflt: references non-existent secret key: ca.crt
Apr 16 15:02:26.964616 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:26.964605 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates podName:2fa9522a-d304-4f56-95e7-cbf32e96c224 nodeName:}" failed. No retries permitted until 2026-04-16 15:02:28.96459028 +0000 UTC m=+599.882951462 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates") pod "keda-operator-ffbb595cb-mmflt" (UID: "2fa9522a-d304-4f56-95e7-cbf32e96c224") : references non-existent secret key: ca.crt
Apr 16 15:02:27.268063 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:27.267980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mlktm\" (UID: \"3cf495c5-f222-4ef0-b820-2aff2850a2f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:27.268488 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:27.268131 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 16 15:02:27.268488 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:27.268152 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 15:02:27.268488 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:27.268171 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm: references non-existent secret key: tls.crt
Apr 16 15:02:27.268488 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:27.268230 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-certificates podName:3cf495c5-f222-4ef0-b820-2aff2850a2f2 nodeName:}" failed. No retries permitted until 2026-04-16 15:02:29.268211167 +0000 UTC m=+600.186572353 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-certificates") pod "keda-metrics-apiserver-7c9f485588-mlktm" (UID: "3cf495c5-f222-4ef0-b820-2aff2850a2f2") : references non-existent secret key: tls.crt
Apr 16 15:02:28.650700 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:28.650663 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6vfbt" event={"ID":"4cf90981-6984-47f3-bf27-9528de0f1500","Type":"ContainerStarted","Data":"a7ca07f6bd3abcb9527e52ab8bc35059ac89cd9f7765f60900fd2bea3330c916"}
Apr 16 15:02:28.651155 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:28.650723 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-6vfbt"
Apr 16 15:02:28.665489 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:28.665447 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-6vfbt" podStartSLOduration=2.17808903 podStartE2EDuration="3.665432752s" podCreationTimestamp="2026-04-16 15:02:25 +0000 UTC" firstStartedPulling="2026-04-16 15:02:26.351363666 +0000 UTC m=+597.269724850" lastFinishedPulling="2026-04-16 15:02:27.838707385 +0000 UTC m=+598.757068572" observedRunningTime="2026-04-16 15:02:28.664218923 +0000 UTC m=+599.582580130" watchObservedRunningTime="2026-04-16 15:02:28.665432752 +0000 UTC m=+599.583793974"
Apr 16 15:02:28.983024 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:28.982934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " pod="openshift-keda/keda-operator-ffbb595cb-mmflt"
Apr 16 15:02:28.983172 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:28.983078 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 16 15:02:28.983172 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:28.983099 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 15:02:28.983172 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:28.983109 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-mmflt: references non-existent secret key: ca.crt
Apr 16 15:02:28.983172 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:02:28.983159 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates podName:2fa9522a-d304-4f56-95e7-cbf32e96c224 nodeName:}" failed. No retries permitted until 2026-04-16 15:02:32.983144165 +0000 UTC m=+603.901505347 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates") pod "keda-operator-ffbb595cb-mmflt" (UID: "2fa9522a-d304-4f56-95e7-cbf32e96c224") : references non-existent secret key: ca.crt
Apr 16 15:02:29.286525 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:29.286442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mlktm\" (UID: \"3cf495c5-f222-4ef0-b820-2aff2850a2f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:29.289109 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:29.289087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3cf495c5-f222-4ef0-b820-2aff2850a2f2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mlktm\" (UID: \"3cf495c5-f222-4ef0-b820-2aff2850a2f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:29.444879 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:29.444842 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:29.563494 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:29.563416 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"]
Apr 16 15:02:29.566434 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:02:29.566407 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cf495c5_f222_4ef0_b820_2aff2850a2f2.slice/crio-49014e70f47fb53ee60c370831f542f8462362058b192a9c88cec530e9b8f9e2 WatchSource:0}: Error finding container 49014e70f47fb53ee60c370831f542f8462362058b192a9c88cec530e9b8f9e2: Status 404 returned error can't find the container with id 49014e70f47fb53ee60c370831f542f8462362058b192a9c88cec530e9b8f9e2
Apr 16 15:02:29.661102 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:29.661059 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm" event={"ID":"3cf495c5-f222-4ef0-b820-2aff2850a2f2","Type":"ContainerStarted","Data":"49014e70f47fb53ee60c370831f542f8462362058b192a9c88cec530e9b8f9e2"}
Apr 16 15:02:29.677173 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:29.677149 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 15:02:29.677593 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:29.677574 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 15:02:33.020634 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:33.020601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " pod="openshift-keda/keda-operator-ffbb595cb-mmflt"
Apr 16 15:02:33.023129 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:33.023094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2fa9522a-d304-4f56-95e7-cbf32e96c224-certificates\") pod \"keda-operator-ffbb595cb-mmflt\" (UID: \"2fa9522a-d304-4f56-95e7-cbf32e96c224\") " pod="openshift-keda/keda-operator-ffbb595cb-mmflt"
Apr 16 15:02:33.074255 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:33.074216 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-mmflt"
Apr 16 15:02:33.195213 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:33.195185 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-mmflt"]
Apr 16 15:02:33.197705 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:02:33.197667 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa9522a_d304_4f56_95e7_cbf32e96c224.slice/crio-4f3ddf7f3d26eedc87a843e9dbea8028d176b8c87fe66add4cd42fe0487e4707 WatchSource:0}: Error finding container 4f3ddf7f3d26eedc87a843e9dbea8028d176b8c87fe66add4cd42fe0487e4707: Status 404 returned error can't find the container with id 4f3ddf7f3d26eedc87a843e9dbea8028d176b8c87fe66add4cd42fe0487e4707
Apr 16 15:02:33.676789 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:33.676746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm" event={"ID":"3cf495c5-f222-4ef0-b820-2aff2850a2f2","Type":"ContainerStarted","Data":"fc8545aaf94eb2435e259bdcd63a21a11fdc8f12e68b4899f943efcba39b0379"}
Apr 16 15:02:33.677018 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:33.676998 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:33.677840 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:33.677818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-mmflt" event={"ID":"2fa9522a-d304-4f56-95e7-cbf32e96c224","Type":"ContainerStarted","Data":"4f3ddf7f3d26eedc87a843e9dbea8028d176b8c87fe66add4cd42fe0487e4707"}
Apr 16 15:02:33.692672 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:33.692633 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm" podStartSLOduration=5.380098426 podStartE2EDuration="8.69261739s" podCreationTimestamp="2026-04-16 15:02:25 +0000 UTC" firstStartedPulling="2026-04-16 15:02:29.568191822 +0000 UTC m=+600.486553006" lastFinishedPulling="2026-04-16 15:02:32.880710784 +0000 UTC m=+603.799071970" observedRunningTime="2026-04-16 15:02:33.6910686 +0000 UTC m=+604.609429805" watchObservedRunningTime="2026-04-16 15:02:33.69261739 +0000 UTC m=+604.610978597"
Apr 16 15:02:39.700700 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:39.700666 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-mmflt" event={"ID":"2fa9522a-d304-4f56-95e7-cbf32e96c224","Type":"ContainerStarted","Data":"6ac1feb0b1d0cba98c50ec8dc2e56b8a91b7d4de3a27ded81a5063fb93108b68"}
Apr 16 15:02:39.701103 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:39.700818 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-mmflt"
Apr 16 15:02:39.716237 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:39.716191 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-mmflt" podStartSLOduration=8.773326258 podStartE2EDuration="14.716176277s" podCreationTimestamp="2026-04-16 15:02:25 +0000 UTC" firstStartedPulling="2026-04-16 15:02:33.199099868 +0000 UTC m=+604.117461051" lastFinishedPulling="2026-04-16 15:02:39.141949885 +0000 UTC m=+610.060311070" observedRunningTime="2026-04-16 15:02:39.714230475 +0000 UTC m=+610.632591680" watchObservedRunningTime="2026-04-16 15:02:39.716176277 +0000 UTC m=+610.634537482"
Apr 16 15:02:44.686045 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:44.686015 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mlktm"
Apr 16 15:02:46.644304 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:46.644271 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k9bkq"
Apr 16 15:02:49.663787 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:02:49.663748 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-6vfbt"
Apr 16 15:03:00.707512 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:00.707483 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-mmflt"
Apr 16 15:03:31.402608 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.402569 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl"]
Apr 16 15:03:31.404641 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.404626 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl"
Apr 16 15:03:31.406350 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.406327 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 15:03:31.406473 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.406457 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-t24mx\""
Apr 16 15:03:31.406875 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.406861 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 15:03:31.407172 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.407156 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 15:03:31.414393 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.414372 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl"]
Apr 16 15:03:31.434219 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.434195 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-hnwm2"]
Apr 16 15:03:31.436346 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.436326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-hnwm2"
Apr 16 15:03:31.438132 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.438106 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-r6gvl\""
Apr 16 15:03:31.438132 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.438131 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 15:03:31.442546 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.442521 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-hnwm2"]
Apr 16 15:03:31.505123 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.505079 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3b3ec15b-9b86-4feb-876d-51e5a458081d-data\") pod \"seaweedfs-86cc847c5c-hnwm2\" (UID: \"3b3ec15b-9b86-4feb-876d-51e5a458081d\") " pod="kserve/seaweedfs-86cc847c5c-hnwm2"
Apr 16 15:03:31.505305 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.505275 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljmp2\" (UniqueName: \"kubernetes.io/projected/3b3ec15b-9b86-4feb-876d-51e5a458081d-kube-api-access-ljmp2\") pod \"seaweedfs-86cc847c5c-hnwm2\" (UID: \"3b3ec15b-9b86-4feb-876d-51e5a458081d\") " pod="kserve/seaweedfs-86cc847c5c-hnwm2"
Apr 16 15:03:31.606261 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.606229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4607d733-d1b9-418f-9925-2ff595da859c-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xj6kl\" (UID: \"4607d733-d1b9-418f-9925-2ff595da859c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl"
Apr 16 15:03:31.606510 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.606294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3b3ec15b-9b86-4feb-876d-51e5a458081d-data\") pod \"seaweedfs-86cc847c5c-hnwm2\" (UID: \"3b3ec15b-9b86-4feb-876d-51e5a458081d\") " pod="kserve/seaweedfs-86cc847c5c-hnwm2"
Apr 16 15:03:31.606510 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.606341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljmp2\" (UniqueName: \"kubernetes.io/projected/3b3ec15b-9b86-4feb-876d-51e5a458081d-kube-api-access-ljmp2\") pod \"seaweedfs-86cc847c5c-hnwm2\" (UID: \"3b3ec15b-9b86-4feb-876d-51e5a458081d\") " pod="kserve/seaweedfs-86cc847c5c-hnwm2"
Apr 16 15:03:31.606510 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.606391 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwx6m\" (UniqueName: \"kubernetes.io/projected/4607d733-d1b9-418f-9925-2ff595da859c-kube-api-access-cwx6m\") pod \"llmisvc-controller-manager-68cc5db7c4-xj6kl\" (UID: \"4607d733-d1b9-418f-9925-2ff595da859c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl"
Apr 16 15:03:31.606686 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.606668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3b3ec15b-9b86-4feb-876d-51e5a458081d-data\") pod \"seaweedfs-86cc847c5c-hnwm2\" (UID: \"3b3ec15b-9b86-4feb-876d-51e5a458081d\") " pod="kserve/seaweedfs-86cc847c5c-hnwm2"
Apr 16 15:03:31.613760 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.613734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljmp2\" (UniqueName: \"kubernetes.io/projected/3b3ec15b-9b86-4feb-876d-51e5a458081d-kube-api-access-ljmp2\") pod \"seaweedfs-86cc847c5c-hnwm2\" (UID: \"3b3ec15b-9b86-4feb-876d-51e5a458081d\") " pod="kserve/seaweedfs-86cc847c5c-hnwm2"
Apr 16 15:03:31.707605 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.707502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwx6m\" (UniqueName: \"kubernetes.io/projected/4607d733-d1b9-418f-9925-2ff595da859c-kube-api-access-cwx6m\") pod \"llmisvc-controller-manager-68cc5db7c4-xj6kl\" (UID: \"4607d733-d1b9-418f-9925-2ff595da859c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl"
Apr 16 15:03:31.707605 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.707553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4607d733-d1b9-418f-9925-2ff595da859c-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xj6kl\" (UID: \"4607d733-d1b9-418f-9925-2ff595da859c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl"
Apr 16 15:03:31.710095 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.710069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4607d733-d1b9-418f-9925-2ff595da859c-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xj6kl\" (UID: \"4607d733-d1b9-418f-9925-2ff595da859c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl"
Apr 16 15:03:31.714711 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.714686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwx6m\" (UniqueName: \"kubernetes.io/projected/4607d733-d1b9-418f-9925-2ff595da859c-kube-api-access-cwx6m\") pod \"llmisvc-controller-manager-68cc5db7c4-xj6kl\" (UID: \"4607d733-d1b9-418f-9925-2ff595da859c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl"
Apr 16 15:03:31.747642 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.747604 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-hnwm2"
Apr 16 15:03:31.869208 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:31.869170 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-hnwm2"]
Apr 16 15:03:31.872275 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:03:31.872244 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b3ec15b_9b86_4feb_876d_51e5a458081d.slice/crio-dc6151273418cd49babb0eee317c5faca87f9f2cf45d594ebb6fb1b55d6d290b WatchSource:0}: Error finding container dc6151273418cd49babb0eee317c5faca87f9f2cf45d594ebb6fb1b55d6d290b: Status 404 returned error can't find the container with id dc6151273418cd49babb0eee317c5faca87f9f2cf45d594ebb6fb1b55d6d290b
Apr 16 15:03:32.015509 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:32.015428 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl"
Apr 16 15:03:32.136227 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:32.136204 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl"]
Apr 16 15:03:32.138589 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:03:32.138561 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4607d733_d1b9_418f_9925_2ff595da859c.slice/crio-6d0377c9ab16efd8d525c10c0e2bbd032d6b1b539e2e1a882da4eb268ed9bf4f WatchSource:0}: Error finding container 6d0377c9ab16efd8d525c10c0e2bbd032d6b1b539e2e1a882da4eb268ed9bf4f: Status 404 returned error can't find the container with id 6d0377c9ab16efd8d525c10c0e2bbd032d6b1b539e2e1a882da4eb268ed9bf4f
Apr 16 15:03:32.879459 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:32.879414 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl"
event={"ID":"4607d733-d1b9-418f-9925-2ff595da859c","Type":"ContainerStarted","Data":"6d0377c9ab16efd8d525c10c0e2bbd032d6b1b539e2e1a882da4eb268ed9bf4f"} Apr 16 15:03:32.880963 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:32.880802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-hnwm2" event={"ID":"3b3ec15b-9b86-4feb-876d-51e5a458081d","Type":"ContainerStarted","Data":"dc6151273418cd49babb0eee317c5faca87f9f2cf45d594ebb6fb1b55d6d290b"} Apr 16 15:03:35.893274 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:35.893243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-hnwm2" event={"ID":"3b3ec15b-9b86-4feb-876d-51e5a458081d","Type":"ContainerStarted","Data":"7eac0b01417b832c784450a6d7d8ed3baf5185c9f15f255926439a21d2203cab"} Apr 16 15:03:35.893660 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:35.893305 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-hnwm2" Apr 16 15:03:35.894569 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:35.894548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl" event={"ID":"4607d733-d1b9-418f-9925-2ff595da859c","Type":"ContainerStarted","Data":"ebbda5f8b304ddce0eaf067b18216d4e053700b108113f70135bfef558638723"} Apr 16 15:03:35.894671 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:35.894661 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl" Apr 16 15:03:35.909378 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:35.909338 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-hnwm2" podStartSLOduration=1.820691769 podStartE2EDuration="4.909324501s" podCreationTimestamp="2026-04-16 15:03:31 +0000 UTC" firstStartedPulling="2026-04-16 15:03:31.873528589 +0000 UTC m=+662.791889772" lastFinishedPulling="2026-04-16 
15:03:34.962161321 +0000 UTC m=+665.880522504" observedRunningTime="2026-04-16 15:03:35.907716941 +0000 UTC m=+666.826078146" watchObservedRunningTime="2026-04-16 15:03:35.909324501 +0000 UTC m=+666.827685758" Apr 16 15:03:35.921229 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:35.921193 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl" podStartSLOduration=2.134443952 podStartE2EDuration="4.921183844s" podCreationTimestamp="2026-04-16 15:03:31 +0000 UTC" firstStartedPulling="2026-04-16 15:03:32.140124838 +0000 UTC m=+663.058486021" lastFinishedPulling="2026-04-16 15:03:34.92686473 +0000 UTC m=+665.845225913" observedRunningTime="2026-04-16 15:03:35.920869202 +0000 UTC m=+666.839230421" watchObservedRunningTime="2026-04-16 15:03:35.921183844 +0000 UTC m=+666.839545049" Apr 16 15:03:41.899910 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:03:41.899884 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-hnwm2" Apr 16 15:04:06.900417 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:04:06.900386 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xj6kl" Apr 16 15:05:17.164540 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.164504 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9"] Apr 16 15:05:17.167961 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.167944 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" Apr 16 15:05:17.169794 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.169754 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5c2zz\"" Apr 16 15:05:17.174555 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.174528 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9"] Apr 16 15:05:17.251422 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.251389 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn"] Apr 16 15:05:17.254860 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.254841 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" Apr 16 15:05:17.264631 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.264607 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn"] Apr 16 15:05:17.265129 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.265114 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" Apr 16 15:05:17.276103 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.276078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbf610c8-a8fa-418e-ba35-19428a90c024-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-695857997-skcx9\" (UID: \"dbf610c8-a8fa-418e-ba35-19428a90c024\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" Apr 16 15:05:17.377405 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.377374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbf610c8-a8fa-418e-ba35-19428a90c024-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-695857997-skcx9\" (UID: \"dbf610c8-a8fa-418e-ba35-19428a90c024\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" Apr 16 15:05:17.377852 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.377830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbf610c8-a8fa-418e-ba35-19428a90c024-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-695857997-skcx9\" (UID: \"dbf610c8-a8fa-418e-ba35-19428a90c024\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" Apr 16 15:05:17.396329 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.396307 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn"] Apr 16 15:05:17.398988 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:05:17.398958 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70861a45_bf20_471b_abae_c5a854e55276.slice/crio-d96dfe3d750d0eb92ea8e979b3ad0783dd748919ce53a02c49f5791a0b030ec6 WatchSource:0}: Error finding container d96dfe3d750d0eb92ea8e979b3ad0783dd748919ce53a02c49f5791a0b030ec6: Status 404 returned error can't find the container with id d96dfe3d750d0eb92ea8e979b3ad0783dd748919ce53a02c49f5791a0b030ec6 Apr 16 15:05:17.478400 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.478287 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" Apr 16 15:05:17.615032 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:17.613117 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9"] Apr 16 15:05:18.238080 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:18.238022 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" event={"ID":"dbf610c8-a8fa-418e-ba35-19428a90c024","Type":"ContainerStarted","Data":"8acdeca2a4106fe824cbf157092b27ddbd0fe71a0951bf74e2217bea9cf72364"} Apr 16 15:05:18.242125 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:18.242092 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" event={"ID":"70861a45-bf20-471b-abae-c5a854e55276","Type":"ContainerStarted","Data":"d96dfe3d750d0eb92ea8e979b3ad0783dd748919ce53a02c49f5791a0b030ec6"} Apr 16 15:05:31.308111 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:31.308071 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" event={"ID":"dbf610c8-a8fa-418e-ba35-19428a90c024","Type":"ContainerStarted","Data":"cc836098ec1df77cfc1a82b8e5ce3a46e1c81ea7397fb1223392066dfc476473"} Apr 16 15:05:31.309456 ip-10-0-130-140 kubenswrapper[2576]: I0416 
15:05:31.309431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" event={"ID":"70861a45-bf20-471b-abae-c5a854e55276","Type":"ContainerStarted","Data":"26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1"} Apr 16 15:05:31.309648 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:31.309624 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" Apr 16 15:05:31.311031 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:31.311006 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" podUID="70861a45-bf20-471b-abae-c5a854e55276" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:05:31.334638 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:31.334592 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" podStartSLOduration=0.891161153 podStartE2EDuration="14.334579452s" podCreationTimestamp="2026-04-16 15:05:17 +0000 UTC" firstStartedPulling="2026-04-16 15:05:17.400615772 +0000 UTC m=+768.318976955" lastFinishedPulling="2026-04-16 15:05:30.844034063 +0000 UTC m=+781.762395254" observedRunningTime="2026-04-16 15:05:31.333727644 +0000 UTC m=+782.252088850" watchObservedRunningTime="2026-04-16 15:05:31.334579452 +0000 UTC m=+782.252940657" Apr 16 15:05:32.313012 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:32.312970 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" podUID="70861a45-bf20-471b-abae-c5a854e55276" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:05:35.324789 ip-10-0-130-140 
kubenswrapper[2576]: I0416 15:05:35.324738 2576 generic.go:358] "Generic (PLEG): container finished" podID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerID="cc836098ec1df77cfc1a82b8e5ce3a46e1c81ea7397fb1223392066dfc476473" exitCode=0 Apr 16 15:05:35.325203 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:35.324812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" event={"ID":"dbf610c8-a8fa-418e-ba35-19428a90c024","Type":"ContainerDied","Data":"cc836098ec1df77cfc1a82b8e5ce3a46e1c81ea7397fb1223392066dfc476473"} Apr 16 15:05:42.313191 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:42.313149 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" podUID="70861a45-bf20-471b-abae-c5a854e55276" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:05:42.352703 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:42.352671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" event={"ID":"dbf610c8-a8fa-418e-ba35-19428a90c024","Type":"ContainerStarted","Data":"785e0156168173280368d1af38a69dc2be30c405c2cdd199882a5cca4e617b4d"} Apr 16 15:05:42.353022 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:42.353001 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" Apr 16 15:05:42.354593 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:42.354563 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 15:05:42.370922 ip-10-0-130-140 kubenswrapper[2576]: 
I0416 15:05:42.370872 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" podStartSLOduration=1.335702588 podStartE2EDuration="25.370861329s" podCreationTimestamp="2026-04-16 15:05:17 +0000 UTC" firstStartedPulling="2026-04-16 15:05:17.616599804 +0000 UTC m=+768.534960987" lastFinishedPulling="2026-04-16 15:05:41.651758537 +0000 UTC m=+792.570119728" observedRunningTime="2026-04-16 15:05:42.369858234 +0000 UTC m=+793.288219450" watchObservedRunningTime="2026-04-16 15:05:42.370861329 +0000 UTC m=+793.289222536" Apr 16 15:05:43.356910 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:43.356863 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 15:05:52.313899 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:52.313850 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" podUID="70861a45-bf20-471b-abae-c5a854e55276" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:05:53.357289 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:05:53.357234 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 15:06:02.313399 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:02.313356 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" podUID="70861a45-bf20-471b-abae-c5a854e55276" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:06:03.357612 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:03.357567 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 15:06:12.313990 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:12.313947 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" podUID="70861a45-bf20-471b-abae-c5a854e55276" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:06:13.357649 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:13.357607 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 15:06:22.313940 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:22.313906 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" Apr 16 15:06:23.357831 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:23.357790 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 15:06:33.357750 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:33.357697 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 15:06:36.973027 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:36.972993 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994"] Apr 16 15:06:36.976366 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:36.976348 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:06:36.978230 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:36.978208 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-d1f99-kube-rbac-proxy-sar-config\"" Apr 16 15:06:36.978356 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:36.978275 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 15:06:36.978356 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:36.978298 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-d1f99-serving-cert\"" Apr 16 15:06:36.982374 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:36.982351 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994"] Apr 16 15:06:37.074561 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:37.074521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f2203ad-4be6-4d90-a337-521909c216ac-proxy-tls\") pod \"switch-graph-d1f99-5f9847b898-pp994\" (UID: \"8f2203ad-4be6-4d90-a337-521909c216ac\") " pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:06:37.074742 ip-10-0-130-140 
kubenswrapper[2576]: I0416 15:06:37.074595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f2203ad-4be6-4d90-a337-521909c216ac-openshift-service-ca-bundle\") pod \"switch-graph-d1f99-5f9847b898-pp994\" (UID: \"8f2203ad-4be6-4d90-a337-521909c216ac\") " pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:06:37.175021 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:37.174988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f2203ad-4be6-4d90-a337-521909c216ac-openshift-service-ca-bundle\") pod \"switch-graph-d1f99-5f9847b898-pp994\" (UID: \"8f2203ad-4be6-4d90-a337-521909c216ac\") " pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:06:37.175206 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:37.175076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f2203ad-4be6-4d90-a337-521909c216ac-proxy-tls\") pod \"switch-graph-d1f99-5f9847b898-pp994\" (UID: \"8f2203ad-4be6-4d90-a337-521909c216ac\") " pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:06:37.175258 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:06:37.175223 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-d1f99-serving-cert: secret "switch-graph-d1f99-serving-cert" not found Apr 16 15:06:37.175300 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:06:37.175290 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f2203ad-4be6-4d90-a337-521909c216ac-proxy-tls podName:8f2203ad-4be6-4d90-a337-521909c216ac nodeName:}" failed. No retries permitted until 2026-04-16 15:06:37.67527406 +0000 UTC m=+848.593635243 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8f2203ad-4be6-4d90-a337-521909c216ac-proxy-tls") pod "switch-graph-d1f99-5f9847b898-pp994" (UID: "8f2203ad-4be6-4d90-a337-521909c216ac") : secret "switch-graph-d1f99-serving-cert" not found Apr 16 15:06:37.175610 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:37.175590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f2203ad-4be6-4d90-a337-521909c216ac-openshift-service-ca-bundle\") pod \"switch-graph-d1f99-5f9847b898-pp994\" (UID: \"8f2203ad-4be6-4d90-a337-521909c216ac\") " pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:06:37.679085 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:37.679046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f2203ad-4be6-4d90-a337-521909c216ac-proxy-tls\") pod \"switch-graph-d1f99-5f9847b898-pp994\" (UID: \"8f2203ad-4be6-4d90-a337-521909c216ac\") " pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:06:37.681454 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:37.681432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f2203ad-4be6-4d90-a337-521909c216ac-proxy-tls\") pod \"switch-graph-d1f99-5f9847b898-pp994\" (UID: \"8f2203ad-4be6-4d90-a337-521909c216ac\") " pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:06:37.887469 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:37.887435 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:06:38.006275 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:38.006241 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994"] Apr 16 15:06:38.009276 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:06:38.009246 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f2203ad_4be6_4d90_a337_521909c216ac.slice/crio-87234b58310fcf6e998c74c2f471cc74d93dc57417b279cf133b3693d955ef8b WatchSource:0}: Error finding container 87234b58310fcf6e998c74c2f471cc74d93dc57417b279cf133b3693d955ef8b: Status 404 returned error can't find the container with id 87234b58310fcf6e998c74c2f471cc74d93dc57417b279cf133b3693d955ef8b Apr 16 15:06:38.011058 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:38.011040 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:06:38.546135 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:38.546100 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" event={"ID":"8f2203ad-4be6-4d90-a337-521909c216ac","Type":"ContainerStarted","Data":"87234b58310fcf6e998c74c2f471cc74d93dc57417b279cf133b3693d955ef8b"} Apr 16 15:06:39.554481 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:39.554440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" event={"ID":"8f2203ad-4be6-4d90-a337-521909c216ac","Type":"ContainerStarted","Data":"9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7"} Apr 16 15:06:39.554966 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:39.554600 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:06:39.567694 ip-10-0-130-140 
kubenswrapper[2576]: I0416 15:06:39.567647 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" podStartSLOduration=2.121136284 podStartE2EDuration="3.567629915s" podCreationTimestamp="2026-04-16 15:06:36 +0000 UTC" firstStartedPulling="2026-04-16 15:06:38.011163711 +0000 UTC m=+848.929524898" lastFinishedPulling="2026-04-16 15:06:39.457657332 +0000 UTC m=+850.376018529" observedRunningTime="2026-04-16 15:06:39.567524634 +0000 UTC m=+850.485885839" watchObservedRunningTime="2026-04-16 15:06:39.567629915 +0000 UTC m=+850.485991122" Apr 16 15:06:43.357323 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:43.357272 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 15:06:45.564328 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:45.564295 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:06:47.206408 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:47.206374 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994"] Apr 16 15:06:47.206792 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:47.206568 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" podUID="8f2203ad-4be6-4d90-a337-521909c216ac" containerName="switch-graph-d1f99" containerID="cri-o://9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7" gracePeriod=30 Apr 16 15:06:47.269523 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:47.269484 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn"] Apr 16 15:06:47.269827 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:47.269780 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" podUID="70861a45-bf20-471b-abae-c5a854e55276" containerName="kserve-container" containerID="cri-o://26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1" gracePeriod=30 Apr 16 15:06:47.434216 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:47.434182 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7"] Apr 16 15:06:47.437710 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:47.437693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" Apr 16 15:06:47.442282 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:47.442257 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7"] Apr 16 15:06:47.447728 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:47.447708 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" Apr 16 15:06:47.567953 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:47.567920 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7"] Apr 16 15:06:47.571568 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:06:47.571544 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ad5a8c3_7ac8_4abc_9aaf_93066d37d18a.slice/crio-d650ba4bf36e32cd6ad42fac579f2da83d83f51b46c9af59471c668c12b72523 WatchSource:0}: Error finding container d650ba4bf36e32cd6ad42fac579f2da83d83f51b46c9af59471c668c12b72523: Status 404 returned error can't find the container with id d650ba4bf36e32cd6ad42fac579f2da83d83f51b46c9af59471c668c12b72523 Apr 16 15:06:47.581726 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:47.581703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" event={"ID":"0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a","Type":"ContainerStarted","Data":"d650ba4bf36e32cd6ad42fac579f2da83d83f51b46c9af59471c668c12b72523"} Apr 16 15:06:48.586531 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:48.586496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" event={"ID":"0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a","Type":"ContainerStarted","Data":"82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919"} Apr 16 15:06:48.586950 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:48.586696 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" Apr 16 15:06:48.587960 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:48.587933 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" podUID="0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:06:48.600972 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:48.600925 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" podStartSLOduration=1.600911175 podStartE2EDuration="1.600911175s" podCreationTimestamp="2026-04-16 15:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:06:48.599404268 +0000 UTC m=+859.517765474" watchObservedRunningTime="2026-04-16 15:06:48.600911175 +0000 UTC m=+859.519272358" Apr 16 15:06:49.590036 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:49.589992 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" podUID="0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:06:50.314738 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:50.314715 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" Apr 16 15:06:50.562801 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:50.562690 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" podUID="8f2203ad-4be6-4d90-a337-521909c216ac" containerName="switch-graph-d1f99" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:06:50.593614 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:50.593576 2576 generic.go:358] "Generic (PLEG): container finished" podID="70861a45-bf20-471b-abae-c5a854e55276" containerID="26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1" exitCode=0 Apr 16 15:06:50.594141 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:50.593639 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" Apr 16 15:06:50.594141 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:50.593656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" event={"ID":"70861a45-bf20-471b-abae-c5a854e55276","Type":"ContainerDied","Data":"26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1"} Apr 16 15:06:50.594141 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:50.593694 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn" event={"ID":"70861a45-bf20-471b-abae-c5a854e55276","Type":"ContainerDied","Data":"d96dfe3d750d0eb92ea8e979b3ad0783dd748919ce53a02c49f5791a0b030ec6"} Apr 16 15:06:50.594141 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:50.593710 2576 scope.go:117] "RemoveContainer" containerID="26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1" Apr 16 15:06:50.602656 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:50.602639 2576 scope.go:117] "RemoveContainer" 
containerID="26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1" Apr 16 15:06:50.602927 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:06:50.602907 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1\": container with ID starting with 26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1 not found: ID does not exist" containerID="26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1" Apr 16 15:06:50.603000 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:50.602936 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1"} err="failed to get container status \"26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1\": rpc error: code = NotFound desc = could not find container \"26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1\": container with ID starting with 26b254752e721808aaa966a908b1bc0edd4ed7ae9cb9fa8c2f097b4de9203bb1 not found: ID does not exist" Apr 16 15:06:50.613117 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:50.613096 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn"] Apr 16 15:06:50.616970 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:50.616949 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1f99-predictor-6f8c58666-kmthn"] Apr 16 15:06:51.806518 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:51.806442 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70861a45-bf20-471b-abae-c5a854e55276" path="/var/lib/kubelet/pods/70861a45-bf20-471b-abae-c5a854e55276/volumes" Apr 16 15:06:53.357885 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:53.357859 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" Apr 16 15:06:55.562416 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:55.562370 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" podUID="8f2203ad-4be6-4d90-a337-521909c216ac" containerName="switch-graph-d1f99" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:06:59.590845 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:06:59.590797 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" podUID="0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:07:00.562249 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:00.562201 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" podUID="8f2203ad-4be6-4d90-a337-521909c216ac" containerName="switch-graph-d1f99" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:00.562399 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:00.562312 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:07:05.562075 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:05.562031 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" podUID="8f2203ad-4be6-4d90-a337-521909c216ac" containerName="switch-graph-d1f99" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:09.590578 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:09.590535 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" 
podUID="0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:07:10.562647 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:10.562601 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" podUID="8f2203ad-4be6-4d90-a337-521909c216ac" containerName="switch-graph-d1f99" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:15.562651 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:15.562565 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" podUID="8f2203ad-4be6-4d90-a337-521909c216ac" containerName="switch-graph-d1f99" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:16.994827 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:16.994794 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-78ccc649-29b6r"] Apr 16 15:07:16.995173 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:16.995159 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70861a45-bf20-471b-abae-c5a854e55276" containerName="kserve-container" Apr 16 15:07:16.995173 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:16.995169 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="70861a45-bf20-471b-abae-c5a854e55276" containerName="kserve-container" Apr 16 15:07:16.995245 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:16.995224 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="70861a45-bf20-471b-abae-c5a854e55276" containerName="kserve-container" Apr 16 15:07:16.999739 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:16.999723 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" Apr 16 15:07:17.001915 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.001868 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 16 15:07:17.004459 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.004434 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-78ccc649-29b6r"] Apr 16 15:07:17.010681 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.010657 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 16 15:07:17.017262 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.017244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e436b01-46e2-41fc-ac7e-2d4d765f0220-openshift-service-ca-bundle\") pod \"model-chainer-78ccc649-29b6r\" (UID: \"5e436b01-46e2-41fc-ac7e-2d4d765f0220\") " pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" Apr 16 15:07:17.017341 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.017294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e436b01-46e2-41fc-ac7e-2d4d765f0220-proxy-tls\") pod \"model-chainer-78ccc649-29b6r\" (UID: \"5e436b01-46e2-41fc-ac7e-2d4d765f0220\") " pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" Apr 16 15:07:17.117916 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.117884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e436b01-46e2-41fc-ac7e-2d4d765f0220-proxy-tls\") pod \"model-chainer-78ccc649-29b6r\" (UID: \"5e436b01-46e2-41fc-ac7e-2d4d765f0220\") " 
pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" Apr 16 15:07:17.118085 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.117944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e436b01-46e2-41fc-ac7e-2d4d765f0220-openshift-service-ca-bundle\") pod \"model-chainer-78ccc649-29b6r\" (UID: \"5e436b01-46e2-41fc-ac7e-2d4d765f0220\") " pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" Apr 16 15:07:17.118085 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:07:17.118026 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-serving-cert: secret "model-chainer-serving-cert" not found Apr 16 15:07:17.118160 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:07:17.118103 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e436b01-46e2-41fc-ac7e-2d4d765f0220-proxy-tls podName:5e436b01-46e2-41fc-ac7e-2d4d765f0220 nodeName:}" failed. No retries permitted until 2026-04-16 15:07:17.618087238 +0000 UTC m=+888.536448420 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5e436b01-46e2-41fc-ac7e-2d4d765f0220-proxy-tls") pod "model-chainer-78ccc649-29b6r" (UID: "5e436b01-46e2-41fc-ac7e-2d4d765f0220") : secret "model-chainer-serving-cert" not found Apr 16 15:07:17.118544 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.118527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e436b01-46e2-41fc-ac7e-2d4d765f0220-openshift-service-ca-bundle\") pod \"model-chainer-78ccc649-29b6r\" (UID: \"5e436b01-46e2-41fc-ac7e-2d4d765f0220\") " pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" Apr 16 15:07:17.353012 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.352991 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:07:17.420195 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.420164 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f2203ad-4be6-4d90-a337-521909c216ac-proxy-tls\") pod \"8f2203ad-4be6-4d90-a337-521909c216ac\" (UID: \"8f2203ad-4be6-4d90-a337-521909c216ac\") " Apr 16 15:07:17.420195 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.420197 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f2203ad-4be6-4d90-a337-521909c216ac-openshift-service-ca-bundle\") pod \"8f2203ad-4be6-4d90-a337-521909c216ac\" (UID: \"8f2203ad-4be6-4d90-a337-521909c216ac\") " Apr 16 15:07:17.420598 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.420574 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f2203ad-4be6-4d90-a337-521909c216ac-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "8f2203ad-4be6-4d90-a337-521909c216ac" (UID: "8f2203ad-4be6-4d90-a337-521909c216ac"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:07:17.422363 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.422343 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f2203ad-4be6-4d90-a337-521909c216ac-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8f2203ad-4be6-4d90-a337-521909c216ac" (UID: "8f2203ad-4be6-4d90-a337-521909c216ac"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:07:17.521560 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.521528 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f2203ad-4be6-4d90-a337-521909c216ac-proxy-tls\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:07:17.521560 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.521556 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f2203ad-4be6-4d90-a337-521909c216ac-openshift-service-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:07:17.622997 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.622917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e436b01-46e2-41fc-ac7e-2d4d765f0220-proxy-tls\") pod \"model-chainer-78ccc649-29b6r\" (UID: \"5e436b01-46e2-41fc-ac7e-2d4d765f0220\") " pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" Apr 16 15:07:17.625261 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.625240 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e436b01-46e2-41fc-ac7e-2d4d765f0220-proxy-tls\") pod \"model-chainer-78ccc649-29b6r\" (UID: \"5e436b01-46e2-41fc-ac7e-2d4d765f0220\") " pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" Apr 16 15:07:17.688959 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.688929 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f2203ad-4be6-4d90-a337-521909c216ac" containerID="9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7" exitCode=0 Apr 16 15:07:17.689119 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.688978 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" 
event={"ID":"8f2203ad-4be6-4d90-a337-521909c216ac","Type":"ContainerDied","Data":"9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7"} Apr 16 15:07:17.689119 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.688990 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" Apr 16 15:07:17.689119 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.689002 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994" event={"ID":"8f2203ad-4be6-4d90-a337-521909c216ac","Type":"ContainerDied","Data":"87234b58310fcf6e998c74c2f471cc74d93dc57417b279cf133b3693d955ef8b"} Apr 16 15:07:17.689119 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.689018 2576 scope.go:117] "RemoveContainer" containerID="9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7" Apr 16 15:07:17.700402 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.700360 2576 scope.go:117] "RemoveContainer" containerID="9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7" Apr 16 15:07:17.700662 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:07:17.700644 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7\": container with ID starting with 9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7 not found: ID does not exist" containerID="9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7" Apr 16 15:07:17.700720 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.700671 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7"} err="failed to get container status \"9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7\": rpc error: code = NotFound desc = could 
not find container \"9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7\": container with ID starting with 9aa08489184d5a074dda31d227078470c291f5c1aacea6eec5a8c798e00f73a7 not found: ID does not exist" Apr 16 15:07:17.710466 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.710444 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994"] Apr 16 15:07:17.713139 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.713116 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d1f99-5f9847b898-pp994"] Apr 16 15:07:17.806566 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.806523 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f2203ad-4be6-4d90-a337-521909c216ac" path="/var/lib/kubelet/pods/8f2203ad-4be6-4d90-a337-521909c216ac/volumes" Apr 16 15:07:17.911309 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:17.911285 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" Apr 16 15:07:18.031623 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:18.031597 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-78ccc649-29b6r"] Apr 16 15:07:18.033783 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:07:18.033748 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e436b01_46e2_41fc_ac7e_2d4d765f0220.slice/crio-e8243fda25be1477d9e7694a36099d34550472568dc71a596c7c210dfd66597d WatchSource:0}: Error finding container e8243fda25be1477d9e7694a36099d34550472568dc71a596c7c210dfd66597d: Status 404 returned error can't find the container with id e8243fda25be1477d9e7694a36099d34550472568dc71a596c7c210dfd66597d Apr 16 15:07:18.693582 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:18.693552 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" event={"ID":"5e436b01-46e2-41fc-ac7e-2d4d765f0220","Type":"ContainerStarted","Data":"1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c"} Apr 16 15:07:18.693582 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:18.693585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" event={"ID":"5e436b01-46e2-41fc-ac7e-2d4d765f0220","Type":"ContainerStarted","Data":"e8243fda25be1477d9e7694a36099d34550472568dc71a596c7c210dfd66597d"} Apr 16 15:07:18.693848 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:18.693612 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" Apr 16 15:07:18.708572 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:18.708531 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" podStartSLOduration=2.708514756 
podStartE2EDuration="2.708514756s" podCreationTimestamp="2026-04-16 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:07:18.706127412 +0000 UTC m=+889.624488617" watchObservedRunningTime="2026-04-16 15:07:18.708514756 +0000 UTC m=+889.626875987" Apr 16 15:07:19.590880 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:19.590841 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" podUID="0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:07:24.703114 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:24.703084 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" Apr 16 15:07:27.102543 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.102508 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-78ccc649-29b6r"] Apr 16 15:07:27.103025 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.102784 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" podUID="5e436b01-46e2-41fc-ac7e-2d4d765f0220" containerName="model-chainer" containerID="cri-o://1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c" gracePeriod=30 Apr 16 15:07:27.244262 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.244230 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9"] Apr 16 15:07:27.244530 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.244508 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" 
podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="kserve-container" containerID="cri-o://785e0156168173280368d1af38a69dc2be30c405c2cdd199882a5cca4e617b4d" gracePeriod=30 Apr 16 15:07:27.346071 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.346036 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv"] Apr 16 15:07:27.346563 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.346545 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f2203ad-4be6-4d90-a337-521909c216ac" containerName="switch-graph-d1f99" Apr 16 15:07:27.346563 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.346564 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2203ad-4be6-4d90-a337-521909c216ac" containerName="switch-graph-d1f99" Apr 16 15:07:27.346685 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.346619 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f2203ad-4be6-4d90-a337-521909c216ac" containerName="switch-graph-d1f99" Apr 16 15:07:27.349632 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.349617 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" Apr 16 15:07:27.356508 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.356441 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv"] Apr 16 15:07:27.361322 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.361301 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" Apr 16 15:07:27.494574 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.493223 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv"] Apr 16 15:07:27.726530 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.726497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" event={"ID":"30de4458-740a-4852-bfa7-6a75cea3e464","Type":"ContainerStarted","Data":"3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1"} Apr 16 15:07:27.726530 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.726534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" event={"ID":"30de4458-740a-4852-bfa7-6a75cea3e464","Type":"ContainerStarted","Data":"ad7cfda41d422abded963eef87bcf9482b8c79a7d17a22ac3d6b1f1cd6607c41"} Apr 16 15:07:27.726803 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.726660 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" Apr 16 15:07:27.727945 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.727911 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" podUID="30de4458-740a-4852-bfa7-6a75cea3e464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 15:07:27.740286 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:27.740239 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" podStartSLOduration=0.740226592 podStartE2EDuration="740.226592ms" podCreationTimestamp="2026-04-16 15:07:27 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:07:27.738561846 +0000 UTC m=+898.656923051" watchObservedRunningTime="2026-04-16 15:07:27.740226592 +0000 UTC m=+898.658587776" Apr 16 15:07:28.730372 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:28.730339 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" podUID="30de4458-740a-4852-bfa7-6a75cea3e464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 15:07:29.590184 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:29.590139 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" podUID="0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:07:29.701910 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:29.701872 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" podUID="5e436b01-46e2-41fc-ac7e-2d4d765f0220" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:29.703705 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:29.703681 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log" Apr 16 15:07:29.705538 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:29.705519 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log" Apr 16 15:07:31.747024 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:31.746992 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerID="785e0156168173280368d1af38a69dc2be30c405c2cdd199882a5cca4e617b4d" exitCode=0
Apr 16 15:07:31.747356 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:31.747036 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" event={"ID":"dbf610c8-a8fa-418e-ba35-19428a90c024","Type":"ContainerDied","Data":"785e0156168173280368d1af38a69dc2be30c405c2cdd199882a5cca4e617b4d"}
Apr 16 15:07:31.790040 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:31.790020 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9"
Apr 16 15:07:31.841544 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:31.841477 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbf610c8-a8fa-418e-ba35-19428a90c024-kserve-provision-location\") pod \"dbf610c8-a8fa-418e-ba35-19428a90c024\" (UID: \"dbf610c8-a8fa-418e-ba35-19428a90c024\") "
Apr 16 15:07:31.841831 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:31.841804 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf610c8-a8fa-418e-ba35-19428a90c024-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dbf610c8-a8fa-418e-ba35-19428a90c024" (UID: "dbf610c8-a8fa-418e-ba35-19428a90c024"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:07:31.942846 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:31.942816 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbf610c8-a8fa-418e-ba35-19428a90c024-kserve-provision-location\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:07:32.752146 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:32.752111 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9" event={"ID":"dbf610c8-a8fa-418e-ba35-19428a90c024","Type":"ContainerDied","Data":"8acdeca2a4106fe824cbf157092b27ddbd0fe71a0951bf74e2217bea9cf72364"}
Apr 16 15:07:32.752146 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:32.752136 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9"
Apr 16 15:07:32.752595 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:32.752169 2576 scope.go:117] "RemoveContainer" containerID="785e0156168173280368d1af38a69dc2be30c405c2cdd199882a5cca4e617b4d"
Apr 16 15:07:32.761377 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:32.761360 2576 scope.go:117] "RemoveContainer" containerID="cc836098ec1df77cfc1a82b8e5ce3a46e1c81ea7397fb1223392066dfc476473"
Apr 16 15:07:32.773113 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:32.773089 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9"]
Apr 16 15:07:32.777722 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:32.777699 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-695857997-skcx9"]
Apr 16 15:07:33.806300 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:33.806264 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" path="/var/lib/kubelet/pods/dbf610c8-a8fa-418e-ba35-19428a90c024/volumes"
Apr 16 15:07:34.701191 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:34.701152 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" podUID="5e436b01-46e2-41fc-ac7e-2d4d765f0220" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:07:38.731059 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:38.731016 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" podUID="30de4458-740a-4852-bfa7-6a75cea3e464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 15:07:39.591280 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:39.591244 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7"
Apr 16 15:07:39.701237 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:39.701191 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" podUID="5e436b01-46e2-41fc-ac7e-2d4d765f0220" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:07:39.701379 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:39.701298 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r"
Apr 16 15:07:41.697210 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:07:41.697177 2576 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/e9a90a0c3ff3679a79962a4e4a5fff3747163d297d400dc56db70576971fa0b7/diff" to get inode usage: stat /var/lib/containers/storage/overlay/e9a90a0c3ff3679a79962a4e4a5fff3747163d297d400dc56db70576971fa0b7/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-graph-1-predictor-695857997-skcx9_dbf610c8-a8fa-418e-ba35-19428a90c024/kserve-container/0.log" to get inode usage: stat /var/log/pods/kserve-ci-e2e-test_isvc-sklearn-graph-1-predictor-695857997-skcx9_dbf610c8-a8fa-418e-ba35-19428a90c024/kserve-container/0.log: no such file or directory
Apr 16 15:07:44.701831 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:44.701757 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" podUID="5e436b01-46e2-41fc-ac7e-2d4d765f0220" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:07:48.731157 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:48.731112 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" podUID="30de4458-740a-4852-bfa7-6a75cea3e464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 15:07:49.701825 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:49.701789 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" podUID="5e436b01-46e2-41fc-ac7e-2d4d765f0220" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:07:54.701756 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:54.701718 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" podUID="5e436b01-46e2-41fc-ac7e-2d4d765f0220" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:07:57.125075 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:07:57.125041 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e436b01_46e2_41fc_ac7e_2d4d765f0220.slice/crio-conmon-1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 15:07:57.125440 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:07:57.125196 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e436b01_46e2_41fc_ac7e_2d4d765f0220.slice/crio-conmon-1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 15:07:57.125440 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:07:57.125207 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf610c8_a8fa_418e_ba35_19428a90c024.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e436b01_46e2_41fc_ac7e_2d4d765f0220.slice/crio-1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf610c8_a8fa_418e_ba35_19428a90c024.slice/crio-8acdeca2a4106fe824cbf157092b27ddbd0fe71a0951bf74e2217bea9cf72364\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf610c8_a8fa_418e_ba35_19428a90c024.slice/crio-conmon-785e0156168173280368d1af38a69dc2be30c405c2cdd199882a5cca4e617b4d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf610c8_a8fa_418e_ba35_19428a90c024.slice/crio-785e0156168173280368d1af38a69dc2be30c405c2cdd199882a5cca4e617b4d.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 15:07:57.125440 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:07:57.125292 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf610c8_a8fa_418e_ba35_19428a90c024.slice/crio-conmon-785e0156168173280368d1af38a69dc2be30c405c2cdd199882a5cca4e617b4d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf610c8_a8fa_418e_ba35_19428a90c024.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf610c8_a8fa_418e_ba35_19428a90c024.slice/crio-8acdeca2a4106fe824cbf157092b27ddbd0fe71a0951bf74e2217bea9cf72364\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e436b01_46e2_41fc_ac7e_2d4d765f0220.slice/crio-conmon-1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf610c8_a8fa_418e_ba35_19428a90c024.slice/crio-785e0156168173280368d1af38a69dc2be30c405c2cdd199882a5cca4e617b4d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e436b01_46e2_41fc_ac7e_2d4d765f0220.slice/crio-1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 15:07:57.444858 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.444824 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"]
Apr 16 15:07:57.445255 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.445239 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="kserve-container"
Apr 16 15:07:57.445307 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.445259 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="kserve-container"
Apr 16 15:07:57.445307 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.445275 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="storage-initializer"
Apr 16 15:07:57.445307 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.445281 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="storage-initializer"
Apr 16 15:07:57.445396 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.445348 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dbf610c8-a8fa-418e-ba35-19428a90c024" containerName="kserve-container"
Apr 16 15:07:57.448233 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.448219 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"
Apr 16 15:07:57.450051 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.450026 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-c9f8b-kube-rbac-proxy-sar-config\""
Apr 16 15:07:57.450174 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.450076 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-c9f8b-serving-cert\""
Apr 16 15:07:57.453749 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.453726 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"]
Apr 16 15:07:57.562980 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.562939 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53-proxy-tls\") pod \"switch-graph-c9f8b-5c958b9899-79ld4\" (UID: \"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53\") " pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"
Apr 16 15:07:57.563168 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.563054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53-openshift-service-ca-bundle\") pod \"switch-graph-c9f8b-5c958b9899-79ld4\" (UID: \"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53\") " pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"
Apr 16 15:07:57.663511 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.663475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53-proxy-tls\") pod \"switch-graph-c9f8b-5c958b9899-79ld4\" (UID: \"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53\") " pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"
Apr 16 15:07:57.663679 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.663565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53-openshift-service-ca-bundle\") pod \"switch-graph-c9f8b-5c958b9899-79ld4\" (UID: \"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53\") " pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"
Apr 16 15:07:57.664302 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.664280 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53-openshift-service-ca-bundle\") pod \"switch-graph-c9f8b-5c958b9899-79ld4\" (UID: \"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53\") " pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"
Apr 16 15:07:57.666054 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.666028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53-proxy-tls\") pod \"switch-graph-c9f8b-5c958b9899-79ld4\" (UID: \"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53\") " pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"
Apr 16 15:07:57.745798 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.745755 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r"
Apr 16 15:07:57.759223 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.759186 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"
Apr 16 15:07:57.837997 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.837901 2576 generic.go:358] "Generic (PLEG): container finished" podID="5e436b01-46e2-41fc-ac7e-2d4d765f0220" containerID="1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c" exitCode=0
Apr 16 15:07:57.837997 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.837967 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r"
Apr 16 15:07:57.837997 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.837985 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" event={"ID":"5e436b01-46e2-41fc-ac7e-2d4d765f0220","Type":"ContainerDied","Data":"1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c"}
Apr 16 15:07:57.838199 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.838024 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-78ccc649-29b6r" event={"ID":"5e436b01-46e2-41fc-ac7e-2d4d765f0220","Type":"ContainerDied","Data":"e8243fda25be1477d9e7694a36099d34550472568dc71a596c7c210dfd66597d"}
Apr 16 15:07:57.838199 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.838039 2576 scope.go:117] "RemoveContainer" containerID="1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c"
Apr 16 15:07:57.848795 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.848753 2576 scope.go:117] "RemoveContainer" containerID="1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c"
Apr 16 15:07:57.849114 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:07:57.849092 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c\": container with ID starting with 1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c not found: ID does not exist" containerID="1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c"
Apr 16 15:07:57.849200 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.849121 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c"} err="failed to get container status \"1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c\": rpc error: code = NotFound desc = could not find container \"1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c\": container with ID starting with 1ee74ac6b52352cfacb24c547e6f2ebac52b804be7ae33bd3c9b8eb8c9d2b09c not found: ID does not exist"
Apr 16 15:07:57.865484 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.865458 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e436b01-46e2-41fc-ac7e-2d4d765f0220-openshift-service-ca-bundle\") pod \"5e436b01-46e2-41fc-ac7e-2d4d765f0220\" (UID: \"5e436b01-46e2-41fc-ac7e-2d4d765f0220\") "
Apr 16 15:07:57.865589 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.865569 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e436b01-46e2-41fc-ac7e-2d4d765f0220-proxy-tls\") pod \"5e436b01-46e2-41fc-ac7e-2d4d765f0220\" (UID: \"5e436b01-46e2-41fc-ac7e-2d4d765f0220\") "
Apr 16 15:07:57.865864 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.865841 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e436b01-46e2-41fc-ac7e-2d4d765f0220-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "5e436b01-46e2-41fc-ac7e-2d4d765f0220" (UID: "5e436b01-46e2-41fc-ac7e-2d4d765f0220"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:07:57.867462 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.867438 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e436b01-46e2-41fc-ac7e-2d4d765f0220-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5e436b01-46e2-41fc-ac7e-2d4d765f0220" (UID: "5e436b01-46e2-41fc-ac7e-2d4d765f0220"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:07:57.886578 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.886493 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"]
Apr 16 15:07:57.889054 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:07:57.889029 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6eed1c0_c2e6_44d1_a359_2a5495a3fb53.slice/crio-74a9de0165e6a637bbaa7f6eb09c532f490967bfb3febc8c421b79c4e3543bb3 WatchSource:0}: Error finding container 74a9de0165e6a637bbaa7f6eb09c532f490967bfb3febc8c421b79c4e3543bb3: Status 404 returned error can't find the container with id 74a9de0165e6a637bbaa7f6eb09c532f490967bfb3febc8c421b79c4e3543bb3
Apr 16 15:07:57.966756 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.966726 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e436b01-46e2-41fc-ac7e-2d4d765f0220-proxy-tls\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:07:57.966756 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:57.966757 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e436b01-46e2-41fc-ac7e-2d4d765f0220-openshift-service-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:07:58.157422 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:58.157393 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-78ccc649-29b6r"]
Apr 16 15:07:58.159194 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:58.159174 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-78ccc649-29b6r"]
Apr 16 15:07:58.731174 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:58.731134 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" podUID="30de4458-740a-4852-bfa7-6a75cea3e464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 15:07:58.843651 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:58.843618 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" event={"ID":"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53","Type":"ContainerStarted","Data":"f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818"}
Apr 16 15:07:58.843651 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:58.843651 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" event={"ID":"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53","Type":"ContainerStarted","Data":"74a9de0165e6a637bbaa7f6eb09c532f490967bfb3febc8c421b79c4e3543bb3"}
Apr 16 15:07:58.843860 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:58.843741 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"
Apr 16 15:07:58.857687 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:58.857648 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" podStartSLOduration=1.8576361750000001 podStartE2EDuration="1.857636175s" podCreationTimestamp="2026-04-16 15:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:07:58.855501638 +0000 UTC m=+929.773862842" watchObservedRunningTime="2026-04-16 15:07:58.857636175 +0000 UTC m=+929.775997379"
Apr 16 15:07:59.806959 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:07:59.806930 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e436b01-46e2-41fc-ac7e-2d4d765f0220" path="/var/lib/kubelet/pods/5e436b01-46e2-41fc-ac7e-2d4d765f0220/volumes"
Apr 16 15:08:04.853259 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:04.853228 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"
Apr 16 15:08:08.730826 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:08.730748 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" podUID="30de4458-740a-4852-bfa7-6a75cea3e464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 15:08:18.731956 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:18.731917 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv"
Apr 16 15:08:37.337454 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.337419 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"]
Apr 16 15:08:37.339781 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.337784 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e436b01-46e2-41fc-ac7e-2d4d765f0220" containerName="model-chainer"
Apr 16 15:08:37.339781 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.337799 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e436b01-46e2-41fc-ac7e-2d4d765f0220" containerName="model-chainer"
Apr 16 15:08:37.339781 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.337870 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e436b01-46e2-41fc-ac7e-2d4d765f0220" containerName="model-chainer"
Apr 16 15:08:37.340708 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.340693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"
Apr 16 15:08:37.342636 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.342605 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-2d44a-serving-cert\""
Apr 16 15:08:37.342636 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.342632 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-2d44a-kube-rbac-proxy-sar-config\""
Apr 16 15:08:37.347165 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.347141 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"]
Apr 16 15:08:37.491992 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.491959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43d29738-353c-4a39-ac31-5df0a1df037d-proxy-tls\") pod \"sequence-graph-2d44a-d4fd86597-7j4s5\" (UID: \"43d29738-353c-4a39-ac31-5df0a1df037d\") " pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"
Apr 16 15:08:37.491992 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.491994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43d29738-353c-4a39-ac31-5df0a1df037d-openshift-service-ca-bundle\") pod \"sequence-graph-2d44a-d4fd86597-7j4s5\" (UID: \"43d29738-353c-4a39-ac31-5df0a1df037d\") " pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"
Apr 16 15:08:37.593432 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.593344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43d29738-353c-4a39-ac31-5df0a1df037d-proxy-tls\") pod \"sequence-graph-2d44a-d4fd86597-7j4s5\" (UID: \"43d29738-353c-4a39-ac31-5df0a1df037d\") " pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"
Apr 16 15:08:37.593432 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.593386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43d29738-353c-4a39-ac31-5df0a1df037d-openshift-service-ca-bundle\") pod \"sequence-graph-2d44a-d4fd86597-7j4s5\" (UID: \"43d29738-353c-4a39-ac31-5df0a1df037d\") " pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"
Apr 16 15:08:37.593615 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:08:37.593495 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-2d44a-serving-cert: secret "sequence-graph-2d44a-serving-cert" not found
Apr 16 15:08:37.593615 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:08:37.593573 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43d29738-353c-4a39-ac31-5df0a1df037d-proxy-tls podName:43d29738-353c-4a39-ac31-5df0a1df037d nodeName:}" failed. No retries permitted until 2026-04-16 15:08:38.093556414 +0000 UTC m=+969.011917596 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/43d29738-353c-4a39-ac31-5df0a1df037d-proxy-tls") pod "sequence-graph-2d44a-d4fd86597-7j4s5" (UID: "43d29738-353c-4a39-ac31-5df0a1df037d") : secret "sequence-graph-2d44a-serving-cert" not found
Apr 16 15:08:37.594033 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:37.594016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43d29738-353c-4a39-ac31-5df0a1df037d-openshift-service-ca-bundle\") pod \"sequence-graph-2d44a-d4fd86597-7j4s5\" (UID: \"43d29738-353c-4a39-ac31-5df0a1df037d\") " pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"
Apr 16 15:08:38.098945 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:38.098901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43d29738-353c-4a39-ac31-5df0a1df037d-proxy-tls\") pod \"sequence-graph-2d44a-d4fd86597-7j4s5\" (UID: \"43d29738-353c-4a39-ac31-5df0a1df037d\") " pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"
Apr 16 15:08:38.101287 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:38.101263 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43d29738-353c-4a39-ac31-5df0a1df037d-proxy-tls\") pod \"sequence-graph-2d44a-d4fd86597-7j4s5\" (UID: \"43d29738-353c-4a39-ac31-5df0a1df037d\") " pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"
Apr 16 15:08:38.252237 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:38.252199 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"
Apr 16 15:08:38.374683 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:38.374608 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"]
Apr 16 15:08:38.377787 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:08:38.377733 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43d29738_353c_4a39_ac31_5df0a1df037d.slice/crio-bf64e63e324a5dd6b9b8f99c0dbc87f40af13b57efe4e91053d1a28babac7840 WatchSource:0}: Error finding container bf64e63e324a5dd6b9b8f99c0dbc87f40af13b57efe4e91053d1a28babac7840: Status 404 returned error can't find the container with id bf64e63e324a5dd6b9b8f99c0dbc87f40af13b57efe4e91053d1a28babac7840
Apr 16 15:08:38.984264 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:38.984225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" event={"ID":"43d29738-353c-4a39-ac31-5df0a1df037d","Type":"ContainerStarted","Data":"cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a"}
Apr 16 15:08:38.984264 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:38.984263 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" event={"ID":"43d29738-353c-4a39-ac31-5df0a1df037d","Type":"ContainerStarted","Data":"bf64e63e324a5dd6b9b8f99c0dbc87f40af13b57efe4e91053d1a28babac7840"}
Apr 16 15:08:38.984499 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:38.984356 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"
Apr 16 15:08:38.998387 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:38.998334 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" podStartSLOduration=1.9983174240000001 podStartE2EDuration="1.998317424s" podCreationTimestamp="2026-04-16 15:08:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:08:38.997524889 +0000 UTC m=+969.915886096" watchObservedRunningTime="2026-04-16 15:08:38.998317424 +0000 UTC m=+969.916678628"
Apr 16 15:08:44.992973 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:08:44.992944 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"
Apr 16 15:12:29.735067 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:12:29.735040 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 15:12:29.737564 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:12:29.737545 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 15:16:12.151124 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.151090 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"]
Apr 16 15:16:12.151601 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.151336 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" podUID="d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" containerName="switch-graph-c9f8b" containerID="cri-o://f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818" gracePeriod=30
Apr 16 15:16:12.280022 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.279985 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7"]
Apr 16 15:16:12.280237 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.280216 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" podUID="0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a" containerName="kserve-container" containerID="cri-o://82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919" gracePeriod=30
Apr 16 15:16:12.357291 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.357260 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x"]
Apr 16 15:16:12.360605 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.360589 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x"
Apr 16 15:16:12.366393 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.366359 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x"]
Apr 16 15:16:12.370892 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.370873 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x"
Apr 16 15:16:12.495694 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.495625 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x"]
Apr 16 15:16:12.499076 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:16:12.499043 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1e28c84_038a_4b26_bbf7_d6551e412327.slice/crio-55ddd6afcdcd12c9c4f5841d05bb752770b5194bcb5ff27a984b506191e99b4b WatchSource:0}: Error finding container 55ddd6afcdcd12c9c4f5841d05bb752770b5194bcb5ff27a984b506191e99b4b: Status 404 returned error can't find the container with id 55ddd6afcdcd12c9c4f5841d05bb752770b5194bcb5ff27a984b506191e99b4b
Apr 16 15:16:12.500869 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.500847 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:16:12.565008 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.564984 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" event={"ID":"b1e28c84-038a-4b26-bbf7-d6551e412327","Type":"ContainerStarted","Data":"ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0"}
Apr 16 15:16:12.565109 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.565017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" event={"ID":"b1e28c84-038a-4b26-bbf7-d6551e412327","Type":"ContainerStarted","Data":"55ddd6afcdcd12c9c4f5841d05bb752770b5194bcb5ff27a984b506191e99b4b"}
Apr 16 15:16:12.565160 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.565127 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x"
Apr 16 15:16:12.566215 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.566193 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" podUID="b1e28c84-038a-4b26-bbf7-d6551e412327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 16 15:16:12.578305 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:12.578267 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" podStartSLOduration=0.578255776 podStartE2EDuration="578.255776ms" podCreationTimestamp="2026-04-16 15:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:16:12.578221298 +0000 UTC m=+1423.496582499" watchObservedRunningTime="2026-04-16 15:16:12.578255776 +0000 UTC m=+1423.496616981"
Apr 16 15:16:13.568261 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:13.568220 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" podUID="b1e28c84-038a-4b26-bbf7-d6551e412327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 16 15:16:14.851549 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:14.851501 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" podUID="d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" containerName="switch-graph-c9f8b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:16:15.331152 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:15.331132 2576 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" Apr 16 15:16:15.576882 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:15.576794 2576 generic.go:358] "Generic (PLEG): container finished" podID="0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a" containerID="82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919" exitCode=0 Apr 16 15:16:15.576882 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:15.576856 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" Apr 16 15:16:15.576882 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:15.576861 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" event={"ID":"0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a","Type":"ContainerDied","Data":"82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919"} Apr 16 15:16:15.577098 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:15.576900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7" event={"ID":"0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a","Type":"ContainerDied","Data":"d650ba4bf36e32cd6ad42fac579f2da83d83f51b46c9af59471c668c12b72523"} Apr 16 15:16:15.577098 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:15.576920 2576 scope.go:117] "RemoveContainer" containerID="82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919" Apr 16 15:16:15.585556 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:15.585527 2576 scope.go:117] "RemoveContainer" containerID="82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919" Apr 16 15:16:15.585825 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:16:15.585805 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919\": container with ID starting with 82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919 not found: ID does not exist" containerID="82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919" Apr 16 15:16:15.585897 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:15.585834 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919"} err="failed to get container status \"82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919\": rpc error: code = NotFound desc = could not find container \"82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919\": container with ID starting with 82298c9f0847228190fe92b3606bfb727418c2901b6ca1b28c3c3deba2f3f919 not found: ID does not exist" Apr 16 15:16:15.595762 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:15.595740 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7"] Apr 16 15:16:15.598883 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:15.598865 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c9f8b-predictor-667b95c6b8-wknl7"] Apr 16 15:16:15.806840 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:15.806809 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a" path="/var/lib/kubelet/pods/0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a/volumes" Apr 16 15:16:19.851248 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:19.851211 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" podUID="d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" containerName="switch-graph-c9f8b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:16:23.568513 ip-10-0-130-140 kubenswrapper[2576]: I0416 
15:16:23.568474 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" podUID="b1e28c84-038a-4b26-bbf7-d6551e412327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 15:16:24.851398 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:24.851355 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" podUID="d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" containerName="switch-graph-c9f8b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:16:24.851844 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:24.851486 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" Apr 16 15:16:29.852106 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:29.852062 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" podUID="d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" containerName="switch-graph-c9f8b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:16:33.569166 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:33.569128 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" podUID="b1e28c84-038a-4b26-bbf7-d6551e412327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 15:16:34.851576 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:34.851533 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" podUID="d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" containerName="switch-graph-c9f8b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:16:39.851717 
ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:39.851680 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" podUID="d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" containerName="switch-graph-c9f8b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:16:42.301882 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.301854 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" Apr 16 15:16:42.445394 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.445300 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53-openshift-service-ca-bundle\") pod \"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53\" (UID: \"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53\") " Apr 16 15:16:42.445394 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.445355 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53-proxy-tls\") pod \"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53\" (UID: \"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53\") " Apr 16 15:16:42.445692 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.445665 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" (UID: "d6eed1c0-c2e6-44d1-a359-2a5495a3fb53"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:16:42.447448 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.447428 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" (UID: "d6eed1c0-c2e6-44d1-a359-2a5495a3fb53"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:42.546623 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.546579 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53-openshift-service-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:16:42.546623 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.546614 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53-proxy-tls\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:16:42.672721 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.672685 2576 generic.go:358] "Generic (PLEG): container finished" podID="d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" containerID="f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818" exitCode=0 Apr 16 15:16:42.672934 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.672757 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" Apr 16 15:16:42.672934 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.672783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" event={"ID":"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53","Type":"ContainerDied","Data":"f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818"} Apr 16 15:16:42.672934 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.672817 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4" event={"ID":"d6eed1c0-c2e6-44d1-a359-2a5495a3fb53","Type":"ContainerDied","Data":"74a9de0165e6a637bbaa7f6eb09c532f490967bfb3febc8c421b79c4e3543bb3"} Apr 16 15:16:42.672934 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.672833 2576 scope.go:117] "RemoveContainer" containerID="f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818" Apr 16 15:16:42.681671 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.681651 2576 scope.go:117] "RemoveContainer" containerID="f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818" Apr 16 15:16:42.681989 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:16:42.681966 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818\": container with ID starting with f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818 not found: ID does not exist" containerID="f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818" Apr 16 15:16:42.682062 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.681999 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818"} err="failed to get container status 
\"f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818\": rpc error: code = NotFound desc = could not find container \"f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818\": container with ID starting with f83d6669ce966bacf56c9c64a3c6f1f5cb56db827237bfff65ad6ec974270818 not found: ID does not exist" Apr 16 15:16:42.692651 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.692624 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"] Apr 16 15:16:42.695924 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:42.695860 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-c9f8b-5c958b9899-79ld4"] Apr 16 15:16:43.569282 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:43.569237 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" podUID="b1e28c84-038a-4b26-bbf7-d6551e412327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 15:16:43.806174 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:43.806139 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" path="/var/lib/kubelet/pods/d6eed1c0-c2e6-44d1-a359-2a5495a3fb53/volumes" Apr 16 15:16:52.039536 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.039499 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"] Apr 16 15:16:52.039910 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.039736 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" podUID="43d29738-353c-4a39-ac31-5df0a1df037d" containerName="sequence-graph-2d44a" containerID="cri-o://cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a" gracePeriod=30 Apr 16 15:16:52.214278 
ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.214243 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv"] Apr 16 15:16:52.214510 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.214488 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" podUID="30de4458-740a-4852-bfa7-6a75cea3e464" containerName="kserve-container" containerID="cri-o://3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1" gracePeriod=30 Apr 16 15:16:52.233778 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.233732 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5"] Apr 16 15:16:52.234141 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.234128 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a" containerName="kserve-container" Apr 16 15:16:52.234183 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.234143 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a" containerName="kserve-container" Apr 16 15:16:52.234183 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.234165 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" containerName="switch-graph-c9f8b" Apr 16 15:16:52.234183 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.234171 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" containerName="switch-graph-c9f8b" Apr 16 15:16:52.234276 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.234226 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6eed1c0-c2e6-44d1-a359-2a5495a3fb53" containerName="switch-graph-c9f8b" Apr 16 15:16:52.234276 ip-10-0-130-140 kubenswrapper[2576]: 
I0416 15:16:52.234237 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ad5a8c3-7ac8-4abc-9aaf-93066d37d18a" containerName="kserve-container" Apr 16 15:16:52.238466 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.238449 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" Apr 16 15:16:52.242928 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.242905 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5"] Apr 16 15:16:52.248654 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.248637 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" Apr 16 15:16:52.371353 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.371230 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5"] Apr 16 15:16:52.374255 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:16:52.374224 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7e59109_5394_4ef9_bf06_0bfc70ce9b63.slice/crio-6465fecf0103aeab7327a4b76ec9ebf4caada093eb991af129fcf237ac7bcef8 WatchSource:0}: Error finding container 6465fecf0103aeab7327a4b76ec9ebf4caada093eb991af129fcf237ac7bcef8: Status 404 returned error can't find the container with id 6465fecf0103aeab7327a4b76ec9ebf4caada093eb991af129fcf237ac7bcef8 Apr 16 15:16:52.712993 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.712948 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" event={"ID":"a7e59109-5394-4ef9-bf06-0bfc70ce9b63","Type":"ContainerStarted","Data":"6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43"} Apr 16 15:16:52.713167 
ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.713000 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" Apr 16 15:16:52.713167 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.713015 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" event={"ID":"a7e59109-5394-4ef9-bf06-0bfc70ce9b63","Type":"ContainerStarted","Data":"6465fecf0103aeab7327a4b76ec9ebf4caada093eb991af129fcf237ac7bcef8"} Apr 16 15:16:52.714569 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.714541 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" podUID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 15:16:52.725351 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:52.725302 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" podStartSLOduration=0.72528472 podStartE2EDuration="725.28472ms" podCreationTimestamp="2026-04-16 15:16:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:16:52.724679453 +0000 UTC m=+1463.643040658" watchObservedRunningTime="2026-04-16 15:16:52.72528472 +0000 UTC m=+1463.643645926" Apr 16 15:16:53.569251 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:53.569205 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" podUID="b1e28c84-038a-4b26-bbf7-d6551e412327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 15:16:53.716653 ip-10-0-130-140 kubenswrapper[2576]: 
I0416 15:16:53.716612 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" podUID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 15:16:54.991298 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:54.991263 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" podUID="43d29738-353c-4a39-ac31-5df0a1df037d" containerName="sequence-graph-2d44a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:16:55.358908 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:55.358885 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" Apr 16 15:16:55.723687 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:55.723654 2576 generic.go:358] "Generic (PLEG): container finished" podID="30de4458-740a-4852-bfa7-6a75cea3e464" containerID="3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1" exitCode=0 Apr 16 15:16:55.723899 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:55.723693 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" event={"ID":"30de4458-740a-4852-bfa7-6a75cea3e464","Type":"ContainerDied","Data":"3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1"} Apr 16 15:16:55.723899 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:55.723714 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" event={"ID":"30de4458-740a-4852-bfa7-6a75cea3e464","Type":"ContainerDied","Data":"ad7cfda41d422abded963eef87bcf9482b8c79a7d17a22ac3d6b1f1cd6607c41"} Apr 16 15:16:55.723899 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:55.723712 2576 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv" Apr 16 15:16:55.723899 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:55.723724 2576 scope.go:117] "RemoveContainer" containerID="3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1" Apr 16 15:16:55.734010 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:55.733995 2576 scope.go:117] "RemoveContainer" containerID="3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1" Apr 16 15:16:55.734266 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:16:55.734250 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1\": container with ID starting with 3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1 not found: ID does not exist" containerID="3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1" Apr 16 15:16:55.734304 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:55.734274 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1"} err="failed to get container status \"3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1\": rpc error: code = NotFound desc = could not find container \"3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1\": container with ID starting with 3f7fe703f51e019674867e038a05b5d7e0864137a2af6e6ecbe20c0ae18b3ea1 not found: ID does not exist" Apr 16 15:16:55.743664 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:55.743643 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv"] Apr 16 15:16:55.748638 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:55.748617 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-2d44a-predictor-6bd887658-bj6zv"] Apr 16 15:16:55.805992 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:55.805962 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30de4458-740a-4852-bfa7-6a75cea3e464" path="/var/lib/kubelet/pods/30de4458-740a-4852-bfa7-6a75cea3e464/volumes" Apr 16 15:16:59.991039 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:16:59.991002 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" podUID="43d29738-353c-4a39-ac31-5df0a1df037d" containerName="sequence-graph-2d44a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:17:03.570261 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:03.570228 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" Apr 16 15:17:03.716825 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:03.716755 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" podUID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 15:17:04.991968 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:04.991927 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" podUID="43d29738-353c-4a39-ac31-5df0a1df037d" containerName="sequence-graph-2d44a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:17:04.992426 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:04.992055 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" Apr 16 15:17:09.991611 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:09.991571 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" podUID="43d29738-353c-4a39-ac31-5df0a1df037d" containerName="sequence-graph-2d44a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:17:12.365130 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.365094 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx"] Apr 16 15:17:12.365643 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.365624 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30de4458-740a-4852-bfa7-6a75cea3e464" containerName="kserve-container" Apr 16 15:17:12.365728 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.365647 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="30de4458-740a-4852-bfa7-6a75cea3e464" containerName="kserve-container" Apr 16 15:17:12.365807 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.365742 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="30de4458-740a-4852-bfa7-6a75cea3e464" containerName="kserve-container" Apr 16 15:17:12.370003 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.369981 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:12.371908 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.371882 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-36a90-serving-cert\"" Apr 16 15:17:12.372003 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.371920 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-36a90-kube-rbac-proxy-sar-config\"" Apr 16 15:17:12.374869 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.374848 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx"] Apr 16 15:17:12.454424 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.454392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a37917e7-eccc-497b-a65d-0c59e8801670-proxy-tls\") pod \"ensemble-graph-36a90-744c45996f-jphzx\" (UID: \"a37917e7-eccc-497b-a65d-0c59e8801670\") " pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:12.454596 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.454440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37917e7-eccc-497b-a65d-0c59e8801670-openshift-service-ca-bundle\") pod \"ensemble-graph-36a90-744c45996f-jphzx\" (UID: \"a37917e7-eccc-497b-a65d-0c59e8801670\") " pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:12.554888 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.554854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a37917e7-eccc-497b-a65d-0c59e8801670-proxy-tls\") pod \"ensemble-graph-36a90-744c45996f-jphzx\" (UID: 
\"a37917e7-eccc-497b-a65d-0c59e8801670\") " pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:12.555033 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.554913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37917e7-eccc-497b-a65d-0c59e8801670-openshift-service-ca-bundle\") pod \"ensemble-graph-36a90-744c45996f-jphzx\" (UID: \"a37917e7-eccc-497b-a65d-0c59e8801670\") " pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:12.555033 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:17:12.555006 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-36a90-serving-cert: secret "ensemble-graph-36a90-serving-cert" not found Apr 16 15:17:12.555116 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:17:12.555081 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a37917e7-eccc-497b-a65d-0c59e8801670-proxy-tls podName:a37917e7-eccc-497b-a65d-0c59e8801670 nodeName:}" failed. No retries permitted until 2026-04-16 15:17:13.055063899 +0000 UTC m=+1483.973425086 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a37917e7-eccc-497b-a65d-0c59e8801670-proxy-tls") pod "ensemble-graph-36a90-744c45996f-jphzx" (UID: "a37917e7-eccc-497b-a65d-0c59e8801670") : secret "ensemble-graph-36a90-serving-cert" not found Apr 16 15:17:12.555576 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:12.555554 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37917e7-eccc-497b-a65d-0c59e8801670-openshift-service-ca-bundle\") pod \"ensemble-graph-36a90-744c45996f-jphzx\" (UID: \"a37917e7-eccc-497b-a65d-0c59e8801670\") " pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:13.060440 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:13.060403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a37917e7-eccc-497b-a65d-0c59e8801670-proxy-tls\") pod \"ensemble-graph-36a90-744c45996f-jphzx\" (UID: \"a37917e7-eccc-497b-a65d-0c59e8801670\") " pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:13.062728 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:13.062708 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a37917e7-eccc-497b-a65d-0c59e8801670-proxy-tls\") pod \"ensemble-graph-36a90-744c45996f-jphzx\" (UID: \"a37917e7-eccc-497b-a65d-0c59e8801670\") " pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:13.281554 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:13.281510 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:13.405475 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:13.405438 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx"] Apr 16 15:17:13.408363 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:17:13.408337 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda37917e7_eccc_497b_a65d_0c59e8801670.slice/crio-2c0edc7a7cf009a800d4673622ea65cb9e1f37ac8d23c40e472e255dc70f0ec0 WatchSource:0}: Error finding container 2c0edc7a7cf009a800d4673622ea65cb9e1f37ac8d23c40e472e255dc70f0ec0: Status 404 returned error can't find the container with id 2c0edc7a7cf009a800d4673622ea65cb9e1f37ac8d23c40e472e255dc70f0ec0 Apr 16 15:17:13.717181 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:13.717135 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" podUID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 15:17:13.789682 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:13.789643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" event={"ID":"a37917e7-eccc-497b-a65d-0c59e8801670","Type":"ContainerStarted","Data":"6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93"} Apr 16 15:17:13.789872 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:13.789690 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" event={"ID":"a37917e7-eccc-497b-a65d-0c59e8801670","Type":"ContainerStarted","Data":"2c0edc7a7cf009a800d4673622ea65cb9e1f37ac8d23c40e472e255dc70f0ec0"} Apr 16 15:17:13.789872 ip-10-0-130-140 kubenswrapper[2576]: I0416 
15:17:13.789721 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:13.803600 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:13.803555 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" podStartSLOduration=1.803539589 podStartE2EDuration="1.803539589s" podCreationTimestamp="2026-04-16 15:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:17:13.801806394 +0000 UTC m=+1484.720167596" watchObservedRunningTime="2026-04-16 15:17:13.803539589 +0000 UTC m=+1484.721900821" Apr 16 15:17:14.991296 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:14.991253 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" podUID="43d29738-353c-4a39-ac31-5df0a1df037d" containerName="sequence-graph-2d44a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:17:19.798438 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:19.798409 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:19.991832 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:19.991792 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" podUID="43d29738-353c-4a39-ac31-5df0a1df037d" containerName="sequence-graph-2d44a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:17:22.174063 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.174034 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" Apr 16 15:17:22.239114 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.239083 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43d29738-353c-4a39-ac31-5df0a1df037d-openshift-service-ca-bundle\") pod \"43d29738-353c-4a39-ac31-5df0a1df037d\" (UID: \"43d29738-353c-4a39-ac31-5df0a1df037d\") " Apr 16 15:17:22.239287 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.239131 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43d29738-353c-4a39-ac31-5df0a1df037d-proxy-tls\") pod \"43d29738-353c-4a39-ac31-5df0a1df037d\" (UID: \"43d29738-353c-4a39-ac31-5df0a1df037d\") " Apr 16 15:17:22.239475 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.239450 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43d29738-353c-4a39-ac31-5df0a1df037d-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "43d29738-353c-4a39-ac31-5df0a1df037d" (UID: "43d29738-353c-4a39-ac31-5df0a1df037d"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:17:22.241193 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.241175 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d29738-353c-4a39-ac31-5df0a1df037d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "43d29738-353c-4a39-ac31-5df0a1df037d" (UID: "43d29738-353c-4a39-ac31-5df0a1df037d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:17:22.340455 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.340368 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43d29738-353c-4a39-ac31-5df0a1df037d-proxy-tls\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:17:22.340455 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.340400 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43d29738-353c-4a39-ac31-5df0a1df037d-openshift-service-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:17:22.437112 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.437081 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx"] Apr 16 15:17:22.437294 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.437274 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" podUID="a37917e7-eccc-497b-a65d-0c59e8801670" containerName="ensemble-graph-36a90" containerID="cri-o://6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93" gracePeriod=30 Apr 16 15:17:22.608450 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.608370 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x"] Apr 16 15:17:22.608662 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.608638 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" podUID="b1e28c84-038a-4b26-bbf7-d6551e412327" containerName="kserve-container" containerID="cri-o://ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0" gracePeriod=30 Apr 16 15:17:22.637239 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.637211 
2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj"] Apr 16 15:17:22.637592 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.637578 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43d29738-353c-4a39-ac31-5df0a1df037d" containerName="sequence-graph-2d44a" Apr 16 15:17:22.637636 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.637596 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d29738-353c-4a39-ac31-5df0a1df037d" containerName="sequence-graph-2d44a" Apr 16 15:17:22.637670 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.637656 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="43d29738-353c-4a39-ac31-5df0a1df037d" containerName="sequence-graph-2d44a" Apr 16 15:17:22.640759 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.640742 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" Apr 16 15:17:22.646160 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.646133 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj"] Apr 16 15:17:22.651403 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.651383 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" Apr 16 15:17:22.777449 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.777369 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj"] Apr 16 15:17:22.780025 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:17:22.780002 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71002a64_03d0_4265_99f6_b078b62619c5.slice/crio-94fe02d111bb12a6c85e5b081d21264ea7a75550b8cae6c38e7738ed87aeaa55 WatchSource:0}: Error finding container 94fe02d111bb12a6c85e5b081d21264ea7a75550b8cae6c38e7738ed87aeaa55: Status 404 returned error can't find the container with id 94fe02d111bb12a6c85e5b081d21264ea7a75550b8cae6c38e7738ed87aeaa55 Apr 16 15:17:22.818865 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.818802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" event={"ID":"71002a64-03d0-4265-99f6-b078b62619c5","Type":"ContainerStarted","Data":"94fe02d111bb12a6c85e5b081d21264ea7a75550b8cae6c38e7738ed87aeaa55"} Apr 16 15:17:22.820172 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.820145 2576 generic.go:358] "Generic (PLEG): container finished" podID="43d29738-353c-4a39-ac31-5df0a1df037d" containerID="cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a" exitCode=0 Apr 16 15:17:22.820286 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.820207 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" Apr 16 15:17:22.820286 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.820218 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" event={"ID":"43d29738-353c-4a39-ac31-5df0a1df037d","Type":"ContainerDied","Data":"cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a"} Apr 16 15:17:22.820286 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.820256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5" event={"ID":"43d29738-353c-4a39-ac31-5df0a1df037d","Type":"ContainerDied","Data":"bf64e63e324a5dd6b9b8f99c0dbc87f40af13b57efe4e91053d1a28babac7840"} Apr 16 15:17:22.820433 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.820277 2576 scope.go:117] "RemoveContainer" containerID="cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a" Apr 16 15:17:22.845638 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.845622 2576 scope.go:117] "RemoveContainer" containerID="cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a" Apr 16 15:17:22.845950 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:17:22.845919 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a\": container with ID starting with cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a not found: ID does not exist" containerID="cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a" Apr 16 15:17:22.846052 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.845959 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a"} err="failed to get container status 
\"cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a\": rpc error: code = NotFound desc = could not find container \"cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a\": container with ID starting with cb0362bd6dc782326d7f9dfb6077a4c9d683dce1aad88f81eafaff30a2febb0a not found: ID does not exist" Apr 16 15:17:22.854719 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.854694 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"] Apr 16 15:17:22.860086 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:22.859870 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-2d44a-d4fd86597-7j4s5"] Apr 16 15:17:23.569190 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:23.569144 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" podUID="b1e28c84-038a-4b26-bbf7-d6551e412327" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 15:17:23.716997 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:23.716952 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" podUID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 15:17:23.806658 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:23.806622 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43d29738-353c-4a39-ac31-5df0a1df037d" path="/var/lib/kubelet/pods/43d29738-353c-4a39-ac31-5df0a1df037d/volumes" Apr 16 15:17:23.825216 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:23.825145 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" 
event={"ID":"71002a64-03d0-4265-99f6-b078b62619c5","Type":"ContainerStarted","Data":"ddf31bd24f8cc71b1d71396ba6a5e972361ccde049c02f48f27d622cc6ca52ff"} Apr 16 15:17:23.825468 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:23.825444 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" Apr 16 15:17:23.826612 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:23.826589 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" podUID="71002a64-03d0-4265-99f6-b078b62619c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 15:17:23.838607 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:23.838565 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" podStartSLOduration=1.838551673 podStartE2EDuration="1.838551673s" podCreationTimestamp="2026-04-16 15:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:17:23.837662506 +0000 UTC m=+1494.756023710" watchObservedRunningTime="2026-04-16 15:17:23.838551673 +0000 UTC m=+1494.756912878" Apr 16 15:17:24.797609 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:24.797570 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" podUID="a37917e7-eccc-497b-a65d-0c59e8801670" containerName="ensemble-graph-36a90" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:17:24.829048 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:24.829008 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" podUID="71002a64-03d0-4265-99f6-b078b62619c5" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 15:17:25.762705 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:25.762682 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" Apr 16 15:17:25.833066 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:25.833037 2576 generic.go:358] "Generic (PLEG): container finished" podID="b1e28c84-038a-4b26-bbf7-d6551e412327" containerID="ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0" exitCode=0 Apr 16 15:17:25.833433 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:25.833099 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" Apr 16 15:17:25.833433 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:25.833131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" event={"ID":"b1e28c84-038a-4b26-bbf7-d6551e412327","Type":"ContainerDied","Data":"ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0"} Apr 16 15:17:25.833433 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:25.833164 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x" event={"ID":"b1e28c84-038a-4b26-bbf7-d6551e412327","Type":"ContainerDied","Data":"55ddd6afcdcd12c9c4f5841d05bb752770b5194bcb5ff27a984b506191e99b4b"} Apr 16 15:17:25.833433 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:25.833179 2576 scope.go:117] "RemoveContainer" containerID="ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0" Apr 16 15:17:25.841311 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:25.841291 2576 scope.go:117] "RemoveContainer" containerID="ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0" Apr 16 
15:17:25.841563 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:17:25.841543 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0\": container with ID starting with ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0 not found: ID does not exist" containerID="ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0" Apr 16 15:17:25.841622 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:25.841575 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0"} err="failed to get container status \"ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0\": rpc error: code = NotFound desc = could not find container \"ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0\": container with ID starting with ffec4fd18ebfd553ac18a8dbd28c82c36ddaff5bddd0b9117934c11769b54ee0 not found: ID does not exist" Apr 16 15:17:25.848358 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:25.848340 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x"] Apr 16 15:17:25.851533 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:25.851512 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-36a90-predictor-574bcd5574-cmn2x"] Apr 16 15:17:27.806225 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:27.806191 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1e28c84-038a-4b26-bbf7-d6551e412327" path="/var/lib/kubelet/pods/b1e28c84-038a-4b26-bbf7-d6551e412327/volumes" Apr 16 15:17:29.761699 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:29.761670 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log" Apr 16 15:17:29.765111 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:29.765090 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log" Apr 16 15:17:29.797608 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:29.797577 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" podUID="a37917e7-eccc-497b-a65d-0c59e8801670" containerName="ensemble-graph-36a90" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:17:33.716788 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:33.716729 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" podUID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 15:17:34.797656 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:34.797616 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" podUID="a37917e7-eccc-497b-a65d-0c59e8801670" containerName="ensemble-graph-36a90" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:17:34.798054 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:34.797716 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:34.829900 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:34.829867 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" podUID="71002a64-03d0-4265-99f6-b078b62619c5" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 15:17:39.797210 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:39.797153 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" podUID="a37917e7-eccc-497b-a65d-0c59e8801670" containerName="ensemble-graph-36a90" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:17:43.717820 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:43.717758 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" Apr 16 15:17:44.798087 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:44.798047 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" podUID="a37917e7-eccc-497b-a65d-0c59e8801670" containerName="ensemble-graph-36a90" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:17:44.829540 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:44.829504 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" podUID="71002a64-03d0-4265-99f6-b078b62619c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 15:17:49.798051 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:49.798014 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" podUID="a37917e7-eccc-497b-a65d-0c59e8801670" containerName="ensemble-graph-36a90" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:17:52.467662 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:17:52.467624 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda37917e7_eccc_497b_a65d_0c59e8801670.slice/crio-6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93.scope\": RecentStats: unable to find data in memory cache]" Apr 16 15:17:52.468043 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:17:52.467695 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda37917e7_eccc_497b_a65d_0c59e8801670.slice/crio-6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93.scope\": RecentStats: unable to find data in memory cache]" Apr 16 15:17:52.468043 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:17:52.467721 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda37917e7_eccc_497b_a65d_0c59e8801670.slice/crio-conmon-6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93.scope\": RecentStats: unable to find data in memory cache]" Apr 16 15:17:52.468043 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:17:52.467686 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda37917e7_eccc_497b_a65d_0c59e8801670.slice/crio-6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda37917e7_eccc_497b_a65d_0c59e8801670.slice/crio-conmon-6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93.scope\": RecentStats: unable to find data in memory cache]" Apr 16 15:17:52.589251 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.589225 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:52.701396 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.701300 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37917e7-eccc-497b-a65d-0c59e8801670-openshift-service-ca-bundle\") pod \"a37917e7-eccc-497b-a65d-0c59e8801670\" (UID: \"a37917e7-eccc-497b-a65d-0c59e8801670\") " Apr 16 15:17:52.701396 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.701362 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a37917e7-eccc-497b-a65d-0c59e8801670-proxy-tls\") pod \"a37917e7-eccc-497b-a65d-0c59e8801670\" (UID: \"a37917e7-eccc-497b-a65d-0c59e8801670\") " Apr 16 15:17:52.701672 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.701647 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37917e7-eccc-497b-a65d-0c59e8801670-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "a37917e7-eccc-497b-a65d-0c59e8801670" (UID: "a37917e7-eccc-497b-a65d-0c59e8801670"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:17:52.703399 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.703380 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a37917e7-eccc-497b-a65d-0c59e8801670-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a37917e7-eccc-497b-a65d-0c59e8801670" (UID: "a37917e7-eccc-497b-a65d-0c59e8801670"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:17:52.802025 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.801993 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37917e7-eccc-497b-a65d-0c59e8801670-openshift-service-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:17:52.802025 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.802022 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a37917e7-eccc-497b-a65d-0c59e8801670-proxy-tls\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:17:52.927906 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.927873 2576 generic.go:358] "Generic (PLEG): container finished" podID="a37917e7-eccc-497b-a65d-0c59e8801670" containerID="6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93" exitCode=0 Apr 16 15:17:52.928065 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.927936 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" Apr 16 15:17:52.928065 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.927940 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" event={"ID":"a37917e7-eccc-497b-a65d-0c59e8801670","Type":"ContainerDied","Data":"6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93"} Apr 16 15:17:52.928065 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.928030 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx" event={"ID":"a37917e7-eccc-497b-a65d-0c59e8801670","Type":"ContainerDied","Data":"2c0edc7a7cf009a800d4673622ea65cb9e1f37ac8d23c40e472e255dc70f0ec0"} Apr 16 15:17:52.928065 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.928045 2576 scope.go:117] "RemoveContainer" containerID="6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93" Apr 16 15:17:52.936518 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.936493 2576 scope.go:117] "RemoveContainer" containerID="6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93" Apr 16 15:17:52.936748 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:17:52.936730 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93\": container with ID starting with 6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93 not found: ID does not exist" containerID="6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93" Apr 16 15:17:52.936823 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.936754 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93"} err="failed to get container status 
\"6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93\": rpc error: code = NotFound desc = could not find container \"6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93\": container with ID starting with 6b2ce54b22dbc3bb5c2ffdbcf54c6ad014121a14a5b880f9c2e45c463d527c93 not found: ID does not exist" Apr 16 15:17:52.947165 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.947141 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx"] Apr 16 15:17:52.952021 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:52.951965 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-36a90-744c45996f-jphzx"] Apr 16 15:17:53.806309 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:53.806280 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a37917e7-eccc-497b-a65d-0c59e8801670" path="/var/lib/kubelet/pods/a37917e7-eccc-497b-a65d-0c59e8801670/volumes" Apr 16 15:17:54.829213 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:17:54.829150 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" podUID="71002a64-03d0-4265-99f6-b078b62619c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 15:18:02.211503 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.211472 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf"] Apr 16 15:18:02.211893 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.211826 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1e28c84-038a-4b26-bbf7-d6551e412327" containerName="kserve-container" Apr 16 15:18:02.211893 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.211838 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e28c84-038a-4b26-bbf7-d6551e412327" 
containerName="kserve-container" Apr 16 15:18:02.211893 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.211849 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a37917e7-eccc-497b-a65d-0c59e8801670" containerName="ensemble-graph-36a90" Apr 16 15:18:02.211893 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.211855 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37917e7-eccc-497b-a65d-0c59e8801670" containerName="ensemble-graph-36a90" Apr 16 15:18:02.212047 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.211913 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a37917e7-eccc-497b-a65d-0c59e8801670" containerName="ensemble-graph-36a90" Apr 16 15:18:02.212047 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.211922 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1e28c84-038a-4b26-bbf7-d6551e412327" containerName="kserve-container" Apr 16 15:18:02.215123 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.215107 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" Apr 16 15:18:02.216962 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.216912 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-b8642-serving-cert\"" Apr 16 15:18:02.217097 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.216984 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 15:18:02.217177 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.217163 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-b8642-kube-rbac-proxy-sar-config\"" Apr 16 15:18:02.221454 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.221431 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf"] Apr 16 15:18:02.385797 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.385749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/491a067f-494b-47da-89e3-cc50723f493a-proxy-tls\") pod \"sequence-graph-b8642-bbb4f56f-n44lf\" (UID: \"491a067f-494b-47da-89e3-cc50723f493a\") " pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" Apr 16 15:18:02.385968 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.385844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491a067f-494b-47da-89e3-cc50723f493a-openshift-service-ca-bundle\") pod \"sequence-graph-b8642-bbb4f56f-n44lf\" (UID: \"491a067f-494b-47da-89e3-cc50723f493a\") " pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" Apr 16 15:18:02.487228 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.487152 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/491a067f-494b-47da-89e3-cc50723f493a-proxy-tls\") pod \"sequence-graph-b8642-bbb4f56f-n44lf\" (UID: \"491a067f-494b-47da-89e3-cc50723f493a\") " pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" Apr 16 15:18:02.487228 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.487200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491a067f-494b-47da-89e3-cc50723f493a-openshift-service-ca-bundle\") pod \"sequence-graph-b8642-bbb4f56f-n44lf\" (UID: \"491a067f-494b-47da-89e3-cc50723f493a\") " pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" Apr 16 15:18:02.487818 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.487742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491a067f-494b-47da-89e3-cc50723f493a-openshift-service-ca-bundle\") pod \"sequence-graph-b8642-bbb4f56f-n44lf\" (UID: \"491a067f-494b-47da-89e3-cc50723f493a\") " pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" Apr 16 15:18:02.489758 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.489735 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/491a067f-494b-47da-89e3-cc50723f493a-proxy-tls\") pod \"sequence-graph-b8642-bbb4f56f-n44lf\" (UID: \"491a067f-494b-47da-89e3-cc50723f493a\") " pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" Apr 16 15:18:02.525642 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.525612 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" Apr 16 15:18:02.642813 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.642788 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf"] Apr 16 15:18:02.645575 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:18:02.645540 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491a067f_494b_47da_89e3_cc50723f493a.slice/crio-1694b515bead63cfc22b248b5052ebf2396ce8a098aba013d70b52ddced76949 WatchSource:0}: Error finding container 1694b515bead63cfc22b248b5052ebf2396ce8a098aba013d70b52ddced76949: Status 404 returned error can't find the container with id 1694b515bead63cfc22b248b5052ebf2396ce8a098aba013d70b52ddced76949 Apr 16 15:18:02.962959 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.962922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" event={"ID":"491a067f-494b-47da-89e3-cc50723f493a","Type":"ContainerStarted","Data":"a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f"} Apr 16 15:18:02.962959 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.962962 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" event={"ID":"491a067f-494b-47da-89e3-cc50723f493a","Type":"ContainerStarted","Data":"1694b515bead63cfc22b248b5052ebf2396ce8a098aba013d70b52ddced76949"} Apr 16 15:18:02.963160 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.963053 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" Apr 16 15:18:02.976526 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:02.976423 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" 
podStartSLOduration=0.976404457 podStartE2EDuration="976.404457ms" podCreationTimestamp="2026-04-16 15:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:18:02.975816345 +0000 UTC m=+1533.894177549" watchObservedRunningTime="2026-04-16 15:18:02.976404457 +0000 UTC m=+1533.894765663" Apr 16 15:18:04.829685 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:04.829647 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" podUID="71002a64-03d0-4265-99f6-b078b62619c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 15:18:08.971802 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:08.971755 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" Apr 16 15:18:12.291964 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.291931 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf"] Apr 16 15:18:12.292320 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.292146 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" podUID="491a067f-494b-47da-89e3-cc50723f493a" containerName="sequence-graph-b8642" containerID="cri-o://a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f" gracePeriod=30 Apr 16 15:18:12.501464 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.501429 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5"] Apr 16 15:18:12.501696 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.501670 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" podUID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" containerName="kserve-container" containerID="cri-o://6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43" gracePeriod=30 Apr 16 15:18:12.539298 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.536808 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8"] Apr 16 15:18:12.541576 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.541547 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" Apr 16 15:18:12.544166 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.544102 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8"] Apr 16 15:18:12.551688 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.551668 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" Apr 16 15:18:12.676719 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.676697 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8"] Apr 16 15:18:12.679271 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:18:12.679240 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedebcbcd_363d_4468_880a_45829d6ec699.slice/crio-5d4286f4b6f136f898a6d15f68ab4f43a6aea92d46c047f246772d5d94ba032c WatchSource:0}: Error finding container 5d4286f4b6f136f898a6d15f68ab4f43a6aea92d46c047f246772d5d94ba032c: Status 404 returned error can't find the container with id 5d4286f4b6f136f898a6d15f68ab4f43a6aea92d46c047f246772d5d94ba032c Apr 16 15:18:12.997488 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.997452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" event={"ID":"edebcbcd-363d-4468-880a-45829d6ec699","Type":"ContainerStarted","Data":"1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265"} Apr 16 15:18:12.997488 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.997489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" event={"ID":"edebcbcd-363d-4468-880a-45829d6ec699","Type":"ContainerStarted","Data":"5d4286f4b6f136f898a6d15f68ab4f43a6aea92d46c047f246772d5d94ba032c"} Apr 16 15:18:12.997730 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.997625 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" Apr 16 15:18:12.998923 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:12.998902 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" podUID="edebcbcd-363d-4468-880a-45829d6ec699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 16 15:18:13.011312 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:13.011267 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" podStartSLOduration=1.011255437 podStartE2EDuration="1.011255437s" podCreationTimestamp="2026-04-16 15:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:18:13.00963811 +0000 UTC m=+1543.927999319" watchObservedRunningTime="2026-04-16 15:18:13.011255437 +0000 UTC m=+1543.929616639" Apr 16 15:18:13.717402 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:13.717361 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" podUID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 15:18:13.970584 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:13.970504 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" podUID="491a067f-494b-47da-89e3-cc50723f493a" containerName="sequence-graph-b8642" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:18:14.001358 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:14.001322 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" podUID="edebcbcd-363d-4468-880a-45829d6ec699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 16 15:18:14.829924 ip-10-0-130-140 
kubenswrapper[2576]: I0416 15:18:14.829885 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" Apr 16 15:18:15.544072 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:15.544050 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" Apr 16 15:18:16.008424 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:16.008391 2576 generic.go:358] "Generic (PLEG): container finished" podID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" containerID="6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43" exitCode=0 Apr 16 15:18:16.008847 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:16.008452 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" Apr 16 15:18:16.008847 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:16.008469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" event={"ID":"a7e59109-5394-4ef9-bf06-0bfc70ce9b63","Type":"ContainerDied","Data":"6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43"} Apr 16 15:18:16.008847 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:16.008507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5" event={"ID":"a7e59109-5394-4ef9-bf06-0bfc70ce9b63","Type":"ContainerDied","Data":"6465fecf0103aeab7327a4b76ec9ebf4caada093eb991af129fcf237ac7bcef8"} Apr 16 15:18:16.008847 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:16.008525 2576 scope.go:117] "RemoveContainer" containerID="6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43" Apr 16 15:18:16.016300 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:16.016282 2576 scope.go:117] "RemoveContainer" 
containerID="6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43" Apr 16 15:18:16.016521 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:18:16.016504 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43\": container with ID starting with 6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43 not found: ID does not exist" containerID="6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43" Apr 16 15:18:16.016578 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:16.016528 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43"} err="failed to get container status \"6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43\": rpc error: code = NotFound desc = could not find container \"6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43\": container with ID starting with 6e01b3935cd4c0af555e1cacaee942869624c9fe9142fa2b523d4ef205660b43 not found: ID does not exist" Apr 16 15:18:16.022565 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:16.022545 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5"] Apr 16 15:18:16.025336 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:16.025317 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b8642-predictor-54bb7c4df6-kbbt5"] Apr 16 15:18:17.807569 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:17.807525 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" path="/var/lib/kubelet/pods/a7e59109-5394-4ef9-bf06-0bfc70ce9b63/volumes" Apr 16 15:18:18.970289 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:18.970245 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" podUID="491a067f-494b-47da-89e3-cc50723f493a" containerName="sequence-graph-b8642" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:18:22.639066 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.639035 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb"] Apr 16 15:18:22.639418 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.639391 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" containerName="kserve-container" Apr 16 15:18:22.639418 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.639401 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" containerName="kserve-container" Apr 16 15:18:22.639487 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.639462 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7e59109-5394-4ef9-bf06-0bfc70ce9b63" containerName="kserve-container" Apr 16 15:18:22.643836 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.643818 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" Apr 16 15:18:22.645785 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.645750 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-b2be2-serving-cert\"" Apr 16 15:18:22.645913 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.645826 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-b2be2-kube-rbac-proxy-sar-config\"" Apr 16 15:18:22.649033 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.649013 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb"] Apr 16 15:18:22.748744 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.748712 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be3951c5-9ed2-4522-be4f-8b9b4466e37b-openshift-service-ca-bundle\") pod \"ensemble-graph-b2be2-6df585b7fc-bqfmb\" (UID: \"be3951c5-9ed2-4522-be4f-8b9b4466e37b\") " pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" Apr 16 15:18:22.748744 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.748752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be3951c5-9ed2-4522-be4f-8b9b4466e37b-proxy-tls\") pod \"ensemble-graph-b2be2-6df585b7fc-bqfmb\" (UID: \"be3951c5-9ed2-4522-be4f-8b9b4466e37b\") " pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" Apr 16 15:18:22.850229 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.850199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be3951c5-9ed2-4522-be4f-8b9b4466e37b-openshift-service-ca-bundle\") pod 
\"ensemble-graph-b2be2-6df585b7fc-bqfmb\" (UID: \"be3951c5-9ed2-4522-be4f-8b9b4466e37b\") " pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" Apr 16 15:18:22.850414 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.850248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be3951c5-9ed2-4522-be4f-8b9b4466e37b-proxy-tls\") pod \"ensemble-graph-b2be2-6df585b7fc-bqfmb\" (UID: \"be3951c5-9ed2-4522-be4f-8b9b4466e37b\") " pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" Apr 16 15:18:22.850899 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.850876 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be3951c5-9ed2-4522-be4f-8b9b4466e37b-openshift-service-ca-bundle\") pod \"ensemble-graph-b2be2-6df585b7fc-bqfmb\" (UID: \"be3951c5-9ed2-4522-be4f-8b9b4466e37b\") " pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" Apr 16 15:18:22.852805 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.852757 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be3951c5-9ed2-4522-be4f-8b9b4466e37b-proxy-tls\") pod \"ensemble-graph-b2be2-6df585b7fc-bqfmb\" (UID: \"be3951c5-9ed2-4522-be4f-8b9b4466e37b\") " pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" Apr 16 15:18:22.955078 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:22.955037 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" Apr 16 15:18:23.079366 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:23.079262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb"] Apr 16 15:18:23.081569 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:18:23.081539 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe3951c5_9ed2_4522_be4f_8b9b4466e37b.slice/crio-4d382bd2f45afcba52944ca3a5503ffa75fb7790937c6c8c8712a2ea63be44bb WatchSource:0}: Error finding container 4d382bd2f45afcba52944ca3a5503ffa75fb7790937c6c8c8712a2ea63be44bb: Status 404 returned error can't find the container with id 4d382bd2f45afcba52944ca3a5503ffa75fb7790937c6c8c8712a2ea63be44bb Apr 16 15:18:23.970324 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:23.970286 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" podUID="491a067f-494b-47da-89e3-cc50723f493a" containerName="sequence-graph-b8642" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:18:23.970755 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:23.970404 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" Apr 16 15:18:24.002047 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:24.002006 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" podUID="edebcbcd-363d-4468-880a-45829d6ec699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 16 15:18:24.037113 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:24.037081 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" 
event={"ID":"be3951c5-9ed2-4522-be4f-8b9b4466e37b","Type":"ContainerStarted","Data":"2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e"} Apr 16 15:18:24.037113 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:24.037110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" event={"ID":"be3951c5-9ed2-4522-be4f-8b9b4466e37b","Type":"ContainerStarted","Data":"4d382bd2f45afcba52944ca3a5503ffa75fb7790937c6c8c8712a2ea63be44bb"} Apr 16 15:18:24.037342 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:24.037234 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" Apr 16 15:18:24.053119 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:24.053075 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" podStartSLOduration=2.053061904 podStartE2EDuration="2.053061904s" podCreationTimestamp="2026-04-16 15:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:18:24.051468407 +0000 UTC m=+1554.969829613" watchObservedRunningTime="2026-04-16 15:18:24.053061904 +0000 UTC m=+1554.971423109" Apr 16 15:18:28.970687 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:28.970646 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" podUID="491a067f-494b-47da-89e3-cc50723f493a" containerName="sequence-graph-b8642" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:18:30.046909 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:30.046882 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" Apr 16 15:18:33.970259 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:33.970225 2576 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" podUID="491a067f-494b-47da-89e3-cc50723f493a" containerName="sequence-graph-b8642" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:18:34.002351 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:34.002316 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" podUID="edebcbcd-363d-4468-880a-45829d6ec699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 16 15:18:38.970338 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:38.970295 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" podUID="491a067f-494b-47da-89e3-cc50723f493a" containerName="sequence-graph-b8642" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:18:42.317012 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:18:42.316980 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491a067f_494b_47da_89e3_cc50723f493a.slice/crio-1694b515bead63cfc22b248b5052ebf2396ce8a098aba013d70b52ddced76949\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491a067f_494b_47da_89e3_cc50723f493a.slice/crio-a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491a067f_494b_47da_89e3_cc50723f493a.slice/crio-conmon-a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f.scope\": RecentStats: unable to find data in memory cache]" Apr 16 15:18:42.317359 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:18:42.317018 2576 
cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491a067f_494b_47da_89e3_cc50723f493a.slice/crio-a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491a067f_494b_47da_89e3_cc50723f493a.slice/crio-conmon-a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 15:18:42.317359 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:18:42.317033 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491a067f_494b_47da_89e3_cc50723f493a.slice/crio-conmon-a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 15:18:42.453337 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:42.453307 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf"
Apr 16 15:18:42.622551 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:42.622450 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/491a067f-494b-47da-89e3-cc50723f493a-proxy-tls\") pod \"491a067f-494b-47da-89e3-cc50723f493a\" (UID: \"491a067f-494b-47da-89e3-cc50723f493a\") "
Apr 16 15:18:42.622551 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:42.622500 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491a067f-494b-47da-89e3-cc50723f493a-openshift-service-ca-bundle\") pod \"491a067f-494b-47da-89e3-cc50723f493a\" (UID: \"491a067f-494b-47da-89e3-cc50723f493a\") "
Apr 16 15:18:42.622907 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:42.622883 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/491a067f-494b-47da-89e3-cc50723f493a-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "491a067f-494b-47da-89e3-cc50723f493a" (UID: "491a067f-494b-47da-89e3-cc50723f493a"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:18:42.624523 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:42.624500 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491a067f-494b-47da-89e3-cc50723f493a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "491a067f-494b-47da-89e3-cc50723f493a" (UID: "491a067f-494b-47da-89e3-cc50723f493a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:18:42.723142 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:42.723108 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/491a067f-494b-47da-89e3-cc50723f493a-proxy-tls\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:18:42.723142 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:42.723137 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491a067f-494b-47da-89e3-cc50723f493a-openshift-service-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:18:43.108576 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:43.108544 2576 generic.go:358] "Generic (PLEG): container finished" podID="491a067f-494b-47da-89e3-cc50723f493a" containerID="a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f" exitCode=0
Apr 16 15:18:43.108798 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:43.108605 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf"
Apr 16 15:18:43.108798 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:43.108631 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" event={"ID":"491a067f-494b-47da-89e3-cc50723f493a","Type":"ContainerDied","Data":"a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f"}
Apr 16 15:18:43.108798 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:43.108672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf" event={"ID":"491a067f-494b-47da-89e3-cc50723f493a","Type":"ContainerDied","Data":"1694b515bead63cfc22b248b5052ebf2396ce8a098aba013d70b52ddced76949"}
Apr 16 15:18:43.108798 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:43.108690 2576 scope.go:117] "RemoveContainer" containerID="a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f"
Apr 16 15:18:43.117428 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:43.117413 2576 scope.go:117] "RemoveContainer" containerID="a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f"
Apr 16 15:18:43.117648 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:18:43.117633 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f\": container with ID starting with a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f not found: ID does not exist" containerID="a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f"
Apr 16 15:18:43.117715 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:43.117655 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f"} err="failed to get container status \"a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f\": rpc error: code = NotFound desc = could not find container \"a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f\": container with ID starting with a5f32d45633b5766f7c461d0c3738dbf1bc0da032f6125150b5151b950baee2f not found: ID does not exist"
Apr 16 15:18:43.128212 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:43.128188 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf"]
Apr 16 15:18:43.131404 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:43.131384 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b8642-bbb4f56f-n44lf"]
Apr 16 15:18:43.806697 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:43.806656 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="491a067f-494b-47da-89e3-cc50723f493a" path="/var/lib/kubelet/pods/491a067f-494b-47da-89e3-cc50723f493a/volumes"
Apr 16 15:18:44.002125 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:44.002084 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" podUID="edebcbcd-363d-4468-880a-45829d6ec699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 16 15:18:54.002095 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:18:54.002046 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" podUID="edebcbcd-363d-4468-880a-45829d6ec699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 16 15:19:04.002937 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:04.002904 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8"
Apr 16 15:19:12.492259 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.492219 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"]
Apr 16 15:19:12.492832 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.492788 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="491a067f-494b-47da-89e3-cc50723f493a" containerName="sequence-graph-b8642"
Apr 16 15:19:12.492832 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.492809 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="491a067f-494b-47da-89e3-cc50723f493a" containerName="sequence-graph-b8642"
Apr 16 15:19:12.492973 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.492961 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="491a067f-494b-47da-89e3-cc50723f493a" containerName="sequence-graph-b8642"
Apr 16 15:19:12.496075 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.496054 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"
Apr 16 15:19:12.497868 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.497843 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-cee76-kube-rbac-proxy-sar-config\""
Apr 16 15:19:12.497969 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.497845 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-cee76-serving-cert\""
Apr 16 15:19:12.500815 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.500792 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"]
Apr 16 15:19:12.578530 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.578498 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a03a7a5f-6e53-478d-974c-633703f78946-openshift-service-ca-bundle\") pod \"sequence-graph-cee76-7d9bc88ff6-j7t7v\" (UID: \"a03a7a5f-6e53-478d-974c-633703f78946\") " pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"
Apr 16 15:19:12.578692 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.578624 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a03a7a5f-6e53-478d-974c-633703f78946-proxy-tls\") pod \"sequence-graph-cee76-7d9bc88ff6-j7t7v\" (UID: \"a03a7a5f-6e53-478d-974c-633703f78946\") " pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"
Apr 16 15:19:12.679220 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.679185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a03a7a5f-6e53-478d-974c-633703f78946-proxy-tls\") pod \"sequence-graph-cee76-7d9bc88ff6-j7t7v\" (UID: \"a03a7a5f-6e53-478d-974c-633703f78946\") " pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"
Apr 16 15:19:12.679380 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.679232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a03a7a5f-6e53-478d-974c-633703f78946-openshift-service-ca-bundle\") pod \"sequence-graph-cee76-7d9bc88ff6-j7t7v\" (UID: \"a03a7a5f-6e53-478d-974c-633703f78946\") " pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"
Apr 16 15:19:12.679380 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:19:12.679322 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-cee76-serving-cert: secret "sequence-graph-cee76-serving-cert" not found
Apr 16 15:19:12.679501 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:19:12.679395 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a03a7a5f-6e53-478d-974c-633703f78946-proxy-tls podName:a03a7a5f-6e53-478d-974c-633703f78946 nodeName:}" failed. No retries permitted until 2026-04-16 15:19:13.1793784 +0000 UTC m=+1604.097739583 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a03a7a5f-6e53-478d-974c-633703f78946-proxy-tls") pod "sequence-graph-cee76-7d9bc88ff6-j7t7v" (UID: "a03a7a5f-6e53-478d-974c-633703f78946") : secret "sequence-graph-cee76-serving-cert" not found
Apr 16 15:19:12.679931 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:12.679910 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a03a7a5f-6e53-478d-974c-633703f78946-openshift-service-ca-bundle\") pod \"sequence-graph-cee76-7d9bc88ff6-j7t7v\" (UID: \"a03a7a5f-6e53-478d-974c-633703f78946\") " pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"
Apr 16 15:19:13.183401 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:13.183368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a03a7a5f-6e53-478d-974c-633703f78946-proxy-tls\") pod \"sequence-graph-cee76-7d9bc88ff6-j7t7v\" (UID: \"a03a7a5f-6e53-478d-974c-633703f78946\") " pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"
Apr 16 15:19:13.185727 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:13.185696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a03a7a5f-6e53-478d-974c-633703f78946-proxy-tls\") pod \"sequence-graph-cee76-7d9bc88ff6-j7t7v\" (UID: \"a03a7a5f-6e53-478d-974c-633703f78946\") " pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"
Apr 16 15:19:13.407102 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:13.407073 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"
Apr 16 15:19:13.524365 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:13.524334 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"]
Apr 16 15:19:13.526714 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:19:13.526676 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda03a7a5f_6e53_478d_974c_633703f78946.slice/crio-4a460f88e4a989fb56878bd820c7e7c8cff696c0df983239bc523a88a40958b9 WatchSource:0}: Error finding container 4a460f88e4a989fb56878bd820c7e7c8cff696c0df983239bc523a88a40958b9: Status 404 returned error can't find the container with id 4a460f88e4a989fb56878bd820c7e7c8cff696c0df983239bc523a88a40958b9
Apr 16 15:19:14.215915 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:14.215874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" event={"ID":"a03a7a5f-6e53-478d-974c-633703f78946","Type":"ContainerStarted","Data":"b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2"}
Apr 16 15:19:14.215915 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:14.215914 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" event={"ID":"a03a7a5f-6e53-478d-974c-633703f78946","Type":"ContainerStarted","Data":"4a460f88e4a989fb56878bd820c7e7c8cff696c0df983239bc523a88a40958b9"}
Apr 16 15:19:14.216157 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:14.215944 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"
Apr 16 15:19:14.230661 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:14.230613 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" podStartSLOduration=2.2305979479999998 podStartE2EDuration="2.230597948s" podCreationTimestamp="2026-04-16 15:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:19:14.229102354 +0000 UTC m=+1605.147463558" watchObservedRunningTime="2026-04-16 15:19:14.230597948 +0000 UTC m=+1605.148959152"
Apr 16 15:19:20.224211 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:19:20.224129 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"
Apr 16 15:22:29.787045 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:22:29.787020 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 15:22:29.790018 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:22:29.789997 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 15:26:37.372921 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:37.372883 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb"]
Apr 16 15:26:37.375037 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:37.373199 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" podUID="be3951c5-9ed2-4522-be4f-8b9b4466e37b" containerName="ensemble-graph-b2be2" containerID="cri-o://2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e" gracePeriod=30
Apr 16 15:26:37.531586 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:37.531549 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj"]
Apr 16 15:26:37.531935 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:37.531881 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" podUID="71002a64-03d0-4265-99f6-b078b62619c5" containerName="kserve-container" containerID="cri-o://ddf31bd24f8cc71b1d71396ba6a5e972361ccde049c02f48f27d622cc6ca52ff" gracePeriod=30
Apr 16 15:26:37.573159 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:37.573125 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7"]
Apr 16 15:26:37.576181 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:37.576166 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7"
Apr 16 15:26:37.582139 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:37.582112 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7"]
Apr 16 15:26:37.588037 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:37.588019 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7"
Apr 16 15:26:37.707609 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:37.707576 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7"]
Apr 16 15:26:37.710492 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:26:37.710466 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod120e5528_f2aa_4e5f_ba13_209694829f54.slice/crio-a5c083f870e3d130c5242a2a76fcd8aaf58c7de1dbb36fd6e1139cbf5ef66f07 WatchSource:0}: Error finding container a5c083f870e3d130c5242a2a76fcd8aaf58c7de1dbb36fd6e1139cbf5ef66f07: Status 404 returned error can't find the container with id a5c083f870e3d130c5242a2a76fcd8aaf58c7de1dbb36fd6e1139cbf5ef66f07
Apr 16 15:26:37.712280 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:37.712263 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:26:37.715576 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:37.715548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" event={"ID":"120e5528-f2aa-4e5f-ba13-209694829f54","Type":"ContainerStarted","Data":"a5c083f870e3d130c5242a2a76fcd8aaf58c7de1dbb36fd6e1139cbf5ef66f07"}
Apr 16 15:26:38.720680 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:38.720641 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" event={"ID":"120e5528-f2aa-4e5f-ba13-209694829f54","Type":"ContainerStarted","Data":"6ad4f74991fa31b9a09bcb57b67af3cb5fc32622b6606fcfc793c7a43399b78b"}
Apr 16 15:26:38.721167 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:38.720837 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7"
Apr 16 15:26:38.722057 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:38.722032 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" podUID="120e5528-f2aa-4e5f-ba13-209694829f54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:26:38.735547 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:38.735507 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" podStartSLOduration=1.735495655 podStartE2EDuration="1.735495655s" podCreationTimestamp="2026-04-16 15:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:26:38.73382413 +0000 UTC m=+2049.652185336" watchObservedRunningTime="2026-04-16 15:26:38.735495655 +0000 UTC m=+2049.653856862"
Apr 16 15:26:39.724551 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:39.724516 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" podUID="120e5528-f2aa-4e5f-ba13-209694829f54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:26:40.045428 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:40.045347 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" podUID="be3951c5-9ed2-4522-be4f-8b9b4466e37b" containerName="ensemble-graph-b2be2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:26:40.728870 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:40.728838 2576 generic.go:358] "Generic (PLEG): container finished" podID="71002a64-03d0-4265-99f6-b078b62619c5" containerID="ddf31bd24f8cc71b1d71396ba6a5e972361ccde049c02f48f27d622cc6ca52ff" exitCode=0
Apr 16 15:26:40.729237 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:40.728889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" event={"ID":"71002a64-03d0-4265-99f6-b078b62619c5","Type":"ContainerDied","Data":"ddf31bd24f8cc71b1d71396ba6a5e972361ccde049c02f48f27d622cc6ca52ff"}
Apr 16 15:26:40.969612 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:40.969585 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj"
Apr 16 15:26:41.733879 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:41.733845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj" event={"ID":"71002a64-03d0-4265-99f6-b078b62619c5","Type":"ContainerDied","Data":"94fe02d111bb12a6c85e5b081d21264ea7a75550b8cae6c38e7738ed87aeaa55"}
Apr 16 15:26:41.733879 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:41.733887 2576 scope.go:117] "RemoveContainer" containerID="ddf31bd24f8cc71b1d71396ba6a5e972361ccde049c02f48f27d622cc6ca52ff"
Apr 16 15:26:41.734512 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:41.733914 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj"
Apr 16 15:26:41.753052 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:41.753028 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj"]
Apr 16 15:26:41.756741 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:41.756718 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2be2-predictor-5f8c869854-s6kpj"]
Apr 16 15:26:41.806838 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:41.806807 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71002a64-03d0-4265-99f6-b078b62619c5" path="/var/lib/kubelet/pods/71002a64-03d0-4265-99f6-b078b62619c5/volumes"
Apr 16 15:26:45.045348 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:45.045305 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" podUID="be3951c5-9ed2-4522-be4f-8b9b4466e37b" containerName="ensemble-graph-b2be2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:26:49.724997 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:49.724949 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" podUID="120e5528-f2aa-4e5f-ba13-209694829f54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:26:50.044735 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:50.044650 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" podUID="be3951c5-9ed2-4522-be4f-8b9b4466e37b" containerName="ensemble-graph-b2be2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:26:50.044890 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:50.044762 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb"
Apr 16 15:26:55.044706 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:55.044666 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" podUID="be3951c5-9ed2-4522-be4f-8b9b4466e37b" containerName="ensemble-graph-b2be2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:26:59.725552 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:26:59.725507 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" podUID="120e5528-f2aa-4e5f-ba13-209694829f54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:27:00.045527 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:00.045428 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" podUID="be3951c5-9ed2-4522-be4f-8b9b4466e37b" containerName="ensemble-graph-b2be2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:27:05.044592 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:05.044549 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" podUID="be3951c5-9ed2-4522-be4f-8b9b4466e37b" containerName="ensemble-graph-b2be2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:27:07.521541 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.521520 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb"
Apr 16 15:27:07.601055 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.601024 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be3951c5-9ed2-4522-be4f-8b9b4466e37b-proxy-tls\") pod \"be3951c5-9ed2-4522-be4f-8b9b4466e37b\" (UID: \"be3951c5-9ed2-4522-be4f-8b9b4466e37b\") "
Apr 16 15:27:07.601220 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.601094 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be3951c5-9ed2-4522-be4f-8b9b4466e37b-openshift-service-ca-bundle\") pod \"be3951c5-9ed2-4522-be4f-8b9b4466e37b\" (UID: \"be3951c5-9ed2-4522-be4f-8b9b4466e37b\") "
Apr 16 15:27:07.601439 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.601415 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be3951c5-9ed2-4522-be4f-8b9b4466e37b-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "be3951c5-9ed2-4522-be4f-8b9b4466e37b" (UID: "be3951c5-9ed2-4522-be4f-8b9b4466e37b"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:27:07.603125 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.603100 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3951c5-9ed2-4522-be4f-8b9b4466e37b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "be3951c5-9ed2-4522-be4f-8b9b4466e37b" (UID: "be3951c5-9ed2-4522-be4f-8b9b4466e37b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:27:07.702166 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.702130 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be3951c5-9ed2-4522-be4f-8b9b4466e37b-proxy-tls\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:27:07.702166 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.702163 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be3951c5-9ed2-4522-be4f-8b9b4466e37b-openshift-service-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:27:07.829602 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.829567 2576 generic.go:358] "Generic (PLEG): container finished" podID="be3951c5-9ed2-4522-be4f-8b9b4466e37b" containerID="2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e" exitCode=0
Apr 16 15:27:07.829742 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.829619 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb"
Apr 16 15:27:07.829742 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.829634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" event={"ID":"be3951c5-9ed2-4522-be4f-8b9b4466e37b","Type":"ContainerDied","Data":"2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e"}
Apr 16 15:27:07.829742 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.829664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb" event={"ID":"be3951c5-9ed2-4522-be4f-8b9b4466e37b","Type":"ContainerDied","Data":"4d382bd2f45afcba52944ca3a5503ffa75fb7790937c6c8c8712a2ea63be44bb"}
Apr 16 15:27:07.829742 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.829681 2576 scope.go:117] "RemoveContainer" containerID="2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e"
Apr 16 15:27:07.837752 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.837731 2576 scope.go:117] "RemoveContainer" containerID="2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e"
Apr 16 15:27:07.838057 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:27:07.838040 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e\": container with ID starting with 2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e not found: ID does not exist" containerID="2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e"
Apr 16 15:27:07.838121 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.838069 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e"} err="failed to get container status \"2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e\": rpc error: code = NotFound desc = could not find container \"2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e\": container with ID starting with 2b35ab208fc4701cde3b1878ba79c2e77883b73f1362e9334a432654e903575e not found: ID does not exist"
Apr 16 15:27:07.842499 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.842475 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb"]
Apr 16 15:27:07.848325 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:07.848304 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-b2be2-6df585b7fc-bqfmb"]
Apr 16 15:27:09.725487 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:09.725447 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" podUID="120e5528-f2aa-4e5f-ba13-209694829f54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:27:09.806576 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:09.806543 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be3951c5-9ed2-4522-be4f-8b9b4466e37b" path="/var/lib/kubelet/pods/be3951c5-9ed2-4522-be4f-8b9b4466e37b/volumes"
Apr 16 15:27:19.725688 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:19.725647 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" podUID="120e5528-f2aa-4e5f-ba13-209694829f54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:27:27.189365 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.189329 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"]
Apr 16 15:27:27.189875 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.189633 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" podUID="a03a7a5f-6e53-478d-974c-633703f78946" containerName="sequence-graph-cee76" containerID="cri-o://b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2" gracePeriod=30
Apr 16 15:27:27.362361 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.362317 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8"]
Apr 16 15:27:27.362580 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.362559 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" podUID="edebcbcd-363d-4468-880a-45829d6ec699" containerName="kserve-container" containerID="cri-o://1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265" gracePeriod=30
Apr 16 15:27:27.400683 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.400654 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m"]
Apr 16 15:27:27.401045 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.401031 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be3951c5-9ed2-4522-be4f-8b9b4466e37b" containerName="ensemble-graph-b2be2"
Apr 16 15:27:27.401095 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.401046 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3951c5-9ed2-4522-be4f-8b9b4466e37b" containerName="ensemble-graph-b2be2"
Apr 16 15:27:27.401095 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.401067 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71002a64-03d0-4265-99f6-b078b62619c5" containerName="kserve-container"
Apr 16 15:27:27.401095 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.401073 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="71002a64-03d0-4265-99f6-b078b62619c5" containerName="kserve-container"
Apr 16 15:27:27.401190 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.401133 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="be3951c5-9ed2-4522-be4f-8b9b4466e37b" containerName="ensemble-graph-b2be2"
Apr 16 15:27:27.401190 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.401144 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="71002a64-03d0-4265-99f6-b078b62619c5" containerName="kserve-container"
Apr 16 15:27:27.404041 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.404026 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m"
Apr 16 15:27:27.408406 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.408380 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m"]
Apr 16 15:27:27.414694 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.414676 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m"
Apr 16 15:27:27.531609 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.531587 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m"]
Apr 16 15:27:27.534386 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:27:27.534358 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2960b44a_878f_4508_84be_d6eb2332aae4.slice/crio-e602f27cabd48d5f0aa571cf035c94ac39d265042c0f7c83a8012c4437d5e859 WatchSource:0}: Error finding container e602f27cabd48d5f0aa571cf035c94ac39d265042c0f7c83a8012c4437d5e859: Status 404 returned error can't find the container with id e602f27cabd48d5f0aa571cf035c94ac39d265042c0f7c83a8012c4437d5e859
Apr 16 15:27:27.904613 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.904579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" event={"ID":"2960b44a-878f-4508-84be-d6eb2332aae4","Type":"ContainerStarted","Data":"a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8"}
Apr 16 15:27:27.904613 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.904616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" event={"ID":"2960b44a-878f-4508-84be-d6eb2332aae4","Type":"ContainerStarted","Data":"e602f27cabd48d5f0aa571cf035c94ac39d265042c0f7c83a8012c4437d5e859"}
Apr 16 15:27:27.904849 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.904806 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m"
Apr 16 15:27:27.906167 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.906146 2576 prober.go:120] "Probe failed" probeType="Readiness"
pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" podUID="2960b44a-878f-4508-84be-d6eb2332aae4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 16 15:27:27.918282 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:27.918246 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" podStartSLOduration=0.91823406 podStartE2EDuration="918.23406ms" podCreationTimestamp="2026-04-16 15:27:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:27:27.916143496 +0000 UTC m=+2098.834504701" watchObservedRunningTime="2026-04-16 15:27:27.91823406 +0000 UTC m=+2098.836595345" Apr 16 15:27:28.908439 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:28.908402 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" podUID="2960b44a-878f-4508-84be-d6eb2332aae4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 16 15:27:29.726476 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:29.726450 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" Apr 16 15:27:29.814004 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:29.813978 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log" Apr 16 15:27:29.816873 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:29.816854 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log" Apr 16 15:27:30.222723 ip-10-0-130-140 
kubenswrapper[2576]: I0416 15:27:30.222678 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" podUID="a03a7a5f-6e53-478d-974c-633703f78946" containerName="sequence-graph-cee76" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:27:30.401994 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:30.401974 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8"
Apr 16 15:27:30.916244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:30.916211 2576 generic.go:358] "Generic (PLEG): container finished" podID="edebcbcd-363d-4468-880a-45829d6ec699" containerID="1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265" exitCode=0
Apr 16 15:27:30.916428 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:30.916256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" event={"ID":"edebcbcd-363d-4468-880a-45829d6ec699","Type":"ContainerDied","Data":"1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265"}
Apr 16 15:27:30.916428 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:30.916280 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8"
Apr 16 15:27:30.916428 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:30.916292 2576 scope.go:117] "RemoveContainer" containerID="1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265"
Apr 16 15:27:30.916428 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:30.916281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8" event={"ID":"edebcbcd-363d-4468-880a-45829d6ec699","Type":"ContainerDied","Data":"5d4286f4b6f136f898a6d15f68ab4f43a6aea92d46c047f246772d5d94ba032c"}
Apr 16 15:27:30.924480 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:30.924460 2576 scope.go:117] "RemoveContainer" containerID="1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265"
Apr 16 15:27:30.924710 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:27:30.924692 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265\": container with ID starting with 1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265 not found: ID does not exist" containerID="1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265"
Apr 16 15:27:30.924758 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:30.924718 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265"} err="failed to get container status \"1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265\": rpc error: code = NotFound desc = could not find container \"1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265\": container with ID starting with 1d05242fc92952b3669ca9fb3358de05b765c4482912b543f4c006e7f568b265 not found: ID does not exist"
Apr 16 15:27:30.936790 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:30.936753 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8"]
Apr 16 15:27:30.940238 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:30.940219 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cee76-predictor-866cc8b64-4mdx8"]
Apr 16 15:27:31.805966 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:31.805938 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edebcbcd-363d-4468-880a-45829d6ec699" path="/var/lib/kubelet/pods/edebcbcd-363d-4468-880a-45829d6ec699/volumes"
Apr 16 15:27:35.222865 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:35.222827 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" podUID="a03a7a5f-6e53-478d-974c-633703f78946" containerName="sequence-graph-cee76" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:27:37.600049 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.600020 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"]
Apr 16 15:27:37.600448 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.600375 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edebcbcd-363d-4468-880a-45829d6ec699" containerName="kserve-container"
Apr 16 15:27:37.600448 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.600386 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="edebcbcd-363d-4468-880a-45829d6ec699" containerName="kserve-container"
Apr 16 15:27:37.600448 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.600448 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="edebcbcd-363d-4468-880a-45829d6ec699" containerName="kserve-container"
Apr 16 15:27:37.604862 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.604846 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"
Apr 16 15:27:37.606708 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.606685 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-1fc59-kube-rbac-proxy-sar-config\""
Apr 16 15:27:37.606854 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.606730 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-1fc59-serving-cert\""
Apr 16 15:27:37.611058 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.611035 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"]
Apr 16 15:27:37.652542 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.652505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07-openshift-service-ca-bundle\") pod \"splitter-graph-1fc59-6bbd795f46-rvh7v\" (UID: \"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07\") " pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"
Apr 16 15:27:37.652697 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.652547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07-proxy-tls\") pod \"splitter-graph-1fc59-6bbd795f46-rvh7v\" (UID: \"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07\") " pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"
Apr 16 15:27:37.753351 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.753322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07-openshift-service-ca-bundle\") pod \"splitter-graph-1fc59-6bbd795f46-rvh7v\" (UID: \"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07\") " pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"
Apr 16 15:27:37.753351 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.753354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07-proxy-tls\") pod \"splitter-graph-1fc59-6bbd795f46-rvh7v\" (UID: \"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07\") " pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"
Apr 16 15:27:37.754170 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.754144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07-openshift-service-ca-bundle\") pod \"splitter-graph-1fc59-6bbd795f46-rvh7v\" (UID: \"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07\") " pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"
Apr 16 15:27:37.755734 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.755712 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07-proxy-tls\") pod \"splitter-graph-1fc59-6bbd795f46-rvh7v\" (UID: \"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07\") " pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"
Apr 16 15:27:37.914934 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:37.914903 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" Apr 16 15:27:38.035132 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:38.035104 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"] Apr 16 15:27:38.037099 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:27:38.037069 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e6bffa8_d6d5_4227_a6a1_66e580dc4f07.slice/crio-d8170a3eab1bd3763342c58aa93ae0abae5b14d1d36bb6032bc32994832629f6 WatchSource:0}: Error finding container d8170a3eab1bd3763342c58aa93ae0abae5b14d1d36bb6032bc32994832629f6: Status 404 returned error can't find the container with id d8170a3eab1bd3763342c58aa93ae0abae5b14d1d36bb6032bc32994832629f6 Apr 16 15:27:38.909474 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:38.909433 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" podUID="2960b44a-878f-4508-84be-d6eb2332aae4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 16 15:27:38.945194 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:38.945161 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" event={"ID":"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07","Type":"ContainerStarted","Data":"cd8353aa670f34d3bbeb046ed771017758a8b446347cbeb621a9b48b40da8a4a"} Apr 16 15:27:38.945194 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:38.945198 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" event={"ID":"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07","Type":"ContainerStarted","Data":"d8170a3eab1bd3763342c58aa93ae0abae5b14d1d36bb6032bc32994832629f6"} Apr 16 15:27:38.945388 ip-10-0-130-140 kubenswrapper[2576]: I0416 
15:27:38.945229 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" Apr 16 15:27:38.960308 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:38.960258 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" podStartSLOduration=1.960244434 podStartE2EDuration="1.960244434s" podCreationTimestamp="2026-04-16 15:27:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:27:38.95813278 +0000 UTC m=+2109.876493984" watchObservedRunningTime="2026-04-16 15:27:38.960244434 +0000 UTC m=+2109.878605639" Apr 16 15:27:40.223299 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:40.223260 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" podUID="a03a7a5f-6e53-478d-974c-633703f78946" containerName="sequence-graph-cee76" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:27:40.223683 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:40.223364 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" Apr 16 15:27:44.956503 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:44.956461 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" Apr 16 15:27:45.223466 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:45.223376 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" podUID="a03a7a5f-6e53-478d-974c-633703f78946" containerName="sequence-graph-cee76" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:27:47.675936 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:47.675897 
2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"] Apr 16 15:27:47.676409 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:47.676119 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" podUID="7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" containerName="splitter-graph-1fc59" containerID="cri-o://cd8353aa670f34d3bbeb046ed771017758a8b446347cbeb621a9b48b40da8a4a" gracePeriod=30 Apr 16 15:27:47.845442 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:47.845406 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7"] Apr 16 15:27:47.845694 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:47.845657 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" podUID="120e5528-f2aa-4e5f-ba13-209694829f54" containerName="kserve-container" containerID="cri-o://6ad4f74991fa31b9a09bcb57b67af3cb5fc32622b6606fcfc793c7a43399b78b" gracePeriod=30 Apr 16 15:27:47.868717 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:47.868691 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r"] Apr 16 15:27:47.872253 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:47.872233 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" Apr 16 15:27:47.878228 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:47.878203 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r"] Apr 16 15:27:47.883883 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:47.883867 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" Apr 16 15:27:48.001416 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:48.001388 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r"] Apr 16 15:27:48.003887 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:27:48.003858 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d392b18_8ca4_4a19_a5d1_f781e0ef1cea.slice/crio-fab9727646f2fe850441fa4f6c63b8c8736715b340017b8f115e98ef84dc4b2e WatchSource:0}: Error finding container fab9727646f2fe850441fa4f6c63b8c8736715b340017b8f115e98ef84dc4b2e: Status 404 returned error can't find the container with id fab9727646f2fe850441fa4f6c63b8c8736715b340017b8f115e98ef84dc4b2e Apr 16 15:27:48.909235 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:48.909200 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" podUID="2960b44a-878f-4508-84be-d6eb2332aae4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 16 15:27:48.983708 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:48.983670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" event={"ID":"0d392b18-8ca4-4a19-a5d1-f781e0ef1cea","Type":"ContainerStarted","Data":"a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46"} Apr 16 15:27:48.983708 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:48.983704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" event={"ID":"0d392b18-8ca4-4a19-a5d1-f781e0ef1cea","Type":"ContainerStarted","Data":"fab9727646f2fe850441fa4f6c63b8c8736715b340017b8f115e98ef84dc4b2e"} Apr 16 15:27:48.983932 
ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:48.983893 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" Apr 16 15:27:48.984885 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:48.984859 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" podUID="0d392b18-8ca4-4a19-a5d1-f781e0ef1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 15:27:48.998581 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:48.998542 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" podStartSLOduration=1.9985295920000001 podStartE2EDuration="1.998529592s" podCreationTimestamp="2026-04-16 15:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:27:48.996477129 +0000 UTC m=+2119.914838334" watchObservedRunningTime="2026-04-16 15:27:48.998529592 +0000 UTC m=+2119.916890858" Apr 16 15:27:49.724850 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:49.724814 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" podUID="120e5528-f2aa-4e5f-ba13-209694829f54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 15:27:49.953985 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:49.953947 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" podUID="7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" containerName="splitter-graph-1fc59" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:27:49.987373 ip-10-0-130-140 
kubenswrapper[2576]: I0416 15:27:49.987301 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" podUID="0d392b18-8ca4-4a19-a5d1-f781e0ef1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 16 15:27:50.222532 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:50.222495 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" podUID="a03a7a5f-6e53-478d-974c-633703f78946" containerName="sequence-graph-cee76" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:27:50.992308 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:50.992272 2576 generic.go:358] "Generic (PLEG): container finished" podID="120e5528-f2aa-4e5f-ba13-209694829f54" containerID="6ad4f74991fa31b9a09bcb57b67af3cb5fc32622b6606fcfc793c7a43399b78b" exitCode=0
Apr 16 15:27:50.992656 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:50.992345 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" event={"ID":"120e5528-f2aa-4e5f-ba13-209694829f54","Type":"ContainerDied","Data":"6ad4f74991fa31b9a09bcb57b67af3cb5fc32622b6606fcfc793c7a43399b78b"}
Apr 16 15:27:51.091965 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:51.091944 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7"
Apr 16 15:27:51.996595 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:51.996559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7" event={"ID":"120e5528-f2aa-4e5f-ba13-209694829f54","Type":"ContainerDied","Data":"a5c083f870e3d130c5242a2a76fcd8aaf58c7de1dbb36fd6e1139cbf5ef66f07"}
Apr 16 15:27:51.996595 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:51.996591 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7"
Apr 16 15:27:51.997061 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:51.996611 2576 scope.go:117] "RemoveContainer" containerID="6ad4f74991fa31b9a09bcb57b67af3cb5fc32622b6606fcfc793c7a43399b78b"
Apr 16 15:27:52.010183 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:52.010159 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7"]
Apr 16 15:27:52.013920 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:52.013899 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1fc59-predictor-856bfdcfb-97jv7"]
Apr 16 15:27:53.805893 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:53.805864 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="120e5528-f2aa-4e5f-ba13-209694829f54" path="/var/lib/kubelet/pods/120e5528-f2aa-4e5f-ba13-209694829f54/volumes"
Apr 16 15:27:54.954450 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:54.954410 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" podUID="7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" containerName="splitter-graph-1fc59" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:27:55.222799 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:55.222703 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" podUID="a03a7a5f-6e53-478d-974c-633703f78946" containerName="sequence-graph-cee76" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:27:57.326016 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:57.325993 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"
Apr 16 15:27:57.420755 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:57.420721 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a03a7a5f-6e53-478d-974c-633703f78946-proxy-tls\") pod \"a03a7a5f-6e53-478d-974c-633703f78946\" (UID: \"a03a7a5f-6e53-478d-974c-633703f78946\") "
Apr 16 15:27:57.420943 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:57.420839 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a03a7a5f-6e53-478d-974c-633703f78946-openshift-service-ca-bundle\") pod \"a03a7a5f-6e53-478d-974c-633703f78946\" (UID: \"a03a7a5f-6e53-478d-974c-633703f78946\") "
Apr 16 15:27:57.421186 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:57.421163 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a03a7a5f-6e53-478d-974c-633703f78946-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "a03a7a5f-6e53-478d-974c-633703f78946" (UID: "a03a7a5f-6e53-478d-974c-633703f78946"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:27:57.422763 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:57.422744 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03a7a5f-6e53-478d-974c-633703f78946-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a03a7a5f-6e53-478d-974c-633703f78946" (UID: "a03a7a5f-6e53-478d-974c-633703f78946"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:27:57.522414 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:57.522324 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a03a7a5f-6e53-478d-974c-633703f78946-openshift-service-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:27:57.522414 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:57.522356 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a03a7a5f-6e53-478d-974c-633703f78946-proxy-tls\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:27:58.020159 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:58.020123 2576 generic.go:358] "Generic (PLEG): container finished" podID="a03a7a5f-6e53-478d-974c-633703f78946" containerID="b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2" exitCode=0
Apr 16 15:27:58.020336 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:58.020191 2576 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" Apr 16 15:27:58.020336 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:58.020207 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" event={"ID":"a03a7a5f-6e53-478d-974c-633703f78946","Type":"ContainerDied","Data":"b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2"} Apr 16 15:27:58.020336 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:58.020247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v" event={"ID":"a03a7a5f-6e53-478d-974c-633703f78946","Type":"ContainerDied","Data":"4a460f88e4a989fb56878bd820c7e7c8cff696c0df983239bc523a88a40958b9"} Apr 16 15:27:58.020336 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:58.020264 2576 scope.go:117] "RemoveContainer" containerID="b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2" Apr 16 15:27:58.028524 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:58.028506 2576 scope.go:117] "RemoveContainer" containerID="b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2" Apr 16 15:27:58.028737 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:27:58.028719 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2\": container with ID starting with b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2 not found: ID does not exist" containerID="b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2" Apr 16 15:27:58.028806 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:58.028745 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2"} err="failed to get container status 
\"b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2\": rpc error: code = NotFound desc = could not find container \"b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2\": container with ID starting with b2e18883b728885b9ca07a8879951b4c44748f6d97a21dd1e97d3a35ee0441e2 not found: ID does not exist" Apr 16 15:27:58.033959 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:58.033937 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"] Apr 16 15:27:58.037028 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:58.037009 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cee76-7d9bc88ff6-j7t7v"] Apr 16 15:27:58.909000 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:58.908956 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" podUID="2960b44a-878f-4508-84be-d6eb2332aae4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 16 15:27:59.806652 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:59.806620 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a03a7a5f-6e53-478d-974c-633703f78946" path="/var/lib/kubelet/pods/a03a7a5f-6e53-478d-974c-633703f78946/volumes" Apr 16 15:27:59.953529 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:59.953490 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" podUID="7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" containerName="splitter-graph-1fc59" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:27:59.953910 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:27:59.953612 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" Apr 16 15:27:59.987466 ip-10-0-130-140 
kubenswrapper[2576]: I0416 15:27:59.987437 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" podUID="0d392b18-8ca4-4a19-a5d1-f781e0ef1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 15:28:04.953382 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:04.953336 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" podUID="7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" containerName="splitter-graph-1fc59" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:28:08.909328 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:08.909288 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" podUID="2960b44a-878f-4508-84be-d6eb2332aae4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 16 15:28:09.954168 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:09.954133 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" podUID="7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" containerName="splitter-graph-1fc59" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:28:09.987633 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:09.987599 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" podUID="0d392b18-8ca4-4a19-a5d1-f781e0ef1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 15:28:14.953664 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:14.953628 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" podUID="7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" containerName="splitter-graph-1fc59" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:28:18.089092 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:18.089003 2576 generic.go:358] "Generic (PLEG): container finished" podID="7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" containerID="cd8353aa670f34d3bbeb046ed771017758a8b446347cbeb621a9b48b40da8a4a" exitCode=0 Apr 16 15:28:18.089092 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:18.089073 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" event={"ID":"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07","Type":"ContainerDied","Data":"cd8353aa670f34d3bbeb046ed771017758a8b446347cbeb621a9b48b40da8a4a"} Apr 16 15:28:18.317916 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:18.317892 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" Apr 16 15:28:18.404552 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:18.404520 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07-proxy-tls\") pod \"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07\" (UID: \"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07\") " Apr 16 15:28:18.404717 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:18.404637 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07-openshift-service-ca-bundle\") pod \"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07\" (UID: \"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07\") " Apr 16 15:28:18.405047 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:18.405022 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" (UID: "7e6bffa8-d6d5-4227-a6a1-66e580dc4f07"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:28:18.406645 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:18.406622 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" (UID: "7e6bffa8-d6d5-4227-a6a1-66e580dc4f07"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:28:18.505876 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:18.505834 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07-openshift-service-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:28:18.505876 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:18.505870 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07-proxy-tls\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\"" Apr 16 15:28:18.909993 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:18.909959 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" Apr 16 15:28:19.093357 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:19.093321 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" event={"ID":"7e6bffa8-d6d5-4227-a6a1-66e580dc4f07","Type":"ContainerDied","Data":"d8170a3eab1bd3763342c58aa93ae0abae5b14d1d36bb6032bc32994832629f6"} Apr 
16 15:28:19.093887 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:19.093360 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v" Apr 16 15:28:19.093887 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:19.093365 2576 scope.go:117] "RemoveContainer" containerID="cd8353aa670f34d3bbeb046ed771017758a8b446347cbeb621a9b48b40da8a4a" Apr 16 15:28:19.113097 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:19.113075 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"] Apr 16 15:28:19.118092 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:19.118068 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-1fc59-6bbd795f46-rvh7v"] Apr 16 15:28:19.806787 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:19.806749 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" path="/var/lib/kubelet/pods/7e6bffa8-d6d5-4227-a6a1-66e580dc4f07/volumes" Apr 16 15:28:19.988417 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:19.988376 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" podUID="0d392b18-8ca4-4a19-a5d1-f781e0ef1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 15:28:29.988184 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:29.988148 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" podUID="0d392b18-8ca4-4a19-a5d1-f781e0ef1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 15:28:37.410453 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.410420 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5"] Apr 16 15:28:37.410821 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.410756 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a03a7a5f-6e53-478d-974c-633703f78946" containerName="sequence-graph-cee76" Apr 16 15:28:37.410821 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.410782 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a03a7a5f-6e53-478d-974c-633703f78946" containerName="sequence-graph-cee76" Apr 16 15:28:37.410821 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.410795 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="120e5528-f2aa-4e5f-ba13-209694829f54" containerName="kserve-container" Apr 16 15:28:37.410821 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.410801 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="120e5528-f2aa-4e5f-ba13-209694829f54" containerName="kserve-container" Apr 16 15:28:37.410821 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.410811 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" containerName="splitter-graph-1fc59" Apr 16 15:28:37.410821 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.410816 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" containerName="splitter-graph-1fc59" Apr 16 15:28:37.411025 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.410872 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a03a7a5f-6e53-478d-974c-633703f78946" containerName="sequence-graph-cee76" Apr 16 15:28:37.411025 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.410881 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e6bffa8-d6d5-4227-a6a1-66e580dc4f07" containerName="splitter-graph-1fc59" Apr 16 15:28:37.411025 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.410890 2576 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="120e5528-f2aa-4e5f-ba13-209694829f54" containerName="kserve-container" Apr 16 15:28:37.414942 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.414926 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" Apr 16 15:28:37.417044 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.417021 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 15:28:37.417248 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.417042 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-9c7d6-serving-cert\"" Apr 16 15:28:37.417248 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.417021 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-9c7d6-kube-rbac-proxy-sar-config\"" Apr 16 15:28:37.418522 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.418496 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5"] Apr 16 15:28:37.462754 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.462727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6042503-703b-439f-b2a6-fd543b8708e9-proxy-tls\") pod \"switch-graph-9c7d6-58455857bf-xj6x5\" (UID: \"c6042503-703b-439f-b2a6-fd543b8708e9\") " pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" Apr 16 15:28:37.462887 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.462842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6042503-703b-439f-b2a6-fd543b8708e9-openshift-service-ca-bundle\") pod 
\"switch-graph-9c7d6-58455857bf-xj6x5\" (UID: \"c6042503-703b-439f-b2a6-fd543b8708e9\") " pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" Apr 16 15:28:37.563752 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.563724 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6042503-703b-439f-b2a6-fd543b8708e9-openshift-service-ca-bundle\") pod \"switch-graph-9c7d6-58455857bf-xj6x5\" (UID: \"c6042503-703b-439f-b2a6-fd543b8708e9\") " pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" Apr 16 15:28:37.563914 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.563892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6042503-703b-439f-b2a6-fd543b8708e9-proxy-tls\") pod \"switch-graph-9c7d6-58455857bf-xj6x5\" (UID: \"c6042503-703b-439f-b2a6-fd543b8708e9\") " pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" Apr 16 15:28:37.564012 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:28:37.563999 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-9c7d6-serving-cert: secret "switch-graph-9c7d6-serving-cert" not found Apr 16 15:28:37.564063 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:28:37.564054 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6042503-703b-439f-b2a6-fd543b8708e9-proxy-tls podName:c6042503-703b-439f-b2a6-fd543b8708e9 nodeName:}" failed. No retries permitted until 2026-04-16 15:28:38.064038175 +0000 UTC m=+2168.982399358 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c6042503-703b-439f-b2a6-fd543b8708e9-proxy-tls") pod "switch-graph-9c7d6-58455857bf-xj6x5" (UID: "c6042503-703b-439f-b2a6-fd543b8708e9") : secret "switch-graph-9c7d6-serving-cert" not found Apr 16 15:28:37.564356 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:37.564338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6042503-703b-439f-b2a6-fd543b8708e9-openshift-service-ca-bundle\") pod \"switch-graph-9c7d6-58455857bf-xj6x5\" (UID: \"c6042503-703b-439f-b2a6-fd543b8708e9\") " pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" Apr 16 15:28:38.068963 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:38.068932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6042503-703b-439f-b2a6-fd543b8708e9-proxy-tls\") pod \"switch-graph-9c7d6-58455857bf-xj6x5\" (UID: \"c6042503-703b-439f-b2a6-fd543b8708e9\") " pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" Apr 16 15:28:38.071405 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:38.071373 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6042503-703b-439f-b2a6-fd543b8708e9-proxy-tls\") pod \"switch-graph-9c7d6-58455857bf-xj6x5\" (UID: \"c6042503-703b-439f-b2a6-fd543b8708e9\") " pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" Apr 16 15:28:38.326450 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:38.326366 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" Apr 16 15:28:38.444076 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:38.444052 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5"] Apr 16 15:28:38.446629 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:28:38.446602 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6042503_703b_439f_b2a6_fd543b8708e9.slice/crio-e83c4df54753b3145d1f7c0c8bf8256f6380abf1fcd4ca6c255511d7bd80dcb4 WatchSource:0}: Error finding container e83c4df54753b3145d1f7c0c8bf8256f6380abf1fcd4ca6c255511d7bd80dcb4: Status 404 returned error can't find the container with id e83c4df54753b3145d1f7c0c8bf8256f6380abf1fcd4ca6c255511d7bd80dcb4 Apr 16 15:28:39.163455 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:39.163423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" event={"ID":"c6042503-703b-439f-b2a6-fd543b8708e9","Type":"ContainerStarted","Data":"c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d"} Apr 16 15:28:39.163455 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:39.163458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" event={"ID":"c6042503-703b-439f-b2a6-fd543b8708e9","Type":"ContainerStarted","Data":"e83c4df54753b3145d1f7c0c8bf8256f6380abf1fcd4ca6c255511d7bd80dcb4"} Apr 16 15:28:39.163690 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:39.163515 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" Apr 16 15:28:39.178572 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:39.178525 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" 
podStartSLOduration=2.178511004 podStartE2EDuration="2.178511004s" podCreationTimestamp="2026-04-16 15:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:28:39.176444798 +0000 UTC m=+2170.094806003" watchObservedRunningTime="2026-04-16 15:28:39.178511004 +0000 UTC m=+2170.096872209" Apr 16 15:28:39.988609 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:39.988580 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" Apr 16 15:28:45.171636 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:45.171604 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" Apr 16 15:28:47.885323 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:47.885291 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw"] Apr 16 15:28:47.888730 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:47.888714 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" Apr 16 15:28:47.890570 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:47.890543 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-41c6b-kube-rbac-proxy-sar-config\"" Apr 16 15:28:47.890689 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:47.890633 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-41c6b-serving-cert\"" Apr 16 15:28:47.894852 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:47.894827 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw"] Apr 16 15:28:47.947123 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:47.947098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd73551-2651-462c-8a61-2f602c2823fe-proxy-tls\") pod \"splitter-graph-41c6b-55c4b9fc75-9pkkw\" (UID: \"2cd73551-2651-462c-8a61-2f602c2823fe\") " pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" Apr 16 15:28:47.947266 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:47.947161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cd73551-2651-462c-8a61-2f602c2823fe-openshift-service-ca-bundle\") pod \"splitter-graph-41c6b-55c4b9fc75-9pkkw\" (UID: \"2cd73551-2651-462c-8a61-2f602c2823fe\") " pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" Apr 16 15:28:48.047713 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:48.047686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cd73551-2651-462c-8a61-2f602c2823fe-openshift-service-ca-bundle\") pod 
\"splitter-graph-41c6b-55c4b9fc75-9pkkw\" (UID: \"2cd73551-2651-462c-8a61-2f602c2823fe\") " pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" Apr 16 15:28:48.047869 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:48.047750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd73551-2651-462c-8a61-2f602c2823fe-proxy-tls\") pod \"splitter-graph-41c6b-55c4b9fc75-9pkkw\" (UID: \"2cd73551-2651-462c-8a61-2f602c2823fe\") " pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" Apr 16 15:28:48.047916 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:28:48.047894 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-41c6b-serving-cert: secret "splitter-graph-41c6b-serving-cert" not found Apr 16 15:28:48.047959 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:28:48.047950 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cd73551-2651-462c-8a61-2f602c2823fe-proxy-tls podName:2cd73551-2651-462c-8a61-2f602c2823fe nodeName:}" failed. No retries permitted until 2026-04-16 15:28:48.54793371 +0000 UTC m=+2179.466294893 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2cd73551-2651-462c-8a61-2f602c2823fe-proxy-tls") pod "splitter-graph-41c6b-55c4b9fc75-9pkkw" (UID: "2cd73551-2651-462c-8a61-2f602c2823fe") : secret "splitter-graph-41c6b-serving-cert" not found Apr 16 15:28:48.048332 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:48.048316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cd73551-2651-462c-8a61-2f602c2823fe-openshift-service-ca-bundle\") pod \"splitter-graph-41c6b-55c4b9fc75-9pkkw\" (UID: \"2cd73551-2651-462c-8a61-2f602c2823fe\") " pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" Apr 16 15:28:48.552824 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:48.552783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd73551-2651-462c-8a61-2f602c2823fe-proxy-tls\") pod \"splitter-graph-41c6b-55c4b9fc75-9pkkw\" (UID: \"2cd73551-2651-462c-8a61-2f602c2823fe\") " pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" Apr 16 15:28:48.555133 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:48.555104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd73551-2651-462c-8a61-2f602c2823fe-proxy-tls\") pod \"splitter-graph-41c6b-55c4b9fc75-9pkkw\" (UID: \"2cd73551-2651-462c-8a61-2f602c2823fe\") " pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" Apr 16 15:28:48.800247 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:48.800215 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" Apr 16 15:28:48.920012 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:48.919986 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw"] Apr 16 15:28:48.922446 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:28:48.922418 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd73551_2651_462c_8a61_2f602c2823fe.slice/crio-75e288936653e5f007c3e2b0173a14ab118e88182ef1ae016bd0f82615b43cf0 WatchSource:0}: Error finding container 75e288936653e5f007c3e2b0173a14ab118e88182ef1ae016bd0f82615b43cf0: Status 404 returned error can't find the container with id 75e288936653e5f007c3e2b0173a14ab118e88182ef1ae016bd0f82615b43cf0 Apr 16 15:28:49.198428 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:49.198396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" event={"ID":"2cd73551-2651-462c-8a61-2f602c2823fe","Type":"ContainerStarted","Data":"2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918"} Apr 16 15:28:49.198616 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:49.198434 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" event={"ID":"2cd73551-2651-462c-8a61-2f602c2823fe","Type":"ContainerStarted","Data":"75e288936653e5f007c3e2b0173a14ab118e88182ef1ae016bd0f82615b43cf0"} Apr 16 15:28:49.198616 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:49.198526 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" Apr 16 15:28:49.213298 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:49.213260 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" 
podStartSLOduration=2.213248063 podStartE2EDuration="2.213248063s" podCreationTimestamp="2026-04-16 15:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:28:49.212680671 +0000 UTC m=+2180.131041902" watchObservedRunningTime="2026-04-16 15:28:49.213248063 +0000 UTC m=+2180.131609267" Apr 16 15:28:55.206453 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:28:55.206425 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" Apr 16 15:32:29.837166 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:32:29.837136 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log" Apr 16 15:32:29.841306 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:32:29.841284 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log" Apr 16 15:37:02.636017 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:02.635984 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw"] Apr 16 15:37:02.636560 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:02.636205 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" podUID="2cd73551-2651-462c-8a61-2f602c2823fe" containerName="splitter-graph-41c6b" containerID="cri-o://2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918" gracePeriod=30 Apr 16 15:37:02.745468 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:02.745426 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r"] Apr 16 15:37:02.745733 ip-10-0-130-140 kubenswrapper[2576]: I0416 
15:37:02.745689 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" podUID="0d392b18-8ca4-4a19-a5d1-f781e0ef1cea" containerName="kserve-container" containerID="cri-o://a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46" gracePeriod=30
Apr 16 15:37:05.204522 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:05.204480 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" podUID="2cd73551-2651-462c-8a61-2f602c2823fe" containerName="splitter-graph-41c6b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:37:05.789215 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:05.789193 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r"
Apr 16 15:37:05.900378 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:05.900343 2576 generic.go:358] "Generic (PLEG): container finished" podID="0d392b18-8ca4-4a19-a5d1-f781e0ef1cea" containerID="a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46" exitCode=0
Apr 16 15:37:05.900575 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:05.900395 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" event={"ID":"0d392b18-8ca4-4a19-a5d1-f781e0ef1cea","Type":"ContainerDied","Data":"a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46"}
Apr 16 15:37:05.900575 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:05.900404 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r"
Apr 16 15:37:05.900575 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:05.900423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r" event={"ID":"0d392b18-8ca4-4a19-a5d1-f781e0ef1cea","Type":"ContainerDied","Data":"fab9727646f2fe850441fa4f6c63b8c8736715b340017b8f115e98ef84dc4b2e"}
Apr 16 15:37:05.900575 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:05.900445 2576 scope.go:117] "RemoveContainer" containerID="a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46"
Apr 16 15:37:05.908690 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:05.908671 2576 scope.go:117] "RemoveContainer" containerID="a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46"
Apr 16 15:37:05.908990 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:37:05.908966 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46\": container with ID starting with a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46 not found: ID does not exist" containerID="a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46"
Apr 16 15:37:05.909053 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:05.909000 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46"} err="failed to get container status \"a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46\": rpc error: code = NotFound desc = could not find container \"a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46\": container with ID starting with a490b2239e568db8551c1af0d4bc589b97e6119d6f97a600d15dcfde79d2eb46 not found: ID does not exist"
Apr 16 15:37:05.913710 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:05.913689 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r"]
Apr 16 15:37:05.915452 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:05.915432 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41c6b-predictor-687c7b4cb4-w7l5r"]
Apr 16 15:37:07.806489 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:07.806456 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d392b18-8ca4-4a19-a5d1-f781e0ef1cea" path="/var/lib/kubelet/pods/0d392b18-8ca4-4a19-a5d1-f781e0ef1cea/volumes"
Apr 16 15:37:10.205138 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:10.205104 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" podUID="2cd73551-2651-462c-8a61-2f602c2823fe" containerName="splitter-graph-41c6b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:37:15.205111 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:15.205032 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" podUID="2cd73551-2651-462c-8a61-2f602c2823fe" containerName="splitter-graph-41c6b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:37:15.205464 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:15.205143 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw"
Apr 16 15:37:20.204549 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:20.204506 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" podUID="2cd73551-2651-462c-8a61-2f602c2823fe" containerName="splitter-graph-41c6b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:37:25.208836 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:25.205362 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" podUID="2cd73551-2651-462c-8a61-2f602c2823fe" containerName="splitter-graph-41c6b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:37:29.861226 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:29.861199 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 15:37:29.872946 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:29.872907 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 15:37:30.205000 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:30.204963 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" podUID="2cd73551-2651-462c-8a61-2f602c2823fe" containerName="splitter-graph-41c6b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:37:32.784402 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:32.784378 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw"
Apr 16 15:37:32.865130 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:32.865093 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd73551-2651-462c-8a61-2f602c2823fe-proxy-tls\") pod \"2cd73551-2651-462c-8a61-2f602c2823fe\" (UID: \"2cd73551-2651-462c-8a61-2f602c2823fe\") "
Apr 16 15:37:32.865329 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:32.865143 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cd73551-2651-462c-8a61-2f602c2823fe-openshift-service-ca-bundle\") pod \"2cd73551-2651-462c-8a61-2f602c2823fe\" (UID: \"2cd73551-2651-462c-8a61-2f602c2823fe\") "
Apr 16 15:37:32.865516 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:32.865491 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd73551-2651-462c-8a61-2f602c2823fe-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2cd73551-2651-462c-8a61-2f602c2823fe" (UID: "2cd73551-2651-462c-8a61-2f602c2823fe"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:37:32.867143 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:32.867121 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd73551-2651-462c-8a61-2f602c2823fe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2cd73551-2651-462c-8a61-2f602c2823fe" (UID: "2cd73551-2651-462c-8a61-2f602c2823fe"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:37:32.966458 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:32.966420 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd73551-2651-462c-8a61-2f602c2823fe-proxy-tls\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:37:32.966458 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:32.966453 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cd73551-2651-462c-8a61-2f602c2823fe-openshift-service-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:37:32.995268 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:32.995231 2576 generic.go:358] "Generic (PLEG): container finished" podID="2cd73551-2651-462c-8a61-2f602c2823fe" containerID="2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918" exitCode=0
Apr 16 15:37:32.995435 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:32.995281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" event={"ID":"2cd73551-2651-462c-8a61-2f602c2823fe","Type":"ContainerDied","Data":"2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918"}
Apr 16 15:37:32.995435 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:32.995302 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw"
Apr 16 15:37:32.995435 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:32.995313 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw" event={"ID":"2cd73551-2651-462c-8a61-2f602c2823fe","Type":"ContainerDied","Data":"75e288936653e5f007c3e2b0173a14ab118e88182ef1ae016bd0f82615b43cf0"}
Apr 16 15:37:32.995435 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:32.995327 2576 scope.go:117] "RemoveContainer" containerID="2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918"
Apr 16 15:37:33.003852 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:33.003831 2576 scope.go:117] "RemoveContainer" containerID="2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918"
Apr 16 15:37:33.004116 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:37:33.004098 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918\": container with ID starting with 2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918 not found: ID does not exist" containerID="2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918"
Apr 16 15:37:33.004182 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:33.004123 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918"} err="failed to get container status \"2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918\": rpc error: code = NotFound desc = could not find container \"2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918\": container with ID starting with 2373cbb4fa4867489b9def58f84f44c5d49ce9ef67e023f38e5504cdc18c8918 not found: ID does not exist"
Apr 16 15:37:33.014701 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:33.014676 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw"]
Apr 16 15:37:33.017584 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:33.017562 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41c6b-55c4b9fc75-9pkkw"]
Apr 16 15:37:33.806057 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:37:33.806027 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd73551-2651-462c-8a61-2f602c2823fe" path="/var/lib/kubelet/pods/2cd73551-2651-462c-8a61-2f602c2823fe/volumes"
Apr 16 15:42:29.888298 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:42:29.888265 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 15:42:29.901480 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:42:29.901456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log"
Apr 16 15:44:56.780345 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:44:56.780273 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5"]
Apr 16 15:44:56.780884 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:44:56.780505 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" podUID="c6042503-703b-439f-b2a6-fd543b8708e9" containerName="switch-graph-9c7d6" containerID="cri-o://c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d" gracePeriod=30
Apr 16 15:44:56.958613 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:44:56.958580 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m"]
Apr 16 15:44:56.958879 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:44:56.958857 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" podUID="2960b44a-878f-4508-84be-d6eb2332aae4" containerName="kserve-container" containerID="cri-o://a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8" gracePeriod=30
Apr 16 15:44:58.909112 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:44:58.909066 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" podUID="2960b44a-878f-4508-84be-d6eb2332aae4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 16 15:45:00.008249 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:00.008217 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m"
Apr 16 15:45:00.170196 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:00.170161 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" podUID="c6042503-703b-439f-b2a6-fd543b8708e9" containerName="switch-graph-9c7d6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:45:00.508634 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:00.508546 2576 generic.go:358] "Generic (PLEG): container finished" podID="2960b44a-878f-4508-84be-d6eb2332aae4" containerID="a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8" exitCode=0
Apr 16 15:45:00.508634 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:00.508605 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m"
Apr 16 15:45:00.508899 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:00.508638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" event={"ID":"2960b44a-878f-4508-84be-d6eb2332aae4","Type":"ContainerDied","Data":"a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8"}
Apr 16 15:45:00.508899 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:00.508685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m" event={"ID":"2960b44a-878f-4508-84be-d6eb2332aae4","Type":"ContainerDied","Data":"e602f27cabd48d5f0aa571cf035c94ac39d265042c0f7c83a8012c4437d5e859"}
Apr 16 15:45:00.508899 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:00.508708 2576 scope.go:117] "RemoveContainer" containerID="a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8"
Apr 16 15:45:00.517588 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:00.517566 2576 scope.go:117] "RemoveContainer" containerID="a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8"
Apr 16 15:45:00.517878 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:45:00.517854 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8\": container with ID starting with a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8 not found: ID does not exist" containerID="a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8"
Apr 16 15:45:00.517958 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:00.517889 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8"} err="failed to get container status \"a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8\": rpc error: code = NotFound desc = could not find container \"a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8\": container with ID starting with a79d9d36a1351eb8f11d5153efd6ba6a2ad9bea71e9ed60b583ca2ec1b0d75f8 not found: ID does not exist"
Apr 16 15:45:00.527945 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:00.527918 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m"]
Apr 16 15:45:00.529407 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:00.529387 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c7d6-predictor-5fb75bc8f4-f4r9m"]
Apr 16 15:45:01.807018 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:01.806988 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2960b44a-878f-4508-84be-d6eb2332aae4" path="/var/lib/kubelet/pods/2960b44a-878f-4508-84be-d6eb2332aae4/volumes"
Apr 16 15:45:05.170629 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:05.170589 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" podUID="c6042503-703b-439f-b2a6-fd543b8708e9" containerName="switch-graph-9c7d6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:45:10.170499 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:10.170456 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" podUID="c6042503-703b-439f-b2a6-fd543b8708e9" containerName="switch-graph-9c7d6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:45:10.170915 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:10.170563 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5"
Apr 16 15:45:12.325467 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:12.325436 2576 ???:1] "http: TLS handshake error from 10.0.130.229:41630: EOF"
Apr 16 15:45:12.328322 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:12.328297 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-9c7d6-58455857bf-xj6x5_c6042503-703b-439f-b2a6-fd543b8708e9/switch-graph-9c7d6/0.log"
Apr 16 15:45:13.108170 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:13.108136 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-9c7d6-58455857bf-xj6x5_c6042503-703b-439f-b2a6-fd543b8708e9/switch-graph-9c7d6/0.log"
Apr 16 15:45:13.851250 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:13.851178 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-9c7d6-58455857bf-xj6x5_c6042503-703b-439f-b2a6-fd543b8708e9/switch-graph-9c7d6/0.log"
Apr 16 15:45:14.587223 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:14.587188 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-9c7d6-58455857bf-xj6x5_c6042503-703b-439f-b2a6-fd543b8708e9/switch-graph-9c7d6/0.log"
Apr 16 15:45:15.170190 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:15.170150 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" podUID="c6042503-703b-439f-b2a6-fd543b8708e9" containerName="switch-graph-9c7d6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:45:15.320502 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:15.320476 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-9c7d6-58455857bf-xj6x5_c6042503-703b-439f-b2a6-fd543b8708e9/switch-graph-9c7d6/0.log"
Apr 16 15:45:16.050353 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:16.050325 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-9c7d6-58455857bf-xj6x5_c6042503-703b-439f-b2a6-fd543b8708e9/switch-graph-9c7d6/0.log"
Apr 16 15:45:16.790079 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:16.790049 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-9c7d6-58455857bf-xj6x5_c6042503-703b-439f-b2a6-fd543b8708e9/switch-graph-9c7d6/0.log"
Apr 16 15:45:17.542787 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:17.542747 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-9c7d6-58455857bf-xj6x5_c6042503-703b-439f-b2a6-fd543b8708e9/switch-graph-9c7d6/0.log"
Apr 16 15:45:18.278478 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:18.278448 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-9c7d6-58455857bf-xj6x5_c6042503-703b-439f-b2a6-fd543b8708e9/switch-graph-9c7d6/0.log"
Apr 16 15:45:19.006876 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:19.006842 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-9c7d6-58455857bf-xj6x5_c6042503-703b-439f-b2a6-fd543b8708e9/switch-graph-9c7d6/0.log"
Apr 16 15:45:19.749548 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:19.749516 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-9c7d6-58455857bf-xj6x5_c6042503-703b-439f-b2a6-fd543b8708e9/switch-graph-9c7d6/0.log"
Apr 16 15:45:20.170502 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:20.170465 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" podUID="c6042503-703b-439f-b2a6-fd543b8708e9" containerName="switch-graph-9c7d6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:45:20.539038 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:20.538945 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-9c7d6-58455857bf-xj6x5_c6042503-703b-439f-b2a6-fd543b8708e9/switch-graph-9c7d6/0.log"
Apr 16 15:45:25.170547 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:25.170507 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" podUID="c6042503-703b-439f-b2a6-fd543b8708e9" containerName="switch-graph-9c7d6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:45:25.676454 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:25.676426 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-x59c6_321149cb-c540-42e4-89ca-d79124957dca/global-pull-secret-syncer/0.log"
Apr 16 15:45:25.718527 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:25.718489 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-5nclj_3323004a-60ab-45da-8ce1-47a7a8622df4/konnectivity-agent/0.log"
Apr 16 15:45:25.824842 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:25.824810 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-140.ec2.internal_8e5adc7c4d5729c56cf31736cd00ffbf/haproxy/0.log"
Apr 16 15:45:26.925542 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:26.925517 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5"
Apr 16 15:45:26.986011 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:26.985973 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6042503-703b-439f-b2a6-fd543b8708e9-openshift-service-ca-bundle\") pod \"c6042503-703b-439f-b2a6-fd543b8708e9\" (UID: \"c6042503-703b-439f-b2a6-fd543b8708e9\") "
Apr 16 15:45:26.986172 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:26.986025 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6042503-703b-439f-b2a6-fd543b8708e9-proxy-tls\") pod \"c6042503-703b-439f-b2a6-fd543b8708e9\" (UID: \"c6042503-703b-439f-b2a6-fd543b8708e9\") "
Apr 16 15:45:26.986335 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:26.986312 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6042503-703b-439f-b2a6-fd543b8708e9-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c6042503-703b-439f-b2a6-fd543b8708e9" (UID: "c6042503-703b-439f-b2a6-fd543b8708e9"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:45:26.988159 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:26.988138 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6042503-703b-439f-b2a6-fd543b8708e9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c6042503-703b-439f-b2a6-fd543b8708e9" (UID: "c6042503-703b-439f-b2a6-fd543b8708e9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:45:27.087318 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:27.087221 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6042503-703b-439f-b2a6-fd543b8708e9-openshift-service-ca-bundle\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:45:27.087318 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:27.087255 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6042503-703b-439f-b2a6-fd543b8708e9-proxy-tls\") on node \"ip-10-0-130-140.ec2.internal\" DevicePath \"\""
Apr 16 15:45:27.606636 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:27.606602 2576 generic.go:358] "Generic (PLEG): container finished" podID="c6042503-703b-439f-b2a6-fd543b8708e9" containerID="c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d" exitCode=0
Apr 16 15:45:27.606984 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:27.606676 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5"
Apr 16 15:45:27.606984 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:27.606688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" event={"ID":"c6042503-703b-439f-b2a6-fd543b8708e9","Type":"ContainerDied","Data":"c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d"}
Apr 16 15:45:27.606984 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:27.606724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5" event={"ID":"c6042503-703b-439f-b2a6-fd543b8708e9","Type":"ContainerDied","Data":"e83c4df54753b3145d1f7c0c8bf8256f6380abf1fcd4ca6c255511d7bd80dcb4"}
Apr 16 15:45:27.606984 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:27.606740 2576 scope.go:117] "RemoveContainer" containerID="c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d"
Apr 16 15:45:27.615377 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:27.615361 2576 scope.go:117] "RemoveContainer" containerID="c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d"
Apr 16 15:45:27.615651 ip-10-0-130-140 kubenswrapper[2576]: E0416 15:45:27.615628 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d\": container with ID starting with c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d not found: ID does not exist" containerID="c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d"
Apr 16 15:45:27.615729 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:27.615663 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d"} err="failed to get container status \"c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d\": rpc error: code = NotFound desc = could not find container \"c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d\": container with ID starting with c9a3088dbc2fc66379687e69b7d6f79cda9d875003a12347be9ead8eecbe8d5d not found: ID does not exist"
Apr 16 15:45:27.626806 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:27.626758 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5"]
Apr 16 15:45:27.628285 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:27.628263 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9c7d6-58455857bf-xj6x5"]
Apr 16 15:45:27.806740 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:27.806704 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6042503-703b-439f-b2a6-fd543b8708e9" path="/var/lib/kubelet/pods/c6042503-703b-439f-b2a6-fd543b8708e9/volumes"
Apr 16 15:45:29.231641 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:29.231607 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-wddbs_f77a2c7c-9c50-400a-8982-b3e524240d5f/cluster-monitoring-operator/0.log"
Apr 16 15:45:29.471547 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:29.471519 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p2rsl_14b82f4f-8a65-45b3-a4df-1eb8eecf50f5/node-exporter/0.log"
Apr 16 15:45:29.492465 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:29.492388 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p2rsl_14b82f4f-8a65-45b3-a4df-1eb8eecf50f5/kube-rbac-proxy/0.log"
Apr 16 15:45:29.512374 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:29.512349 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p2rsl_14b82f4f-8a65-45b3-a4df-1eb8eecf50f5/init-textfile/0.log"
Apr 16 15:45:29.847565 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:29.847490 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-v7zjv_55a6969b-5147-482a-879e-2d7c3ed30812/prometheus-operator/0.log"
Apr 16 15:45:29.864188 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:29.864167 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-v7zjv_55a6969b-5147-482a-879e-2d7c3ed30812/kube-rbac-proxy/0.log"
Apr 16 15:45:29.920624 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:29.920596 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-788bffd54-zmqgb_63e43d42-aaea-4a5d-a411-ab6d342b83a4/telemeter-client/0.log"
Apr 16 15:45:29.940065 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:29.940043 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-788bffd54-zmqgb_63e43d42-aaea-4a5d-a411-ab6d342b83a4/reload/0.log"
Apr 16 15:45:29.961465 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:29.961432 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-788bffd54-zmqgb_63e43d42-aaea-4a5d-a411-ab6d342b83a4/kube-rbac-proxy/0.log"
Apr 16 15:45:29.987937 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:29.987915 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7dc9c4d76b-8r88q_ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5/thanos-query/0.log"
Apr 16 15:45:30.008084 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:30.008057 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7dc9c4d76b-8r88q_ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5/kube-rbac-proxy-web/0.log"
Apr 16 15:45:30.029865 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:30.029833 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7dc9c4d76b-8r88q_ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5/kube-rbac-proxy/0.log"
Apr 16 15:45:30.050650 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:30.050626 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7dc9c4d76b-8r88q_ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5/prom-label-proxy/0.log"
Apr 16 15:45:30.070823 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:30.070803 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7dc9c4d76b-8r88q_ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5/kube-rbac-proxy-rules/0.log"
Apr 16 15:45:30.094911 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:30.094889 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7dc9c4d76b-8r88q_ce46f14a-8173-42ee-8a1b-3ee7d2abf2f5/kube-rbac-proxy-metrics/0.log"
Apr 16 15:45:31.954870 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:31.954844 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-688c748945-2xq7q_3ba90034-840a-4eda-afee-61e3998fa8d1/console/0.log"
Apr 16 15:45:32.011675 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.011643 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-kpbxh_3c7ac47a-551c-4399-8338-c3942554bedd/download-server/0.log"
Apr 16 15:45:32.287425 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.286998 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5"]
Apr 16 15:45:32.290244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.288308 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2960b44a-878f-4508-84be-d6eb2332aae4" containerName="kserve-container"
Apr 16 15:45:32.290244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.288334 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2960b44a-878f-4508-84be-d6eb2332aae4" containerName="kserve-container"
Apr 16 15:45:32.290244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.288351 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6042503-703b-439f-b2a6-fd543b8708e9" containerName="switch-graph-9c7d6"
Apr 16 15:45:32.290244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.288368 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6042503-703b-439f-b2a6-fd543b8708e9" containerName="switch-graph-9c7d6"
Apr 16 15:45:32.290244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.288390 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d392b18-8ca4-4a19-a5d1-f781e0ef1cea" containerName="kserve-container"
Apr 16 15:45:32.290244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.288405 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d392b18-8ca4-4a19-a5d1-f781e0ef1cea" containerName="kserve-container"
Apr 16 15:45:32.290244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.288441 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cd73551-2651-462c-8a61-2f602c2823fe" containerName="splitter-graph-41c6b"
Apr 16 15:45:32.290244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.288450 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd73551-2651-462c-8a61-2f602c2823fe" containerName="splitter-graph-41c6b"
Apr 16 15:45:32.290244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.288606 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2cd73551-2651-462c-8a61-2f602c2823fe" containerName="splitter-graph-41c6b"
Apr 16 15:45:32.290244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.288620 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2960b44a-878f-4508-84be-d6eb2332aae4" containerName="kserve-container"
Apr 16 15:45:32.290244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.288640 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d392b18-8ca4-4a19-a5d1-f781e0ef1cea" containerName="kserve-container"
Apr 16 15:45:32.290244 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.288652 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6042503-703b-439f-b2a6-fd543b8708e9" containerName="switch-graph-9c7d6"
Apr 16 15:45:32.294703 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.294674 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5"]
Apr 16 15:45:32.294854 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.294830 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5"
Apr 16 15:45:32.296931 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.296905 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2ndkd\"/\"kube-root-ca.crt\""
Apr 16 15:45:32.297566 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.297540 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2ndkd\"/\"default-dockercfg-jmzsl\""
Apr 16 15:45:32.297699 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.297579 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2ndkd\"/\"openshift-service-ca.crt\""
Apr 16 15:45:32.326184 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.326158 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1567b0ce-ccdb-4e27-b016-ac20f184ace2-proc\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5"
Apr 16 15:45:32.326320 
ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.326200 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1567b0ce-ccdb-4e27-b016-ac20f184ace2-lib-modules\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.326320 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.326219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwzhf\" (UniqueName: \"kubernetes.io/projected/1567b0ce-ccdb-4e27-b016-ac20f184ace2-kube-api-access-wwzhf\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.326320 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.326238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1567b0ce-ccdb-4e27-b016-ac20f184ace2-sys\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.326320 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.326290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1567b0ce-ccdb-4e27-b016-ac20f184ace2-podres\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.405491 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.405457 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-t267v_7ce1ceef-1ec7-4e59-a113-546b447470ba/volume-data-source-validator/0.log" Apr 16 15:45:32.427646 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.427615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1567b0ce-ccdb-4e27-b016-ac20f184ace2-lib-modules\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.427646 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.427647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwzhf\" (UniqueName: \"kubernetes.io/projected/1567b0ce-ccdb-4e27-b016-ac20f184ace2-kube-api-access-wwzhf\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.427904 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.427665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1567b0ce-ccdb-4e27-b016-ac20f184ace2-sys\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.427904 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.427693 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1567b0ce-ccdb-4e27-b016-ac20f184ace2-podres\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.427904 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.427730 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1567b0ce-ccdb-4e27-b016-ac20f184ace2-proc\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.427904 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.427761 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1567b0ce-ccdb-4e27-b016-ac20f184ace2-sys\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.427904 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.427803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1567b0ce-ccdb-4e27-b016-ac20f184ace2-lib-modules\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.427904 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.427815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1567b0ce-ccdb-4e27-b016-ac20f184ace2-proc\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.427904 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.427867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1567b0ce-ccdb-4e27-b016-ac20f184ace2-podres\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.434776 ip-10-0-130-140 
kubenswrapper[2576]: I0416 15:45:32.434741 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwzhf\" (UniqueName: \"kubernetes.io/projected/1567b0ce-ccdb-4e27-b016-ac20f184ace2-kube-api-access-wwzhf\") pod \"perf-node-gather-daemonset-frxh5\" (UID: \"1567b0ce-ccdb-4e27-b016-ac20f184ace2\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.606360 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.606269 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:32.727286 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.727237 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5"] Apr 16 15:45:32.730161 ip-10-0-130-140 kubenswrapper[2576]: W0416 15:45:32.730018 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1567b0ce_ccdb_4e27_b016_ac20f184ace2.slice/crio-56cdab52d5687f50eefed62efcd23be23914680ee6abb43a9b7271e33bb7b875 WatchSource:0}: Error finding container 56cdab52d5687f50eefed62efcd23be23914680ee6abb43a9b7271e33bb7b875: Status 404 returned error can't find the container with id 56cdab52d5687f50eefed62efcd23be23914680ee6abb43a9b7271e33bb7b875 Apr 16 15:45:32.731947 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:32.731932 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:45:33.106654 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:33.106627 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wwb2b_7ede1baa-a6e7-4b5e-8723-94a6c70847e3/dns/0.log" Apr 16 15:45:33.126981 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:33.126911 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wwb2b_7ede1baa-a6e7-4b5e-8723-94a6c70847e3/kube-rbac-proxy/0.log" 
Apr 16 15:45:33.191910 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:33.191881 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s9crx_3cebcaa8-957d-4f1e-b4f8-90637dae2bc0/dns-node-resolver/0.log" Apr 16 15:45:33.629281 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:33.629241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" event={"ID":"1567b0ce-ccdb-4e27-b016-ac20f184ace2","Type":"ContainerStarted","Data":"1028d3eedfdcb046cb1b1c29c6546662f422547bdec3feefa3926ffa93d68a17"} Apr 16 15:45:33.629281 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:33.629278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" event={"ID":"1567b0ce-ccdb-4e27-b016-ac20f184ace2","Type":"ContainerStarted","Data":"56cdab52d5687f50eefed62efcd23be23914680ee6abb43a9b7271e33bb7b875"} Apr 16 15:45:33.629494 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:33.629365 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:33.637963 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:33.637936 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tsg7q_e27e48da-a6dc-4e84-87f4-01916a11e065/node-ca/0.log" Apr 16 15:45:33.645125 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:33.645079 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" podStartSLOduration=1.6450640509999999 podStartE2EDuration="1.645064051s" podCreationTimestamp="2026-04-16 15:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:45:33.642757584 +0000 UTC m=+3184.561118799" watchObservedRunningTime="2026-04-16 
15:45:33.645064051 +0000 UTC m=+3184.563425256" Apr 16 15:45:34.597514 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:34.597485 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-n8878_fb790f7c-1dc9-4bf8-a9e6-1054b49e346c/serve-healthcheck-canary/0.log" Apr 16 15:45:34.918750 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:34.918714 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-g8xfd_2201f8ec-763d-4bde-9b0c-b412c0a2c025/insights-operator/0.log" Apr 16 15:45:34.920094 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:34.920071 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-g8xfd_2201f8ec-763d-4bde-9b0c-b412c0a2c025/insights-operator/1.log" Apr 16 15:45:35.001474 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:35.001441 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nbr9m_47d8c674-a5dd-4235-aecc-2923a5a0809c/kube-rbac-proxy/0.log" Apr 16 15:45:35.021725 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:35.021700 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nbr9m_47d8c674-a5dd-4235-aecc-2923a5a0809c/exporter/0.log" Apr 16 15:45:35.043209 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:35.043180 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nbr9m_47d8c674-a5dd-4235-aecc-2923a5a0809c/extractor/0.log" Apr 16 15:45:36.897983 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:36.897936 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-xj6kl_4607d733-d1b9-418f-9925-2ff595da859c/manager/0.log" Apr 16 15:45:37.230679 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:37.230648 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_seaweedfs-86cc847c5c-hnwm2_3b3ec15b-9b86-4feb-876d-51e5a458081d/seaweedfs/0.log" Apr 16 15:45:39.642524 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:39.642497 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-frxh5" Apr 16 15:45:40.675050 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:40.675021 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-gx74z_fd05d685-bbeb-4d6a-b14d-ec2b3dd85339/migrator/0.log" Apr 16 15:45:40.697119 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:40.697097 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-gx74z_fd05d685-bbeb-4d6a-b14d-ec2b3dd85339/graceful-termination/0.log" Apr 16 15:45:41.027003 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:41.026898 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-x98m6_ff4ddc68-4c66-4144-b764-4cfde96015d7/kube-storage-version-migrator-operator/1.log" Apr 16 15:45:41.028117 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:41.028064 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-x98m6_ff4ddc68-4c66-4144-b764-4cfde96015d7/kube-storage-version-migrator-operator/0.log" Apr 16 15:45:42.350787 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:42.350740 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lb2nq_7df5be5d-8f4e-489e-95da-488d3220a4f7/kube-multus-additional-cni-plugins/0.log" Apr 16 15:45:42.371125 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:42.371101 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lb2nq_7df5be5d-8f4e-489e-95da-488d3220a4f7/egress-router-binary-copy/0.log" Apr 16 15:45:42.390591 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:42.390571 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lb2nq_7df5be5d-8f4e-489e-95da-488d3220a4f7/cni-plugins/0.log" Apr 16 15:45:42.410627 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:42.410600 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lb2nq_7df5be5d-8f4e-489e-95da-488d3220a4f7/bond-cni-plugin/0.log" Apr 16 15:45:42.430391 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:42.430372 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lb2nq_7df5be5d-8f4e-489e-95da-488d3220a4f7/routeoverride-cni/0.log" Apr 16 15:45:42.449980 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:42.449953 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lb2nq_7df5be5d-8f4e-489e-95da-488d3220a4f7/whereabouts-cni-bincopy/0.log" Apr 16 15:45:42.469509 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:42.469486 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lb2nq_7df5be5d-8f4e-489e-95da-488d3220a4f7/whereabouts-cni/0.log" Apr 16 15:45:42.510260 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:42.510233 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q29wv_a9a03c7e-3f17-4a7e-b126-8b8ba1c33c56/kube-multus/0.log" Apr 16 15:45:42.555269 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:42.555247 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7wx6z_fc2d28ab-f651-462e-ae85-98e9780905b0/network-metrics-daemon/0.log" Apr 16 15:45:42.572170 ip-10-0-130-140 kubenswrapper[2576]: 
I0416 15:45:42.572145 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7wx6z_fc2d28ab-f651-462e-ae85-98e9780905b0/kube-rbac-proxy/0.log" Apr 16 15:45:43.973483 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:43.973455 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-controller/0.log" Apr 16 15:45:43.991321 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:43.991288 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/0.log" Apr 16 15:45:44.022040 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:44.022006 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovn-acl-logging/1.log" Apr 16 15:45:44.043425 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:44.043401 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/kube-rbac-proxy-node/0.log" Apr 16 15:45:44.065150 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:44.065120 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 15:45:44.081519 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:44.081497 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/northd/0.log" Apr 16 15:45:44.100801 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:44.100761 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/nbdb/0.log" Apr 16 15:45:44.120494 ip-10-0-130-140 kubenswrapper[2576]: 
I0416 15:45:44.120464 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/sbdb/0.log" Apr 16 15:45:44.292684 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:44.292609 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7dqw_8e330e52-07ab-4173-a692-bcf1bedd06ff/ovnkube-controller/0.log" Apr 16 15:45:45.160070 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:45.160042 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-whfzb_2413d25f-8416-4493-bebc-794f33e6f210/check-endpoints/0.log" Apr 16 15:45:45.225118 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:45.225090 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-c78tw_c89f7bd4-8433-4357-856e-4886a97cdf70/network-check-target-container/0.log" Apr 16 15:45:46.018541 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:46.018516 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-c9tx6_c119f984-3b17-49d4-8d0b-37669cbcbeb7/iptables-alerter/0.log" Apr 16 15:45:46.656651 ip-10-0-130-140 kubenswrapper[2576]: I0416 15:45:46.656621 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-vj7zr_0d47b69f-123e-49bb-8517-d2e2716ccea1/tuned/0.log"